
Cannot get started #2

Closed
cmd0714 opened this issue Nov 29, 2021 · 5 comments

cmd0714 commented Nov 29, 2021

Downloading: 100% 1.24k/1.24k [00:00<00:00, 24.3kB/s]
Downloading: 100% 2.12G/2.12G [01:17<00:00, 28.3MB/s]
All model checkpoint layers were used when initializing TFPegasusForConditionalGeneration.

All the layers of TFPegasusForConditionalGeneration were initialized from the model checkpoint at human-centered-summarization/financial-summarization-pegasus.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFPegasusForConditionalGeneration for predictions without further training.
Downloading: 100% 1.82M/1.82M [00:00<00:00, 7.27MB/s]
Downloading: 100% 1.31k/1.31k [00:00<00:00, 29.2kB/s]
Downloading: 100% 1.40k/1.40k [00:00<00:00, 29.2kB/s]
max_encoder_position_embeddings: 4096
max_decoder_position_embeddings: 512

TypeError                                 Traceback (most recent call last)
in <module>()
      1 model_name='human-centered-summarization/financial-summarization-pegasus'
----> 2 model,tokenizer=long_.create_long_model(save_model=".\Pegasus\", attention_window=4096, max_pos=4096,model_name=model_name)

3 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py in __init__(self, trainable, name, dtype, dynamic, **kwargs)
    339             trainable.dtype is tf.bool)):
    340       raise TypeError(
--> 341           'Expected trainable argument to be a boolean, '
    342           f'but got: {trainable}')
    343     self._trainable = trainable
TypeError: Expected trainable argument to be a boolean, but got: LongformerPegasusConfig {
"_name_or_path": "google/pegasus-xsum",
"activation_dropout": 0.1,
"activation_function": "relu",
"add_bias_logits": false,
"add_final_layer_norm": true,
"architectures": [
"LongformerForPegasus"
],
"attention_dilation": [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
],
"attention_dropout": 0.1,
"attention_mode": "sliding_chunks",
"attention_probs_dropout_prob": 0.1,
"attention_window": [
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096
],
"autoregressive": false,
"bos_token_id": 0,
"classif_dropout": 0.0,
"classifier_dropout": 0.0,
"d_model": 1024,
"decoder_attention_heads": 16,
"decoder_ffn_dim": 4096,
"decoder_layerdrop": 0.0,
"decoder_layers": 16,
"decoder_start_token_id": 0,
"do_blenderbot_90_layernorm": false,
"dropout": 0.1,
"encoder_attention_heads": 16,
"encoder_ffn_dim": 4096,
"encoder_layerdrop": 0.0,
"encoder_layers": 16,
"eos_token_id": 1,
"extra_pos_embeddings": 1,
"force_bos_token_to_be_generated": false,
"forced_eos_token_id": 1,
"gradient_checkpointing": false,
"hidden_dropout_prob": 1e-05,
"id2label": {
"0": "LABEL_0",
"1": "LABEL_1",
"2": "LABEL_2"
},
"init_std": 0.02,
"initializer_range": null,
"is_encoder_decoder": true,
"label2id": {
"LABEL_0": 0,
"LABEL_1": 1,
"LABEL_2": 2
},
"layer_norm_eps": 1e-05,
"length_penalty": 0.6,
"max_decoder_position_embeddings": 512,
"max_encoder_position_embeddings": 4096,
"max_length": 64,
"max_position_embeddings": 512,
"model_type": "pegasus",
"normalize_before": true,
"normalize_embedding": false,
"num_beams": 8,
"num_hidden_layers": 16,
"pad_token_id": 0,
"scale_embedding": true,
"static_position_embeddings": true,
"transformers_version": "4.12.5",
"use_cache": true,
"vocab_size": 96103
}

@abhilash1910 (Owner)

Hi @cmd0714,
I think this issue is specific to Colab; I am looking into why Colab causes it. In the meantime, you can run the package locally (in a local Jupyter notebook) or in a Kaggle notebook, and it should work.

abhilash1910 self-assigned this Nov 29, 2021
abhilash1910 added the bug and in-test labels Nov 29, 2021
@abhilash1910
Copy link
Owner

Hi @cmd0714,
This issue appears to be fixed in the latest stable release, 0.3. A Colab demo is provided in the updated README:
https://github.com/abhilash1910/LongPegasus#samples
Let me know if this resolves it.
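
For reference, a minimal sketch of the call from the traceback above. The pip package name, import path, and constructor are assumptions; only the create_long_model() keyword arguments appear in this thread, and the README samples linked above remain the authoritative reference.

# pip install LongPegasus   # assumed package name
from LongPegasus.LongPegasus import LongPegasus  # assumed import path

model_name = "human-centered-summarization/financial-summarization-pegasus"

long_ = LongPegasus()  # assumed constructor
# Extend the Pegasus encoder to a 4096-token attention window and save the
# converted model locally; forward slashes sidestep backslash escaping in
# the path string.
model, tokenizer = long_.create_long_model(
    save_model="./Pegasus/",
    attention_window=4096,
    max_pos=4096,
    model_name=model_name,
)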

abhilash1910 added the awaititng-user-response label and removed the bug label Nov 30, 2021

cmd0714 commented Dec 1, 2021

Hi @abhilash1910,
Thanks for the timely update. I copied your demo code into Colab and it runs smoothly now.
The output summary differs slightly from the one you showed, though:
['Hundreds of thousands of people have been affected by the wildfires.']

@abhilash1910 (Owner)

Hi @cmd0714,
Yes, that can happen; the previous model I was using was google/pegasus-xsum. Even with the same model, results may differ in some cases (this is common for all transformer inference), but the overall intent should remain the same. The larger the model, the more consistent the results tend to be.
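
For context, one way to compare summaries across checkpoints is to run them directly through the transformers API. This is a generic sketch, not the package demo: the input text is hypothetical, and the generation settings simply mirror the num_beams=8 / max_length=64 values from the config dumped above.

from transformers import PegasusTokenizer, TFPegasusForConditionalGeneration

checkpoint = "human-centered-summarization/financial-summarization-pegasus"
tokenizer = PegasusTokenizer.from_pretrained(checkpoint)
model = TFPegasusForConditionalGeneration.from_pretrained(checkpoint)

# Hypothetical input; substitute the article used in the demo notebook.
text = "Hundreds of thousands of people were forced to evacuate as wildfires spread across the region."
inputs = tokenizer(text, return_tensors="tf", truncation=True)

# Beam search is deterministic, so run-to-run differences usually come from
# different checkpoints or library versions rather than random sampling.
summary_ids = model.generate(**inputs, num_beams=8, max_length=64)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True))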

@abhilash1910 (Owner)

Hi @cmd0714,
Closing this issue. Kindly reopen if new bugs are found or modifications are required.
