Cannot get started #2
Downloading: 100% 1.24k/1.24k [00:00<00:00, 24.3kB/s]
Downloading: 100% 2.12G/2.12G [01:17<00:00, 28.3MB/s]
All model checkpoint layers were used when initializing TFPegasusForConditionalGeneration.
All the layers of TFPegasusForConditionalGeneration were initialized from the model checkpoint at human-centered-summarization/financial-summarization-pegasus.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFPegasusForConditionalGeneration for predictions without further training.
Downloading: 100% 1.82M/1.82M [00:00<00:00, 7.27MB/s]
Downloading: 100% 1.31k/1.31k [00:00<00:00, 29.2kB/s]
Downloading: 100% 1.40k/1.40k [00:00<00:00, 29.2kB/s]
max_encoder_position_embeddings: 4096
max_decoder_position_embeddings: 512
TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>()
      1 model_name = 'human-centered-summarization/financial-summarization-pegasus'
----> 2 model, tokenizer = long_.create_long_model(save_model=".\Pegasus\", attention_window=4096, max_pos=4096, model_name=model_name)

3 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py in __init__(self, trainable, name, dtype, dynamic, **kwargs)
    339             trainable.dtype is tf.bool)):
    340       raise TypeError(
--> 341           f'Expected `trainable` argument to be a boolean, '
    342           f'but got: {trainable}')
    343     self._trainable = trainable

TypeError: Expected `trainable` argument to be a boolean, but got: LongformerPegasusConfig {
"_name_or_path": "google/pegasus-xsum",
"activation_dropout": 0.1,
"activation_function": "relu",
"add_bias_logits": false,
"add_final_layer_norm": true,
"architectures": [
"LongformerForPegasus"
],
"attention_dilation": [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
],
"attention_dropout": 0.1,
"attention_mode": "sliding_chunks",
"attention_probs_dropout_prob": 0.1,
"attention_window": [
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096,
4096
],
"autoregressive": false,
"bos_token_id": 0,
"classif_dropout": 0.0,
"classifier_dropout": 0.0,
"d_model": 1024,
"decoder_attention_heads": 16,
"decoder_ffn_dim": 4096,
"decoder_layerdrop": 0.0,
"decoder_layers": 16,
"decoder_start_token_id": 0,
"do_blenderbot_90_layernorm": false,
"dropout": 0.1,
"encoder_attention_heads": 16,
"encoder_ffn_dim": 4096,
"encoder_layerdrop": 0.0,
"encoder_layers": 16,
"eos_token_id": 1,
"extra_pos_embeddings": 1,
"force_bos_token_to_be_generated": false,
"forced_eos_token_id": 1,
"gradient_checkpointing": false,
"hidden_dropout_prob": 1e-05,
"id2label": {
"0": "LABEL_0",
"1": "LABEL_1",
"2": "LABEL_2"
},
"init_std": 0.02,
"initializer_range": null,
"is_encoder_decoder": true,
"label2id": {
"LABEL_0": 0,
"LABEL_1": 1,
"LABEL_2": 2
},
"layer_norm_eps": 1e-05,
"length_penalty": 0.6,
"max_decoder_position_embeddings": 512,
"max_encoder_position_embeddings": 4096,
"max_length": 64,
"max_position_embeddings": 512,
"model_type": "pegasus",
"normalize_before": true,
"normalize_embedding": false,
"num_beams": 8,
"num_hidden_layers": 16,
"pad_token_id": 0,
"scale_embedding": true,
"static_position_embeddings": true,
"transformers_version": "4.12.5",
"use_cache": true,
"vocab_size": 96103
}
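The traceback shows that a whole `LongformerPegasusConfig` object ended up where Keras expects the boolean `trainable` argument, which typically happens when a layer constructor signature changed between transformers versions and the config is now bound to the wrong positional parameter. The sketch below reproduces the failure with a simplified stand-in class; `MiniLayer` and `MiniConfig` are hypothetical and only mimic the check in `keras/engine/base_layer.py`, they are not the actual Keras or transformers source.

```python
# Minimal sketch (assumption: simplified stand-ins, not real Keras code)
# of why passing a config object positionally raises this TypeError.

class MiniConfig:
    """Hypothetical stand-in for LongformerPegasusConfig."""
    def __repr__(self):
        return 'LongformerPegasusConfig {"_name_or_path": "google/pegasus-xsum", ...}'

class MiniLayer:
    def __init__(self, trainable=True, name=None):
        # Keras requires `trainable` to be a boolean (or boolean tensor);
        # anything else triggers the TypeError seen in the traceback.
        if not isinstance(trainable, bool):
            raise TypeError(
                f'Expected `trainable` argument to be a boolean, '
                f'but got: {trainable}')
        self._trainable = trainable

# Passing the config as the first positional argument binds it to
# `trainable`, which is exactly what the traceback reports:
try:
    MiniLayer(MiniConfig())
except TypeError as err:
    print(err)
```

If this diagnosis is right, the usual remedies are to pass the config by keyword (e.g. `config=config`) inside the conversion code, or to pin the transformers/Keras versions the conversion script was written against so the positional signatures line up.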