Command used:

```bash
CUDA_VISIBLE_DEVICES=0 python3 run_seq2seq.py \
  --data_dir ./dataset/ \
  --src_file train.tsv \
  --model_type unilm \
  --model_name_or_path ./torch_unilm_model/ \
  --output_dir ./output_dir/ \
  --max_seq_length 512 \
  --max_position_embeddings 512 \
  --do_train \
  --do_lower_case \
  --train_batch_size 32 \
  --learning_rate 1e-5 \
  --num_train_epochs 3
```

Output:

```text
09/26/2020 16:10:34 - INFO - __main__ - device: cuda n_gpu: 1, distributed training: False, 16-bits training: False
09/26/2020 16:10:34 - INFO - transformers.configuration_utils - loading configuration file ./torch_unilm_model/config.json
09/26/2020 16:10:34 - INFO - transformers.configuration_utils - Model config UnilmConfig {
  "_num_labels": 2,
  "architectures": null,
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": null,
  "directionality": "bidi",
  "do_sample": false,
  "early_stopping": false,
  "eos_token_id": null,
  "finetuning_task": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "is_decoder": false,
  "is_encoder_decoder": false,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1
  },
  "layer_norm_eps": 1e-12,
  "length_penalty": 1.0,
  "max_length": 20,
  "max_position_embeddings": 512,
  "min_length": 0,
  "model_type": "",
  "no_repeat_ngram_size": 0,
  "num_attention_heads": 12,
  "num_beams": 1,
  "num_hidden_layers": 12,
  "num_return_sequences": 1,
  "output_attentions": false,
  "output_hidden_states": false,
  "output_past": true,
  "pad_token_id": null,
  "pooler_fc_size": 768,
  "pooler_num_attention_heads": 12,
  "pooler_num_fc_layers": 3,
  "pooler_size_per_head": 128,
  "pooler_type": "first_token_transform",
  "pruned_heads": {},
  "repetition_penalty": 1.0,
  "temperature": 1.0,
  "top_k": 50,
  "top_p": 1.0,
  "torchscript": false,
  "type_vocab_size": 6,
  "use_bfloat16": false,
  "vocab_size": 21128
}
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - Model name './torch_unilm_model/' not found in model shortcut name list (unilm-large-cased, unilm-base-cased). Assuming './torch_unilm_model/' is a path, a model identifier, or url to a directory containing tokenizer files.
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - Didn't find file ./torch_unilm_model/added_tokens.json. We won't load it.
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - Didn't find file ./torch_unilm_model/special_tokens_map.json. We won't load it.
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - Didn't find file ./torch_unilm_model/tokenizer_config.json. We won't load it.
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - loading file ./torch_unilm_model/vocab.txt
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - loading file None
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - loading file None
09/26/2020 16:10:34 - INFO - transformers.tokenization_utils - loading file None
Loading Train Dataset ./dataset/
Segmentation fault (core dumped)
```
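
For what it's worth, the crash comes right after "Loading Train Dataset", so the config and tokenizer load fine and the failure is somewhere in the data-loading path. A minimal sketch to exercise the same pieces outside run_seq2seq.py (the tab-separated src/tgt layout of train.tsv and the BertTokenizer choice are my assumptions, since UniLM ships a BERT-style vocab.txt):

```python
# Hypothetical isolation script, not part of the original report.
from transformers import BertTokenizer

# Load the tokenizer from the same local checkpoint directory the run used.
tokenizer = BertTokenizer.from_pretrained("./torch_unilm_model/")
print("vocab size:", tokenizer.vocab_size)  # expect 21128, as in config.json

# Walk the training file the way a tab-separated reader would.
with open("./dataset/train.tsv", encoding="utf-8") as f:
    for i, line in enumerate(f):
        fields = line.rstrip("\n").split("\t")
        tokens = tokenizer.tokenize(fields[0])
        if i < 3:  # print a few rows to confirm the format parses
            print(len(fields), len(tokens), tokens[:8])
```

If this loops over the whole file cleanly, the segfault more likely comes from a native dependency (e.g., a mismatched PyTorch/CUDA build) than from the data itself.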