Hi there!
I have been stuck on this for days.
ubuntu 19.04 (tried 18.04 also)
NVIDIA-SMI 418.74 Driver Version: 418.74
nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2018 NVIDIA Corporation
Built on Sat_Aug_25_21:08:01_CDT_2018
Cuda compilation tools, release 10.0, V10.0.130
Anaconda
Tried Python 3.7 and now 3.6 (update: 3.5 does not work either)
Tried with Apex and now without
conda list
packages in environment at /home/andreas/anaconda3/envs/pytorchbert:
Name Version Build Channel
atomicwrites 1.3.0 pypi_0 pypi
attrs 19.1.0 pypi_0 pypi
blas 1.0 mkl
blis 0.2.4 pypi_0 pypi
boto3 1.9.162 pypi_0 pypi
botocore 1.12.162 pypi_0 pypi
bzip2 1.0.6 h14c3975_5
ca-certificates 2019.5.15 0
certifi 2019.3.9 py36_0
cffi 1.12.3 py36h2e261b9_0
chardet 3.0.4 pypi_0 pypi
cmake 3.14.0 h52cb24c_0
cudatoolkit 10.0.130 0
cudnn 7.6.0 cuda10.0_0 anaconda
cymem 2.0.2 pypi_0 pypi
docutils 0.14 pypi_0 pypi
en-core-web-sm 2.1.0 pypi_0 pypi
expat 2.2.6 he6710b0_0
freetype 2.9.1 h8a8886c_1
ftfy 5.5.1 pypi_0 pypi
google-pasta 0.1.7 pypi_0 pypi
idna 2.8 pypi_0 pypi
importlib-metadata 0.17 pypi_0 pypi
intel-openmp 2019.3 199
jmespath 0.9.4 pypi_0 pypi
joblib 0.13.2 pypi_0 pypi
jpeg 9b h024ee3a_2
jsonschema 3.0.1 pypi_0 pypi
krb5 1.16.1 h173b8e3_7
libcurl 7.64.1 h20c2e04_0
libedit 3.1.20181209 hc058e9b_0
libffi 3.2.1 hd88cf55_4
libgcc-ng 8.2.0 hdf63c60_1
libgfortran-ng 7.3.0 hdf63c60_0
libpng 1.6.37 hbc83047_0
libssh2 1.8.2 h1ba5d50_0
libstdcxx-ng 8.2.0 hdf63c60_1
libtiff 4.0.10 h2733197_2
mkl 2019.3 199
mkl-include 2019.3 199
mkl_fft 1.0.12 py36ha843d7b_0
mkl_random 1.0.2 py36hd81dba3_0
more-itertools 7.0.0 pypi_0 pypi
murmurhash 1.0.2 pypi_0 pypi
ncurses 6.1 he6710b0_1
ninja 1.9.0 py36hfd86e86_0
numpy 1.16.4 py36h7e9f1db_0
numpy-base 1.16.4 py36hde5b4d6_0
olefile 0.46 py36_0
openssl 1.1.1c h7b6447c_1
packaging 19.0 pypi_0 pypi
pandas 0.24.2 py36he6710b0_0
pillow 6.0.0 py36h34e0f95_0
pip 19.1.1 py36_0
plac 0.9.6 pypi_0 pypi
pluggy 0.12.0 pypi_0 pypi
preshed 2.0.1 pypi_0 pypi
py 1.8.0 pypi_0 pypi
pycparser 2.19 py36_0
pyparsing 2.4.0 pypi_0 pypi
pyrsistent 0.15.2 pypi_0 pypi
pytest 4.6.2 pypi_0 pypi
python 3.6.8 h0371630_0
python-dateutil 2.8.0 py36_0
pytorch 1.1.0 py3.6_cuda10.0.130_cudnn7.5.1_0 pytorch
pytz 2019.1 py_0
readline 7.0 h7b6447c_5
regex 2019.6.5 pypi_0 pypi
requests 2.22.0 pypi_0 pypi
rhash 1.3.8 h1ba5d50_0
s3transfer 0.2.1 pypi_0 pypi
scikit-learn 0.21.2 pypi_0 pypi
scipy 1.2.1 py36h7c811a0_0
setuptools 41.0.1 py36_0
six 1.12.0 py36_0
sklearn 0.0 pypi_0 pypi
spacy 2.1.4 pypi_0 pypi
sqlite 3.28.0 h7b6447c_0
srsly 0.0.5 pypi_0 pypi
tb-nightly 1.14.0a20190605 pypi_0 pypi
tf-estimator-nightly 1.14.0.dev2019060601 pypi_0 pypi
tf-nightly-gpu 1.14.1.dev20190606 pypi_0 pypi
thinc 7.0.4 pypi_0 pypi
tk 8.6.8 hbc83047_0
torch 1.1.0 pypi_0 pypi
torchvision 0.3.0 py36_cu10.0.130_1 pytorch
tqdm 4.32.1 pypi_0 pypi
urllib3 1.25.3 pypi_0 pypi
wasabi 0.2.2 pypi_0 pypi
wcwidth 0.1.7 pypi_0 pypi
wheel 0.33.4 py36_0
wrapt 1.11.1 pypi_0 pypi
xz 5.2.4 h14c3975_4
yaml 0.1.7 had09818_2
zipp 0.5.1 pypi_0 pypi
zlib 1.2.11 h7b6447c_3
zstd 1.3.7 h0b5b093_0
Test run (everything passes or is skipped):
python -m pytest -sv tests/
tests/modeling_gpt2_test.py::GPT2ModelTest::test_config_to_json_file PASSED
tests/modeling_gpt2_test.py::GPT2ModelTest::test_config_to_json_string PASSED
tests/modeling_gpt2_test.py::GPT2ModelTest::test_default PASSED
tests/modeling_gpt2_test.py::GPT2ModelTest::test_model_from_pretrained SKIPPED
tests/modeling_openai_test.py::OpenAIGPTModelTest::test_config_to_json_file PASSED
tests/modeling_openai_test.py::OpenAIGPTModelTest::test_config_to_json_string PASSED
tests/modeling_openai_test.py::OpenAIGPTModelTest::test_default PASSED
tests/modeling_openai_test.py::OpenAIGPTModelTest::test_model_from_pretrained SKIPPED
tests/modeling_test.py::BertModelTest::test_config_to_json_file PASSED
tests/modeling_test.py::BertModelTest::test_config_to_json_string PASSED
tests/modeling_test.py::BertModelTest::test_default PASSED
tests/modeling_test.py::BertModelTest::test_model_from_pretrained SKIPPED
tests/modeling_transfo_xl_test.py::TransfoXLModelTest::test_config_to_json_file PASSED
tests/modeling_transfo_xl_test.py::TransfoXLModelTest::test_config_to_json_string PASSED
tests/modeling_transfo_xl_test.py::TransfoXLModelTest::test_default PASSED
tests/modeling_transfo_xl_test.py::TransfoXLModelTest::test_model_from_pretrained SKIPPED
tests/optimization_test.py::OptimizationTest::test_adam PASSED
tests/optimization_test.py::ScheduleInitTest::test_bert_sched_init PASSED
tests/optimization_test.py::ScheduleInitTest::test_openai_sched_init PASSED
tests/optimization_test.py::WarmupCosineWithRestartsTest::test_it [0. 0. 0. 0. 0.]
[1. 1. 1. 1. 1.]
PASSED
tests/tokenization_gpt2_test.py::GPT2TokenizationTest::test_full_tokenizer PASSED
100%|███████████████████████████████████████| 1042301/1042301 [00:01<00:00, 741907.79B/s]
100%|█████████████████████████████████████████| 456318/456318 [00:00<00:00, 704099.11B/s]
PASSED
tests/tokenization_openai_test.py::OpenAIGPTTokenizationTest::test_full_tokenizer PASSED
tests/tokenization_openai_test.py::OpenAIGPTTokenizationTest::test_tokenizer_from_pretrained SKIPPED
tests/tokenization_test.py::TokenizationTest::test_basic_tokenizer_lower PASSED
tests/tokenization_test.py::TokenizationTest::test_basic_tokenizer_no_lower PASSED
tests/tokenization_test.py::TokenizationTest::test_chinese PASSED
tests/tokenization_test.py::TokenizationTest::test_full_tokenizer PASSED
tests/tokenization_test.py::TokenizationTest::test_is_control PASSED
tests/tokenization_test.py::TokenizationTest::test_is_punctuation PASSED
tests/tokenization_test.py::TokenizationTest::test_is_whitespace PASSED
tests/tokenization_test.py::TokenizationTest::test_tokenizer_from_pretrained SKIPPED
tests/tokenization_test.py::TokenizationTest::test_wordpiece_tokenizer PASSED
tests/tokenization_transfo_xl_test.py::TransfoXLTokenizationTest::test_full_tokenizer building vocab from /tmp/transfo_xl_tokenizer_test.txt
final vocab size 9
PASSED
tests/tokenization_transfo_xl_test.py::TransfoXLTokenizationTest::test_full_tokenizer_lower PASSED
tests/tokenization_transfo_xl_test.py::TransfoXLTokenizationTest::test_full_tokenizer_no_lower PASSED
tests/tokenization_transfo_xl_test.py::TransfoXLTokenizationTest::test_tokenizer_from_pretrained SKIPPED
=================================== warnings summary ====================================
/home/andreas/anaconda3/envs/pytorchbert/lib/python3.6/site-packages/_pytest/mark/structures.py:337
/home/andreas/anaconda3/envs/pytorchbert/lib/python3.6/site-packages/_pytest/mark/structures.py:337: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
PytestUnknownMarkWarning,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
Script used:
export GLUE_DIR=/data/glue_data
export TASK_NAME=MRPC
python run_classifier.py \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --do_lower_case \
  --data_dir $GLUE_DIR/$TASK_NAME \
  --bert_model bert-base-uncased \
  --max_seq_length 128 \
  --train_batch_size 32 \
  --learning_rate 2e-5 \
  --num_train_epochs 3.0 \
  --output_dir /tmp/$TASK_NAME/
06/05/2019 12:06:17 - INFO - main - device: cuda n_gpu: 2, distributed training: False, 16-bits training: False
06/05/2019 12:06:17 - INFO - pytorch_pretrained_bert.tokenization - loading vocabulary file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /home/andreas/.pytorch_pretrained_bert/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
06/05/2019 12:06:17 - INFO - main - LOOKING AT /data/glue_data/MRPC/train.tsv
06/05/2019 12:06:18 - INFO - pytorch_pretrained_bert.modeling - loading archive file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz from cache at /home/andreas/.pytorch_pretrained_bert/distributed_-1/9c41111e2de84547a463fd39217199738d1e3deb72d4fec4399e6e241983c6f0.ae3cef932725ca7a30cdcb93fc6e09150a55e2a130ec7af63975a16c153ae2ba
06/05/2019 12:06:18 - INFO - pytorch_pretrained_bert.modeling - extracting archive file /home/andreas/.pytorch_pretrained_bert/distributed_-1/9c41111e2de84547a463fd39217199738d1e3deb72d4fec4399e6e241983c6f0.ae3cef932725ca7a30cdcb93fc6e09150a55e2a130ec7af63975a16c153ae2ba to temp dir /tmp/tmp_0dlskh7
06/05/2019 12:06:21 - INFO - pytorch_pretrained_bert.modeling - Model config {
"attention_probs_dropout_prob": 0.1,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"max_position_embeddings": 512,
"num_attention_heads": 12,
"num_hidden_layers": 12,
"type_vocab_size": 2,
"vocab_size": 30522
}
06/05/2019 12:06:23 - INFO - pytorch_pretrained_bert.modeling - Weights of BertForSequenceClassification not initialized from pretrained model: ['classifier.weight', 'classifier.bias']
06/05/2019 12:06:23 - INFO - pytorch_pretrained_bert.modeling - Weights from pretrained model not used in BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']
06/05/2019 12:06:26 - INFO - main - Writing example 0 of 3668
06/05/2019 12:06:26 - INFO - main - *** Example ***
06/05/2019 12:06:26 - INFO - main - guid: train-1
06/05/2019 12:06:26 - INFO - main - tokens: [CLS] am ##ro ##zi accused his brother , whom he called " the witness " , of deliberately di ##stor ##ting his evidence . [SEP] referring to him as only " the witness " , am ##ro ##zi accused his brother of deliberately di ##stor ##ting his evidence . [SEP]
06/05/2019 12:06:26 - INFO - main - input_ids: 101 2572 3217 5831 5496 2010 2567 1010 3183 2002 2170 1000 1996 7409 1000 1010 1997 9969 4487 23809 3436 2010 3350 1012 102 7727 2000 2032 2004 2069 1000 1996 7409 1000 1010 2572 3217 5831 5496 2010 2567 1997 9969 4487 23809 3436 2010 3350 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - label: 1 (id = 1)
06/05/2019 12:06:26 - INFO - main - *** Example ***
06/05/2019 12:06:26 - INFO - main - guid: train-2
06/05/2019 12:06:26 - INFO - main - tokens: [CLS] yu ##ca ##ip ##a owned dominic ##k ' s before selling the chain to safe ##way in 1998 for $ 2 . 5 billion . [SEP] yu ##ca ##ip ##a bought dominic ##k ' s in 1995 for $ 69 ##3 million and sold it to safe ##way for $ 1 . 8 billion in 1998 . [SEP]
06/05/2019 12:06:26 - INFO - main - input_ids: 101 9805 3540 11514 2050 3079 11282 2243 1005 1055 2077 4855 1996 4677 2000 3647 4576 1999 2687 2005 1002 1016 1012 1019 4551 1012 102 9805 3540 11514 2050 4149 11282 2243 1005 1055 1999 2786 2005 1002 6353 2509 2454 1998 2853 2009 2000 3647 4576 2005 1002 1015 1012 1022 4551 1999 2687 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - label: 0 (id = 0)
06/05/2019 12:06:26 - INFO - main - *** Example ***
06/05/2019 12:06:26 - INFO - main - guid: train-3
06/05/2019 12:06:26 - INFO - main - tokens: [CLS] they had published an advertisement on the internet on june 10 , offering the cargo for sale , he added . [SEP] on june 10 , the ship ' s owners had published an advertisement on the internet , offering the explosives for sale . [SEP]
06/05/2019 12:06:26 - INFO - main - input_ids: 101 2027 2018 2405 2019 15147 2006 1996 4274 2006 2238 2184 1010 5378 1996 6636 2005 5096 1010 2002 2794 1012 102 2006 2238 2184 1010 1996 2911 1005 1055 5608 2018 2405 2019 15147 2006 1996 4274 1010 5378 1996 14792 2005 5096 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - label: 1 (id = 1)
06/05/2019 12:06:26 - INFO - main - *** Example ***
06/05/2019 12:06:26 - INFO - main - guid: train-4
06/05/2019 12:06:26 - INFO - main - tokens: [CLS] around 03 ##35 gm ##t , tab shares were up 19 cents , or 4 . 4 % , at a $ 4 . 56 , having earlier set a record high of a $ 4 . 57 . [SEP] tab shares jumped 20 cents , or 4 . 6 % , to set a record closing high at a $ 4 . 57 . [SEP]
06/05/2019 12:06:26 - INFO - main - input_ids: 101 2105 6021 19481 13938 2102 1010 21628 6661 2020 2039 2539 16653 1010 2030 1018 1012 1018 1003 1010 2012 1037 1002 1018 1012 5179 1010 2383 3041 2275 1037 2501 2152 1997 1037 1002 1018 1012 5401 1012 102 21628 6661 5598 2322 16653 1010 2030 1018 1012 1020 1003 1010 2000 2275 1037 2501 5494 2152 2012 1037 1002 1018 1012 5401 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - label: 0 (id = 0)
06/05/2019 12:06:26 - INFO - main - *** Example ***
06/05/2019 12:06:26 - INFO - main - guid: train-5
06/05/2019 12:06:26 - INFO - main - tokens: [CLS] the stock rose $ 2 . 11 , or about 11 percent , to close friday at $ 21 . 51 on the new york stock exchange . [SEP] pg & e corp . shares jumped $ 1 . 63 or 8 percent to $ 21 . 03 on the new york stock exchange on friday . [SEP]
06/05/2019 12:06:26 - INFO - main - input_ids: 101 1996 4518 3123 1002 1016 1012 2340 1010 2030 2055 2340 3867 1010 2000 2485 5958 2012 1002 2538 1012 4868 2006 1996 2047 2259 4518 3863 1012 102 18720 1004 1041 13058 1012 6661 5598 1002 1015 1012 6191 2030 1022 3867 2000 1002 2538 1012 6021 2006 1996 2047 2259 4518 3863 2006 5958 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
06/05/2019 12:06:26 - INFO - main - label: 1 (id = 1)
06/05/2019 12:06:28 - INFO - main - ***** Running training *****
06/05/2019 12:06:28 - INFO - main - Num examples = 3668
06/05/2019 12:06:28 - INFO - main - Batch size = 32
06/05/2019 12:06:28 - INFO - main - Num steps = 342
Epoch: 0%| | 0/3 [00:00<?, ?it/s]
At this point the script hangs.
Once, after pressing Ctrl-C twice, I got this error:
threading.py", line 1048, in _wait_for_tstate_lock elif lock.acquire(block, timeout):
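For what it's worth, that traceback frame is the generic signature of the main thread blocking in Thread.join(): join() waits on an internal lock inside _wait_for_tstate_lock, so it only shows that some worker thread (e.g. a DataLoader or backend worker) never finished, not why. A minimal sketch of the same blocking behavior, using only the stdlib (no CUDA involved; the worker here is just a stand-in for whatever thread is stuck):

```python
import threading

def worker(stop_event):
    # Simulates a worker thread that never makes progress
    # (a stand-in for a stuck data-loading or backend thread).
    stop_event.wait()

stop = threading.Event()
t = threading.Thread(target=worker, args=(stop,))
t.start()

# Thread.join() blocks inside _wait_for_tstate_lock -- the exact frame
# seen in the Ctrl-C traceback above. A timeout avoids hanging forever.
t.join(timeout=0.5)
print(t.is_alive())  # True: join() timed out because the worker is still blocked

stop.set()   # release the worker
t.join()     # now join() returns immediately
```

A bare t.join() with no timeout would reproduce the hang; interrupting it with Ctrl-C then shows the same lock.acquire(block, timeout) line from _wait_for_tstate_lock.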
I should mention that I am usually a Windows user and only recently installed Ubuntu to practice machine learning.
Best regards
Andreas