I was deploying your model on Google Colab. With the repo cloned, I ran the command below:
!python main.py --mode train --attention_len 16 --batch_size 32 --data_set muse --dropout 0.2 --learning_rate 1e-5 --model_dir ./models/model --num_epochs 40 --num_layers 3 --num_units 338
Everything seemed fine up to that point, but training gets stuck and I don't know why. T_T
https://colab.research.google.com/drive/1jaHTWg637wkD35C5LC5rRjej_G95L6tV?usp=sharing
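One common reason training appears frozen on Colab is that no GPU is attached to the runtime, so the job silently falls back to CPU and crawls. As a first diagnostic (a sketch, assuming the repo uses TensorFlow; the `visible_gpus` helper is hypothetical, not part of the repo), you could run this in a Colab cell:

```python
def visible_gpus():
    """Return the GPU devices TensorFlow can see, or [] if TF is absent."""
    try:
        import tensorflow as tf
    except ImportError:
        # TensorFlow not installed in this environment.
        return []
    # In TF 2.x this lists physical GPU devices; empty means CPU-only training.
    return tf.config.list_physical_devices("GPU")

print(visible_gpus())
```

If this prints an empty list, switch the runtime type to GPU (Runtime → Change runtime type) before retrying.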
Hi, I have the same problem. Have you found a solution?