What is the learning rate when starting self-critical training? #26
Comments
The learning rate will be updated according to the argument.

Thank you. But I am still puzzled about which learning rate is used in self-critical training: the 5e-5, or the one stored in optimizer.pth.

I take it back. After reviewing my code, you are correct. This is definitely a bug; the learning rate will be what is saved in the optimizer.

I fixed it in the latest commit. Thank you for pointing this out.
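For concreteness, here is a minimal sketch of the kind of fix discussed above (not the repository's exact code; the `opt` namespace and the `build_optimizer` helper are hypothetical): after restoring the optimizer state from optimizer.pth, overwrite the learning rate in every param group with the value passed on the command line, so that `--learning_rate 5e-5` is not silently discarded.

```python
import os
import torch

def build_optimizer(params, opt):
    # Hypothetical helper illustrating the fix: create the optimizer with
    # the requested learning rate, then restore saved state if present.
    optimizer = torch.optim.Adam(params, lr=opt.learning_rate)
    ckpt = os.path.join(opt.start_from, 'optimizer.pth') if opt.start_from else None
    if ckpt is not None and os.path.isfile(ckpt):
        optimizer.load_state_dict(torch.load(ckpt))
        # load_state_dict restores the old 'lr' from the checkpoint, so
        # reset it to the freshly requested one for self-critical training.
        for group in optimizer.param_groups:
            group['lr'] = opt.learning_rate
    return optimizer
```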
ruotianluo added a commit that referenced this issue on Feb 14, 2019:

> …tical_bottom_up * commit '510b0e02d1fcf43a5281bebb147ca1bce5db45f1': Fix a typo in FCModel _sample; Remove att2in dependency and fix typo; Fix #26; fix bug when lang_stats not set to 1; Only initialize cider_score at the first time.
linzhlalala pushed a commit to linzhlalala/self-critical.pytorch that referenced this issue on Feb 23, 2021:

> The code was using the learning_rate from optimizer.pth after starting self-critical training.
linzhlalala pushed a commit to linzhlalala/self-critical.pytorch that referenced this issue on Feb 23, 2021:

> …tical_bottom_up * commit '510b0e02d1fcf43a5281bebb147ca1bce5db45f1': Fix a typo in FCModel _sample; Remove att2in dependency and fix typo; Fix ruotianluo#26; fix bug when lang_stats not set to 1; Only initialize cider_score at the first time.
Hi, ruotian. Thank you for your work. I am a bit confused about the learning rate (5e-5) for self-critical training.

First, I train without self-critical, setting

```
--learning_rate 5e-4 --learning_rate_decay_start 0 --scheduled_sampling_start 0 --max_epochs 30
```

as you describe. Then, after 30 epochs, I train with self-critical using `--learning_rate 5e-5 --start_from log_fc_rl`, with no `learning_rate_decay_start` and no `scheduled_sampling_start`. In train.py, I find (self-critical.pytorch/train.py, line 82 in 275e22c):

```python
if vars(opt).get('start_from', None) is not None and os.path.isfile(os.path.join(opt.start_from, "optimizer.pth")):
    optimizer.load_state_dict(torch.load(os.path.join(opt.start_from, 'optimizer.pth')))
```

So does this mean that the learning_rate of 5e-5 is discarded, and that it is unnecessary to set this parameter when training with self-critical?
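To see why this discards the new value, here is a minimal, self-contained demonstration (a hypothetical two-stage setup, not the repository's code) that `torch.optim.Optimizer.load_state_dict` restores the param groups from the checkpoint, including their `lr`, which is exactly how the saved learning rate overrides the newly requested one:

```python
import torch

param = torch.nn.Parameter(torch.zeros(1))

# Stage 1: cross-entropy training with lr 5e-4, then save the optimizer.
opt_xe = torch.optim.Adam([param], lr=5e-4)
torch.save(opt_xe.state_dict(), 'optimizer.pth')

# Stage 2: self-critical training, requesting lr 5e-5 on the command line.
opt_sc = torch.optim.Adam([param], lr=5e-5)
opt_sc.load_state_dict(torch.load('optimizer.pth'))

print(opt_sc.param_groups[0]['lr'])  # 0.0005 -- the saved lr, not 5e-05
```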