Commit 486dca0
The code was using the learning rate restored from optimizer.pth after starting self-critical training.
ruotianluo authored Dec 19, 2017
1 parent 275e22c commit 486dca0
Showing 1 changed file with 1 addition and 1 deletion.
train.py
@@ -89,9 +89,9 @@ def train(opt):
             frac = (epoch - opt.learning_rate_decay_start) // opt.learning_rate_decay_every
             decay_factor = opt.learning_rate_decay_rate ** frac
             opt.current_lr = opt.learning_rate * decay_factor
-            utils.set_lr(optimizer, opt.current_lr) # set the decayed rate
         else:
             opt.current_lr = opt.learning_rate
+        utils.set_lr(optimizer, opt.current_lr)
         # Assign the scheduled sampling prob
         if epoch > opt.scheduled_sampling_start and opt.scheduled_sampling_start >= 0:
             frac = (epoch - opt.scheduled_sampling_start) // opt.scheduled_sampling_increase_every
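
For context on why moving one call matters: when training resumes, the optimizer state is restored from optimizer.pth, and that state carries the learning rate that was active when the checkpoint was saved. Before this commit, utils.set_lr ran only inside the decay branch, so whenever that branch was skipped (for example, with learning_rate_decay_start < 0), the restored rate could silently stay in effect, which is presumably what the commit message describes happening after self-critical training started. Below is a minimal sketch of the failure mode; the set_lr body is assumed to follow the usual PyTorch param-group idiom, not copied verbatim from utils.py.

import torch

def set_lr(optimizer, lr):
    # Assumed to mirror utils.set_lr: write the rate into every param group.
    for group in optimizer.param_groups:
        group['lr'] = lr

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)

# Simulate resuming: optimizer.pth stores the param groups, including the
# lr that was active when the checkpoint was written.
checkpoint = optimizer.state_dict()      # stands in for torch.load('optimizer.pth')
optimizer.load_state_dict(checkpoint)    # restores the stale 5e-4

scheduled_lr = 5e-5                      # what opt.current_lr should be now
decay_started = False                    # e.g. learning_rate_decay_start < 0

# Before the fix, set_lr ran only in the decay branch, so with the branch
# skipped the restored 5e-4 silently remained in force.
if decay_started:
    set_lr(optimizer, scheduled_lr)

# After the fix, the scheduled rate is applied unconditionally each epoch.
set_lr(optimizer, scheduled_lr)
assert all(g['lr'] == scheduled_lr for g in optimizer.param_groups)

Hoisting set_lr out of the if/else means the optimizer's live rate is reasserted on every pass through the training loop, so whatever rate a loaded checkpoint brought along is overwritten the next time the epoch check runs.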
