Commit 3601e9c

Fix stale learning rate: the code kept using the learning_rate restored from optimizer.pth after starting self-critical training, because utils.set_lr was only called on the decay branch; it is now applied unconditionally.
1 parent 275e22c commit 3601e9c
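For context, a hedged sketch of the failure mode the message describes: in PyTorch, optimizer.load_state_dict restores each param group, including the 'lr' saved at checkpoint time, so a script that resumes from optimizer.pth picks the old rate back up unless something overwrites it every epoch. The opt.start_from path below is an illustrative assumption, not copied from the repo.

    import os
    import torch

    optimizer = torch.optim.Adam(model.parameters(), lr=opt.learning_rate)
    if opt.start_from is not None:
        # load_state_dict restores param_groups, including the saved 'lr' --
        # this is the stale rate the commit message refers to
        optimizer.load_state_dict(
            torch.load(os.path.join(opt.start_from, 'optimizer.pth')))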

File tree

1 file changed: +1 -1 lines changed

train.py

Lines changed: 1 addition & 1 deletion
@@ -89,9 +89,9 @@ def train(opt):
             frac = (epoch - opt.learning_rate_decay_start) // opt.learning_rate_decay_every
             decay_factor = opt.learning_rate_decay_rate ** frac
             opt.current_lr = opt.learning_rate * decay_factor
-            utils.set_lr(optimizer, opt.current_lr) # set the decayed rate
         else:
             opt.current_lr = opt.learning_rate
+        utils.set_lr(optimizer, opt.current_lr)
         # Assign the scheduled sampling prob
         if epoch > opt.scheduled_sampling_start and opt.scheduled_sampling_start >= 0:
             frac = (epoch - opt.scheduled_sampling_start) // opt.scheduled_sampling_increase_every
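After the change, utils.set_lr runs on every epoch rather than only when decay is active. A minimal sketch of the intent, assuming set_lr simply writes the rate into each param group (the helper body here is an assumption, not the repo's code):

    def set_lr(optimizer, lr):
        # write the scheduled rate into every param group, clobbering any
        # value restored from optimizer.pth
        for group in optimizer.param_groups:
            group['lr'] = lr

    if epoch > opt.learning_rate_decay_start and opt.learning_rate_decay_start >= 0:
        # step decay: scale the base rate by decay_rate every decay_every epochs
        frac = (epoch - opt.learning_rate_decay_start) // opt.learning_rate_decay_every
        opt.current_lr = opt.learning_rate * opt.learning_rate_decay_rate ** frac
    else:
        opt.current_lr = opt.learning_rate
    set_lr(optimizer, opt.current_lr)  # applied unconditionally after the fix

Moving the call below the if/else means the else branch (before decay starts) also resets the optimizer to opt.learning_rate, which is exactly the case that was broken when resuming for self-critical training.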

0 commit comments