Why is the rollout policy updated in this way? #29
Comments
I am also wondering why there should be a delay. But to make it the same as in the paper, you can just set the update_rate to 1.
I also noticed this; the update for the rollout seems to take the form of a convex combination of the parameters of the rollout and generator networks. I wonder what the justification for such an update is.
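For concreteness, here is a minimal sketch of the update being discussed, assuming it has the convex-combination form described above (function and parameter names are illustrative, not taken from the repo):

```python
# Minimal sketch of the rollout update discussed above (an assumed
# reconstruction; names are illustrative, not quoted from rollout.py).
def soft_update(rollout_params, generator_params, update_rate):
    """Move each rollout parameter toward the matching generator parameter:
        w_rollout <- update_rate * w_rollout + (1 - update_rate) * w_generator
    update_rate = 0 copies the generator exactly; values near 1 make the
    rollout network lag behind the generator.
    """
    return [update_rate * w_r + (1.0 - update_rate) * w_g
            for w_r, w_g in zip(rollout_params, generator_params)]
```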
@eduOS To make it the same as in the paper, why set the update_rate to 1? Shouldn't it be set to 0?
@gcbanana You are right. @vanpersie32 I learned that this trick is a regularization method, the so-called weight decay. I'd like you to see this: #21
I had the same question. I don't think this is weight decay, because it's not applied to the gradients, and it doesn't decay the rollout network's weights toward zero. Rather, it updates them in a way that maintains an exponential moving average of the generator network's weights. I recently found a reinforcement learning paper that did the same thing in a different context. In their case they weren't using a rollout network, but the motivation here may be similar. References:
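A quick numerical check of the moving-average reading (a sketch; the value of `b` and the weight history are made up for illustration): unrolling the update shows the rollout weights become an exponentially weighted average of past generator weights, with no pull toward zero.

```python
import numpy as np

# Sketch: unroll w <- b*w + (1-b)*g over a made-up generator weight history.
# With w_0 = 0, after t steps: w_t = (1-b) * sum_k b**k * g_{t-k},
# i.e. an exponential moving average of the generator's weights.
b = 0.8                                       # update_rate (illustrative)
history = [np.array([1.0]), np.array([2.0]), np.array([3.0])]
w = np.array([0.0])                           # initial rollout weight
for g in history:
    w = b * w + (1.0 - b) * g
closed_form = sum((1.0 - b) * b**k * g
                  for k, g in enumerate(reversed(history)))
assert np.allclose(w, closed_form)            # EMA, not decay toward zero
```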
@lucaslingle You are right. Closing the issue.
This is a trick for stabilizing the training process; setting the parameters of the rollout network to be the same as the generator's degrades the performance of SeqGAN.
In fact, it is the same as L2 regularization. It keeps the weights small and hence stable, as stated in other comments.
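For comparison, here is what the two interpretations in this thread look like side by side (a sketch with made-up names and constants; note the two updates only coincide when the generator's weights are zero):

```python
# Sketch contrasting the two interpretations above (illustrative names).
lam, b = 0.01, 0.8

def weight_decay_step(w, lam=lam):
    # L2 regularization / weight decay: shrink the weights toward ZERO.
    return (1.0 - lam) * w

def rollout_soft_update(w_rollout, w_generator, b=b):
    # The update under discussion: shrink the rollout weights toward
    # the GENERATOR's current weights, not toward zero.
    return b * w_rollout + (1.0 - b) * w_generator
```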
According to the paper, the rollout policy is the same as the generator policy, so we should have self.Wi = self.lstm.Wi. But in the code the parameters of the rollout policy are updated in a different way here. Can you please explain why? Thank you very much @LantaoYu @wnzhang
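For reference, the two alternatives being compared would look roughly like this (a sketch following the identifiers in the question; not a verbatim quote of rollout.py, whose exact code may differ):

```python
import tensorflow as tf  # the repo is TensorFlow-based

class Rollout:
    # Sketch using the names from the question above.
    def __init__(self, lstm, update_rate):
        self.lstm = lstm
        self.update_rate = update_rate
        # What the paper describes: the rollout policy IS the generator
        # policy, i.e. an exact copy of its parameters.
        self.Wi = tf.identity(self.lstm.Wi)

    def update_params(self):
        # What the repo does instead: a soft update, so the rollout
        # network tracks a moving average of the generator rather than
        # copying it outright.
        self.Wi = self.update_rate * self.Wi \
            + (1 - self.update_rate) * tf.identity(self.lstm.Wi)
```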