What is the logprobs in line 108 of the ShowTellModel? #1
Comments
It should be correct. What error did you get?
You're right, it is defined at every step. The code checker warns that something is wrong here because the variable is not defined ahead of the loop (maybe that is not allowed in Python 3). I am clear about this now. Thanks for your reply.
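The pattern being discussed can be illustrated with a minimal sketch (this is a hypothetical `sample_greedy` helper, not the actual ShowTellModel code): `logprobs` is first assigned inside the sampling loop, so a static checker sees no definition before the loop body and may flag it as possibly undefined, even though it is re-bound on every iteration before use.

```python
# Hypothetical sketch of the pattern under discussion, not the repo's code:
# a variable ("logprobs") assigned anew on each loop iteration, which some
# static checkers flag as "possibly undefined" before the loop.
import torch
import torch.nn.functional as F

def sample_greedy(step_logits, seq_len=3):
    """step_logits(t) returns (batch, vocab) logits for decoding step t."""
    seq, seq_logprobs = [], []
    for t in range(seq_len):
        logits = step_logits(t)
        logprobs = F.log_softmax(logits, dim=1)        # defined each step
        sample_logprob, it = torch.max(logprobs, dim=1)  # greedy argmax
        seq.append(it)
        seq_logprobs.append(sample_logprob)
    return torch.stack(seq, 1), torch.stack(seq_logprobs, 1)
```

Because `logprobs` is unconditionally re-assigned at the top of each iteration, the code is correct at runtime; the checker's warning is a false positive here.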
Closed
ruotianluo added commits that referenced this issue on Apr 9, Apr 10, and Apr 11, 2019, all carrying the same squashed message:

transformer: Further towards AttModel: remove the use of get_logprobs, clean up the code, and make transformer a subclass of AttModel. Fix #1. Update readme for transformer. [0.5] Match the new arange behavior. Add reduce on plateau. Use the original options to represent hyperparameters in transformer. Update to the previous framework of sample and sample_beam. Uncomment the original sample. Formatting and remove variable. Add noamopt options. Add transformer. (Conflicts: misc/utils.py, models/__init__.py, train.py)
linzhlalala pushed commits to linzhlalala/self-critical.pytorch that referenced this issue on Feb 23, 2021, carrying the same squashed message (with the fix recorded as ruotianluo#1).
Original issue text:

Hi @ruotianluo,
I got stuck on this line since I cannot find logprobs in either torch.autograd or the current file. Is it a typo, or am I missing something?
Best regards,
Lerner