

What is the logprobs in line 108 of the ShowTellModel? #1

Closed
eduOS opened this issue Jun 9, 2017 · 2 comments

eduOS commented Jun 9, 2017

Hi, @ruotianluo
I got stuck on this line because I cannot find `logprobs` defined in either torch.autograd or the current file.

Is it a typo, or am I missing something?

Best regards,
Lerner

ruotianluo (Owner) commented

It should be correct. What error did you get?


eduOS commented Jun 9, 2017

You're right, it is defined at every step of the loop. My code checker flagged it as an error because the variable is not defined before its first use in the loop body (perhaps that pattern is disallowed in Python 3). It's clear to me now. Thanks for your reply.
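The pattern being discussed can be illustrated with a minimal sketch (this is hypothetical example code, not the repository's actual `ShowTellModel`): `logprobs` is assigned at the end of every loop iteration and read at the start of the next one, which is valid Python but can confuse static checkers into reporting a use-before-definition. The example uses a hand-rolled log-softmax so it needs no PyTorch.

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax over a list of raw scores.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - log_sum for x in logits]

def greedy_decode(step_logits):
    """Toy greedy decoding loop. `logprobs` is (re)assigned on every
    iteration and read at the start of the *next* one -- the pattern
    a static checker may flag as 'used before definition', even
    though the `t >= 1` guard makes it safe."""
    seq, seq_logprobs = [], []
    for t, logits in enumerate(step_logits):
        if t >= 1:
            # Pick the token from the logprobs computed last step.
            token = max(range(len(logprobs)), key=lambda i: logprobs[i])
            seq.append(token)
            seq_logprobs.append(logprobs[token])
        logprobs = log_softmax(logits)  # defined here, every step
    return seq, seq_logprobs
```

Because the first read is guarded by `t >= 1`, `logprobs` is always bound before it is accessed; a linter that analyzes names lexically rather than by control flow cannot see that.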

eduOS closed this as completed Jun 9, 2017
ruotianluo added a commit that referenced this issue Apr 9, 2019
* transformer:
  Further towards AttModel: remove the use of get_logprobs
  clean up the code, and make transformer as a subclass of AttModel.
  Fix #1
  Update readme for transformer
  [0.5]Match the new arange behavior.
  Add reduce on plateau.
  Use the original options to represent hyperparameters in transformer
  Update to previous framework of sample and sample_beam
  uncomment the original sample.
  formatting and remove variable.
  Add noamopt options.
  Add transformer

# Conflicts:
#	misc/utils.py
#	models/__init__.py
#	train.py
ruotianluo added a commit that referenced this issue Apr 10, 2019
ruotianluo added a commit that referenced this issue Apr 11, 2019
linzhlalala pushed a commit to linzhlalala/self-critical.pytorch that referenced this issue Feb 23, 2021