support EMA for fp32/fp64 #60

Merged · 1 commit merged into main on Jul 17, 2024
Conversation

leasunhy (Collaborator)

No description provided.

leasunhy requested a review from guolinke on June 28, 2024 at 09:47
else:
    with torch.no_grad():
        for n, p in new_param:
            if n in self.name2param:
guolinke (Member) commented on Jul 1, 2024:

What if n is not in self.name2param?

leasunhy (Collaborator, Author):

I guess parameters not in self.name2param are those with requires_grad=False.
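For context, a minimal sketch of the EMA update being discussed, assuming an EMA helper that tracks only trainable parameters in a name2param dict. This is illustrative, not the repository's actual code; the class name, decay value, and update signature are assumptions:

import torch

class EMA:
    # Illustrative EMA helper; assumes fp32/fp64 parameters and a fixed decay.
    def __init__(self, model, decay=0.999):
        self.decay = decay
        # Track only trainable parameters, keyed by name.
        self.name2param = {
            n: p.detach().clone()
            for n, p in model.named_parameters()
            if p.requires_grad
        }

    def update(self, new_param):
        # new_param: iterable of (name, parameter) pairs, e.g. model.named_parameters().
        with torch.no_grad():
            for n, p in new_param:
                if n in self.name2param:
                    ema_p = self.name2param[n]
                    # In-place EMA update in the parameter's own dtype (fp32 or fp64).
                    ema_p.mul_(self.decay).add_(p, alpha=1.0 - self.decay)
                # Names missing from self.name2param (e.g. frozen parameters with
                # requires_grad=False) are simply skipped.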

leasunhy (Collaborator, Author):

I can also implement this logic in the trainer by filtering model.named_parameters(). Which implementation is better?
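A sketch of that alternative, with the filtering done on the trainer side so the EMA helper only ever sees trainable parameters; the ema object and its update signature are placeholders carried over from the sketch above:

# Trainer-side filtering: pass only trainable parameters to the EMA helper,
# so the helper no longer needs the `if n in self.name2param` check.
trainable = [(n, p) for n, p in model.named_parameters() if p.requires_grad]
ema.update(trainable)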

guolinke merged commit d7307e0 into main on Jul 17, 2024
2 of 3 checks passed