
adjust var name to not override builtin #975

Closed
wants to merge 1 commit from the feat/rename_type branch

Conversation

regmibijay

I confirm that this contribution is made under the terms of the license found in the root directory of this repository's source tree and that I have the authority necessary to make this contribution on behalf of its copyright owner.

Currently, the following code block in the provided notebook

if torch.cuda.is_available():
    device = torch.device("cuda")
    type = torch.bfloat16
elif torch.backends.mps.is_available():
    device = torch.device("mps")
    type = torch.float32
else:
    device = torch.device("cpu")
    type = torch.float32

shadows the built-in function type. This PR renames the variable from type to dtype to address this issue.
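For reference, the block after the rename would look roughly like the sketch below (the import and the comments are added here for context and are not part of the notebook diff):

import torch

# Select the device and a matching floating-point precision without
# shadowing the built-in type().
if torch.cuda.is_available():
    device = torch.device("cuda")
    dtype = torch.bfloat16
elif torch.backends.mps.is_available():
    device = torch.device("mps")
    dtype = torch.float32
else:
    device = torch.device("cpu")
    dtype = torch.float32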

@thomasht86
Collaborator

As you can see in #979, we renamed this variable in several notebooks. Thanks a lot for reporting. :)

@thomasht86 thomasht86 closed this Nov 22, 2024
@regmibijay regmibijay deleted the feat/rename_type branch November 22, 2024 13:36
@regmibijay
Author

Thanks for the heads-up, Thomas. Awesome stuff.
