whisper: address the warning "FP16 is not supported on CPU; using FP32 instead" #629

Closed
ftnext opened this issue Nov 7, 2022 · 1 comment · Fixed by #630
Labels: whisper (Features related to Whisper)

Comments

ftnext (Collaborator) commented Nov 7, 2022

This issue is to solve #625 (comment)

I want to address the following warning:

`warnings.warn("FP16 is not supported on CPU; using FP32 instead")`

The following can help with this warning: openai/whisper#301

ftnext added the whisper label on Nov 7, 2022
ftnext (Collaborator, Author) commented Nov 7, 2022

My idea is to specify `fp16=False` when a GPU is not available.
I think `torch.cuda.is_available()` could serve as the argument for the `fp16` parameter:

| Environment | `torch.cuda.is_available()` | `fp16` |
| --- | --- | --- |
| GPU not supported (CPU only) | False | False |
| GPU supported | True | True |
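The idea above can be sketched as a small helper. This is a minimal illustration, not the merged fix from #630; the `fp16_flag` name is hypothetical, and the sketch assumes `torch` may or may not be installed:

```python
import importlib.util


def fp16_flag() -> bool:
    """Return the value to pass for Whisper's fp16 parameter.

    FP16 inference is only supported on GPU, so we enable it only when
    CUDA is available. On CPU-only machines, passing fp16=False avoids
    the "FP16 is not supported on CPU; using FP32 instead" warning.
    """
    # If torch is not installed at all, there is no CUDA either.
    if importlib.util.find_spec("torch") is None:
        return False
    import torch

    return torch.cuda.is_available()


# Hypothetical usage, assuming openai-whisper is installed:
# model = whisper.load_model("base")
# result = model.transcribe("audio.wav", fp16=fp16_flag())
```

On a CPU-only machine this yields `fp16=False` and the warning is no longer emitted; on a CUDA machine FP16 inference stays enabled.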
