Commit

Don't use is_bf16_supported to check for fp16 support.
comfyanonymous committed Feb 5, 2024
1 parent 24129d7 commit 66e28ef
Showing 1 changed file with 4 additions and 1 deletion: comfy/model_management.py
```diff
@@ -722,10 +722,13 @@ def should_use_fp16(device=None, model_params=0, prioritize_performance=True, ma
     if is_intel_xpu():
         return True
 
-    if torch.cuda.is_bf16_supported():
+    if torch.version.hip:
         return True
 
     props = torch.cuda.get_device_properties("cuda")
+    if props.major >= 8:
+        return True
+
     if props.major < 6:
         return False
```
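The intent of the change can be sketched as a standalone decision function. This is a hypothetical helper, not the actual ComfyUI code: the ROCm flag and CUDA compute-capability major version are passed in as plain values so the logic runs without a GPU, and the fall-through for compute capability 6–7 is a placeholder for the remaining checks in the real `should_use_fp16`:

```python
def should_use_fp16_sketch(is_hip: bool, compute_major: int) -> bool:
    """Hypothetical sketch of the fp16 decision after this commit.

    is_hip: True when running a ROCm build (torch.version.hip is set).
    compute_major: CUDA compute capability major version
    (torch.cuda.get_device_properties("cuda").major).
    """
    if is_hip:
        # ROCm builds are assumed fp16-capable; bf16 support is a
        # separate question, which is why is_bf16_supported() was a
        # poor proxy here.
        return True
    if compute_major >= 8:
        # Ampere (sm_80) and newer: fp16 is well supported.
        return True
    if compute_major < 6:
        # Pre-Pascal GPUs lack usable fp16 support.
        return False
    # Compute capability 6 and 7 (Pascal/Volta/Turing): the real
    # function applies further checks here; returning True is only a
    # placeholder for this sketch.
    return True
```

Passing device properties in as arguments keeps the sketch testable on machines without CUDA; the real function queries `torch.cuda` directly.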
