```
raise NotImplementedError(msg)
NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(40, 1564, 1, 64) (torch.float32)
     key         : shape=(40, 1564, 1, 64) (torch.float32)
     value       : shape=(40, 1564, 1, 64) (torch.float32)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`[email protected]` is not supported because:
    xFormers wasn't build with CUDA support
    requires device with capability > (8, 0) but your GPU has capability (7, 5) (too old)
    dtype=torch.float32 (supported: {torch.bfloat16, torch.float16})
    operator wasn't built - see `python -m xformers.info` for more info
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
```

Is this because my GPU isn't good enough?
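Partly. The log lists two separate problems: the flash-attention operator requires compute capability above (8, 0) and fp16/bf16 inputs, while your GPU is capability (7, 5) and the tensors are float32; and, for both operators, this xFormers build has no CUDA kernels at all (check with `python -m xformers.info`). A minimal sketch of a workaround, assuming PyTorch ≥ 2.0 is available: replace the xFormers call with PyTorch's built-in `scaled_dot_product_attention`, whose math backend supports float32 on any device. `attention_fallback` is a hypothetical helper name, and the tensor sizes are reduced from the (40, 1564, 1, 64) shapes in the log purely for illustration.

```python
import torch
import torch.nn.functional as F

def attention_fallback(q, k, v, p=0.0):
    """Sketch of a drop-in for xFormers' memory_efficient_attention.

    xFormers takes (batch, seq_len, heads, head_dim); PyTorch's
    scaled_dot_product_attention expects (batch, heads, seq_len, head_dim),
    so transpose the head and sequence axes in and out.
    """
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v, dropout_p=p)
    return out.transpose(1, 2)

# Smaller stand-ins for the float32 tensors from the traceback.
q = torch.randn(2, 256, 1, 64)
k = torch.randn(2, 256, 1, 64)
v = torch.randn(2, 256, 1, 64)
out = attention_fallback(q, k, v)
print(out.shape)  # torch.Size([2, 256, 1, 64])
```

Alternatively, reinstalling an xFormers wheel built with CUDA support for your PyTorch/CUDA version should enable the `cutlassF` operator, which (unlike the flash path) does not reject float32 or capability (7, 5) in this log, only the missing build.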