XLabs Sampler Error on MacOS: Torch not compiled with CUDA enabled #51
How to fix this?
Yes, there is a bug in nodes.py, line 303, in the sampling method. The main issue is that it tries to query bf16 support from CUDA, which can't be done on MPS. I tried the following steps to fix it:
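As a sketch of the kind of fix being described: the call to `torch.cuda.is_bf16_supported()` raises "Torch not compiled with CUDA enabled" on Macs because PyTorch there ships without CUDA. Guarding the check on CUDA availability avoids the crash. The function name and the float32 fallback below are assumptions for illustration, not the repo's actual code:

```python
import torch

def pick_dtype() -> torch.dtype:
    """Choose a sampling dtype without assuming CUDA is present."""
    if torch.cuda.is_available() and torch.cuda.is_bf16_supported():
        # Original code path: bf16 on CUDA devices that support it.
        return torch.bfloat16
    if torch.backends.mps.is_available():
        # On MPS, bfloat16 support varies by PyTorch/macOS version;
        # float32 is the safe fallback (assumption).
        return torch.float32
    return torch.float32
```

The point is only that the CUDA query must be behind `torch.cuda.is_available()`; whether MPS should then get bf16 or fp32 depends on your PyTorch version.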
Apparently Float8_e4m3fn is not supported on MPS, but I couldn't change it to float32 here. This issue seems to be linked to comfyanonymous/ComfyUI#4165 in ComfyUI. I tried the solutions there, but they failed.
Could change
This is applicable to that commit. But then other errors occurred, as @ThiagoSousa mentioned above.
I have also changed to the Unet Loader (GGUF) node to load the Flux1 Dev model. With these changes, I can run the flow without problems. (Edited to report a successful run)
This made all the XLabs Sampler nodes unusable...
Is there a way to get this working on macOS?
Any chance using Flux1.dev-Q8_0.gguf would work on a Mac M1 with 32 GB? I haven't downloaded this model yet.
Error occurred when executing XlabsSampler:

Torch not compiled with CUDA enabled

```
File "/Users/dehengzhou/Desktop/ComfyUI_SDXL/execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "/Users/dehengzhou/Desktop/ComfyUI_SDXL/execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "/Users/dehengzhou/Desktop/ComfyUI_SDXL/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
File "/Users/dehengzhou/Desktop/ComfyUI_SDXL/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
File "/Users/dehengzhou/Desktop/ComfyUI_SDXL/custom_nodes/x-flux-comfyui/nodes.py", line 304, in sampling
    if torch.cuda.is_bf16_supported():
File "/Users/dehengzhou/miniconda3/lib/python3.10/site-packages/torch/cuda/__init__.py", line 128, in is_bf16_supported
    device = torch.cuda.current_device()
File "/Users/dehengzhou/miniconda3/lib/python3.10/site-packages/torch/cuda/__init__.py", line 789, in current_device
    _lazy_init()
File "/Users/dehengzhou/miniconda3/lib/python3.10/site-packages/torch/cuda/__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
```
Error occurred when executing KSampler:

Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.

```
File "/Users/po/Documents/ComfyUI/execution.py", line 317, in execute
```
After making the changes it still doesn't run and reports the same error. How do I solve this?
Same here
same error |
same |
same here |
I've tried all the steps mentioned above, but I'm still getting an error (which I've listed below).
OS: macOS
ComfyUI Version: Newest
x-flux-comfyui Version: Newest