
AMD install does not detect Rocm and just errors out asking for CUDA #1058

Open
BrechtCorbeel opened this issue Dec 21, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@BrechtCorbeel

What happened?

I moved my 4090 to my Linux server and put my 7900 XTX in my main machine, since the 4090 just works and runs better. Even with the AMD install of Comfy it did not run on Linux, and I cannot get it to run on Windows either; it just gets stuck asking for CUDA:

Traceback (most recent call last):
  File "W:\smatrix\Data\Packages\ComfyUIAMD\main.py", line 132, in <module>
    import execution
  File "W:\smatrix\Data\Packages\ComfyUIAMD\execution.py", line 13, in <module>
    import nodes
  File "W:\smatrix\Data\Packages\ComfyUIAMD\nodes.py", line 22, in <module>
    import comfy.diffusers_load
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\sd.py", line 6, in <module>
    from comfy import model_management
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\model_management.py", line 145, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\model_management.py", line 114, in get_torch_device
    return torch.device(torch.cuda.current_device())
  File "W:\smatrix\Data\Packages\ComfyUIAMD\venv\lib\site-packages\torch\cuda\__init__.py", line 878, in current_device
    _lazy_init()
  File "W:\smatrix\Data\Packages\ComfyUIAMD\venv\lib\site-packages\torch\cuda\__init__.py", line 305, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
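
For reference, that assertion means the torch wheel installed in the package's venv is a CPU-only build, with neither a CUDA nor a ROCm/HIP backend for torch.cuda to initialize. A minimal sanity check, run from the same venv as the traceback above, is to ask torch which backend it was built with:

import torch

# A "+cpu" suffix here means it is not a ROCm (or CUDA) build
print("torch version:", torch.__version__)
# CUDA build string (None on CPU-only and ROCm wheels)
print("CUDA:", torch.version.cuda)
# HIP/ROCm build string (None unless torch was built for ROCm)
print("HIP:", getattr(torch.version, "hip", None))
# False whenever no usable GPU backend is present
print("GPU available:", torch.cuda.is_available())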

Steps to reproduce

No response

Relevant logs

No response

Version

v 2.14.4

What Operating System are you using?

Windows

@BrechtCorbeel BrechtCorbeel added the bug Something isn't working label Dec 21, 2024
@BrechtCorbeel BrechtCorbeel changed the title AMD install does not detect mROC and just errors out asking for CUDA AMD install does not detect Rocm and just errors out asking for CUDA Dec 21, 2024
@brknsoul

ROCm (AMD's version of CUDA) is not available for Windows.
