I have an AMD card and dual-boot Kubuntu specifically to run AI workloads with ROCm, like Stable Diffusion and LLMs. I've tried pretty hard to make MagicQuill work on ROCm myself, but it's way too much work for me.
Would it be possible for you to make it compatible with the ROCm stack? There is now a ROCm build of bitsandbytes available directly from the ROCm GitHub, but no matter what I altered or edited, I couldn't get it to work. I did manage to get MagicQuill running by replacing the LLaVA quantization with plain bfloat16 loading, but that took up too much of my 16 GB of VRAM, so while it worked, it really needs to be quantized down to at least 8-bit float (float8_e4m3fn, if I'm remembering the name right) or lower.
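For context, here is a rough back-of-the-envelope sketch of why bfloat16 loading strains a 16 GB card. The 7B parameter count is an assumption on my part (LLaVA checkpoints commonly come in 7B/13B sizes), and this counts model weights only; activations and the KV cache add more on top:

```python
# Approximate VRAM needed for model weights alone, assuming ~7B parameters.
# Activations and KV cache are not included, so real usage is higher.
PARAMS = 7_000_000_000

def weight_gb(bytes_per_param: float) -> float:
    """GB of VRAM for weights at a given storage width."""
    return PARAMS * bytes_per_param / 1e9

print(f"bf16: {weight_gb(2):.1f} GB")    # 2 bytes/param -> ~14 GB, nearly all of 16 GB
print(f"fp8:  {weight_gb(1):.1f} GB")    # 1 byte/param (e.g. float8_e4m3fn) -> ~7 GB
print(f"4bit: {weight_gb(0.5):.1f} GB")  # 0.5 bytes/param (bitsandbytes 4-bit) -> ~3.5 GB
```

So even 8-bit quantization would roughly halve the weight footprint, which is why getting the ROCm bitsandbytes build working would make a big difference here.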
I've tried both the standalone MagicQuill and the ComfyUI node.