Flux ‐ GGUF and unet safetensors
RuinedFooocus supports quantized GGUF Flux models, such as those found at city96/FLUX.1-dev-gguf and city96/FLUX.1-schnell-gguf, as well as some of the Flux models on CivitAI that only contain the Unet part.
Since these are missing the clip, t5 and VAE components, those need to be downloaded as well. This should happen automatically, but you can also do it manually if you want to use other models (see the sketch after the list below):
- comfyanonymous/flux_text_encoders - clip_l.safetensors, place in models\clip
- city96/t5-v1_1-xxl-encoder-gguf - t5-v1_1-xxl-encoder-Q3_K_S.gguf ¹, place in models\clip
- black-forest-labs/FLUX.1-schnell - ae.safetensors, place in models\vae. This one works for both Dev and Schnell.
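If the automatic download does not kick in, or you prefer to fetch the files yourself, here is a minimal sketch using the huggingface_hub Python package. It assumes the default models\clip and models\vae folders under the RuinedFooocus install and the default Q3_K_S T5 quant; adjust filenames and paths to taste.

```python
# Sketch: fetch the Flux support files listed above with huggingface_hub.
# Assumption: run from the RuinedFooocus root so models/clip and models/vae
# resolve to the default folders.
from huggingface_hub import hf_hub_download

downloads = [
    # (repo_id, filename, target directory)
    ("comfyanonymous/flux_text_encoders", "clip_l.safetensors", "models/clip"),
    ("city96/t5-v1_1-xxl-encoder-gguf", "t5-v1_1-xxl-encoder-Q3_K_S.gguf", "models/clip"),
    ("black-forest-labs/FLUX.1-schnell", "ae.safetensors", "models/vae"),
]

for repo_id, filename, target in downloads:
    path = hf_hub_download(repo_id=repo_id, filename=filename, local_dir=target)
    print(f"Downloaded {filename} -> {path}")
```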
¹ t5-v1_1-xxl-encoder-Q3_K_S.gguf is the smallest and is used by default. You can change any of these files by editing settings\settings.json.
Example:
"gguf_clip1": "flux_clip_l.safetensors",
"gguf_clip2": "t5-v1_1-xxl-encoder-Q6_K.gguf",
"gguf_vae": "ae.safetensors"
(Make sure you don't misplace the commas at the end of the lines; JSON does not allow a trailing comma after the last entry.)
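If you would rather change these keys from a script than hand-edit the file, here is a small sketch. It assumes settings\settings.json is a flat JSON object and that the filenames below exist in your models folders; back the file up first.

```python
# Sketch: update the GGUF support-file keys in settings/settings.json.
# Assumption: the file is plain JSON with top-level keys as in the example above.
import json
from pathlib import Path

settings_path = Path("settings") / "settings.json"
settings = json.loads(settings_path.read_text(encoding="utf-8"))

settings["gguf_clip1"] = "flux_clip_l.safetensors"
settings["gguf_clip2"] = "t5-v1_1-xxl-encoder-Q6_K.gguf"
settings["gguf_vae"] = "ae.safetensors"

settings_path.write_text(json.dumps(settings, indent=4), encoding="utf-8")
```

Writing the file with json.dumps also sidesteps the trailing-comma problem mentioned above.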
RuinedFooocus can automatically download some files. The known files are:
For gguf_clip1:
- clip_l.safetensors
For gguf_clip2:
- t5-v1_1-xxl-encoder-Q3_K_L.gguf
- t5-v1_1-xxl-encoder-Q3_K_M.gguf
- t5-v1_1-xxl-encoder-Q3_K_S.gguf
- t5-v1_1-xxl-encoder-Q4_K_M.gguf
- t5-v1_1-xxl-encoder-Q4_K_S.gguf
- t5-v1_1-xxl-encoder-Q5_K_M.gguf
- t5-v1_1-xxl-encoder-Q5_K_S.gguf
- t5-v1_1-xxl-encoder-Q6_K.gguf
- t5-v1_1-xxl-encoder-Q8_0.gguf
- t5-v1_1-xxl-encoder-f16.gguf
- t5-v1_1-xxl-encoder-f32.gguf
For gguf_vae:
- ae.safetensors
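To see which of the required support files are already in place before starting RuinedFooocus, a quick check (assuming the default models\clip and models\vae folders and the default filenames):

```python
# Sketch: report which Flux support files are present in the default folders.
from pathlib import Path

required = {
    Path("models/clip") / "clip_l.safetensors": "gguf_clip1",
    Path("models/clip") / "t5-v1_1-xxl-encoder-Q3_K_S.gguf": "gguf_clip2",
    Path("models/vae") / "ae.safetensors": "gguf_vae",
}

for path, key in required.items():
    status = "found" if path.is_file() else "MISSING"
    print(f"{key}: {path} ... {status}")
```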
You should now be able to use GGUF models and Flux safetensors that are missing the clip, t5 and VAE.
Some models that should work:
- city96/FLUX.1-dev-gguf - any of these
- city96/FLUX.1-schnell-gguf - any of these
- https://civitai.com/models/647237/flux1-dev-gguf-q2k-q3ks-q4q41q4ks-q5q51q5ks-q6k-q8
- https://civitai.com/models/648580/flux1-schnell-gguf-q2k-q3ks-q4q41q4ks-q5q51-q5ks-q6k-q8
There are also models that contain everything and will work out-of-the-box:
- A collection of models in this discussion.
Note that Flux models need different Performance settings than SDXL. You can set these by selecting Custom...
as the performance. Here are two examples that work "ok" and can be a good starting point.