Update README.md
README.md

base_model_relation: quantized
tags:
- quantization
---

# Visual comparison of Flux-dev model outputs using BF16 and BnB 4-bit quantization

<table>
  <tr>
    <td style="text-align: center;">
      BF16<br>
      <medium-zoom background="rgba(0,0,0,.7)"><img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/quantization-backends-diffusers/combined_flux-dev_bf16_combined.png" alt="Flux-dev output with BF16: Baroque, Futurist, Noir styles"></medium-zoom>
    </td>
    <td style="text-align: center;">
      BnB 4-bit<br>
      <medium-zoom background="rgba(0,0,0,.7)"><img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/quantization-backends-diffusers/combined_flux-dev_bnb_4bit_combined.png" alt="Flux-dev output with BnB 4-bit: Baroque, Futurist, Noir styles"></medium-zoom>
    </td>
  </tr>
</table>

# Usage with Diffusers

To use this quantized FLUX.1 [dev] checkpoint, you need to install the 🧨 diffusers and bitsandbytes libraries and then load the checkpoint, as in the sketch below.
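
A minimal sketch follows, assuming the checkpoint loads directly through `FluxPipeline.from_pretrained`; the repository id placeholder (`<this-repo-id>`), the install command, the prompt, and the inference settings are illustrative assumptions rather than part of this model card.

```python
# Sketch only: install the libraries first, for example:
#   pip install -U diffusers transformers accelerate bitsandbytes
import torch
from diffusers import FluxPipeline

# "<this-repo-id>" is a placeholder for this quantized FLUX.1 [dev] repository.
pipe = FluxPipeline.from_pretrained("<this-repo-id>", torch_dtype=torch.bfloat16)
pipe.enable_model_cpu_offload()  # keep idle submodules on CPU to reduce peak VRAM

prompt = "A photo of an astronaut riding a horse on the moon"
image = pipe(prompt, num_inference_steps=50, guidance_scale=3.5).images[0]
image.save("flux-dev-output.png")
```

`enable_model_cpu_offload()` trades some speed for lower peak GPU memory; drop it if your GPU has enough VRAM to hold the whole pipeline.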