Dark Beast KLEIN 9b🟦V1.5 BlitZ 02/08/2026
Fine-tuning of black-forest-labs/FLUX.2-klein-9B, available in BF16 / FP8e4m3fn / NVFP4 quantization,
merged with @alcaitiff's klein-9b-unchained-xxx.
This is the ultimate speed-optimized evolution of Dark Beast V1, based on FLUX.2 Klein 9B,
engineered specifically for lightning-fast low-step, CFG=1 workflows (5 steps).
Also available in NVFP4 quantized format, optimized for acceleration on Blackwell-architecture GPUs
(such as the RTX 50xx series, RTX PRO 6000, B200, and others).
Non-Blackwell GPUs are also supported (automatic 16-bit fallback); verified in my ComfyUI 0.11 environment.
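If you want to gate format selection in your own loading script, a minimal sketch (not part of this repo) is to check the CUDA compute capability, since Blackwell parts report major version 10 or higher; the precision labels below are illustrative placeholders, not file names from this repository:

import torch

def is_blackwell() -> bool:
    # Blackwell GPUs (B200, RTX 50xx, RTX PRO 6000 Blackwell) report compute capability 10.x or 12.x.
    if not torch.cuda.is_available():
        return False
    major, _minor = torch.cuda.get_device_capability()
    return major >= 10

# Placeholder labels: prefer NVFP4 on Blackwell, otherwise fall back to 16-bit weights.
precision = "nvfp4" if is_blackwell() else "bf16"
print(f"Selected precision: {precision}")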
Key features:
- Fully preserves the signature Dark Beast style from the standard lineage, with rich details and the intense Dark Beast aesthetic
- Refined through advanced targeted distillation & fine-tuning, now perfectly dialed in for CFG=1 guidance at minimal steps
- BlitZ-level inference speed: breathtaking high-quality images in just 5 steps ⚡
Recommended settings: 5 steps, CFG=1 (fixed), any seed you want
In one sentence: Taking Klein’s already blazing speed and cranking it to absolute BlitZ velocity while keeping every drop of that ferocious Dark Beast soul! 🟦
Lightning-fast generation awaits — unleash it now! 🚀
Usage:
pip install sdnq
import torch
import diffusers
from sdnq import SDNQConfig # import sdnq to register it into diffusers and transformers
from sdnq.common import use_torch_compile as triton_is_available
from sdnq.loader import apply_sdnq_options_to_model
pipe = diffusers.Flux2KleinPipeline.from_pretrained("GuangyuanSD/FLUX.2-klein-9B-Blitz-Diffusers", torch_dtype=torch.bfloat16)
# Enable INT8 MatMul for AMD, Intel ARC and Nvidia GPUs:
if triton_is_available and (torch.cuda.is_available() or torch.xpu.is_available()):
    pipe.transformer = apply_sdnq_options_to_model(pipe.transformer, use_quantized_matmul=True)
    pipe.text_encoder = apply_sdnq_options_to_model(pipe.text_encoder, use_quantized_matmul=True)
# pipe.transformer = torch.compile(pipe.transformer) # optional for faster speeds
pipe.enable_model_cpu_offload()
prompt = "A cat holding a sign that says hello world"
image = pipe(
    prompt=prompt,
    height=1024,
    width=1024,
    guidance_scale=1.0,
    num_inference_steps=4,
    generator=torch.manual_seed(0),
).images[0]
image.save("flux-klein-Blitz.png")
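The example above uses 4 steps; to match the recommended settings from this card, set num_inference_steps=5 and keep guidance_scale=1.0.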
Original BF16 vs Blitz fine-tune comparison:
Big thanks to @alcaitiff for the awesome work and killer contributions to training Z-Image and Klein models! Seriously impressive stuff! 🚀
