flux1-kontext-dev-fp8
FP8-quantized weights for FLUX.1-Kontext-dev diffusion models. Supports both E4M3FN and E5M2 formats.
Model Overview
- Base Model: black-forest-labs/FLUX.1-Kontext-dev (diffusion model component)
- Quantization: Per-tensor dynamic quantization to FP8 (E4M3FN/E5M2); a sketch follows this list
- Size Reduction: ~40% smaller than original weights
- Model Scope: Only the diffusion_model component (not the full pipeline); place it in ComfyUI's `diffusion_models` directory
- Compatibility:
  - ✅ ComfyUI (with PyTorch 2.4+ and CUDA 12.4+)
  - ❌ Diffusers (currently unsupported)
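The conversion script is not included in this repo, but the per-tensor scheme named above can be sketched with PyTorch's FP8 dtypes. This is illustrative only, not the exact procedure used to produce these files; the scaling policy and whether scales are stored alongside the weights are assumptions.

```python
import torch

def quantize_per_tensor_fp8(w: torch.Tensor, fp8_dtype=torch.float8_e4m3fn):
    """Per-tensor dynamic quantization: pick one scale per tensor so its
    largest magnitude maps onto the FP8 format's representable range."""
    fp8_max = torch.finfo(fp8_dtype).max   # 448.0 for E4M3FN, 57344.0 for E5M2
    amax = w.abs().max().clamp(min=1e-12)  # avoid division by zero
    scale = fp8_max / amax
    q = (w.float() * scale).clamp(-fp8_max, fp8_max).to(fp8_dtype)
    return q, scale                        # keep the scale to dequantize later

# Example: quantize one BF16 weight tensor down to 1 byte per element.
w = torch.randn(4096, 4096, dtype=torch.bfloat16)
q, scale = quantize_per_tensor_fp8(w)
print(q.dtype, q.element_size(), scale.item())
```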
Available Files
- `flux1-kontext-dev-fp8-e4m3fn.safetensors`: balanced performance/accuracy
- `flux1-kontext-dev-fp8-e5m2.safetensors`: higher throughput on Hopper GPUs
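To make the trade-off concrete, the two formats can be compared directly with PyTorch's native FP8 dtypes (available since PyTorch 2.1); a minimal sketch:

```python
import torch

# E4M3FN: 4 exponent bits, 3 mantissa bits -> finer precision, smaller range.
# E5M2:   5 exponent bits, 2 mantissa bits -> coarser precision, larger range.
for dtype in (torch.float8_e4m3fn, torch.float8_e5m2):
    info = torch.finfo(dtype)
    print(f"{info.dtype}: max={info.max}, eps={info.eps}, smallest normal={info.tiny}")

# Round-tripping the same value shows E5M2's coarser quantization step.
x = torch.tensor(3.3)
print(x.to(torch.float8_e4m3fn).float().item(),  # 3.25 (step 0.25 near 3)
      x.to(torch.float8_e5m2).float().item())    # 3.5  (step 0.5 near 3)
```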
Usage
- Move the `.safetensors` file to your ComfyUI `diffusion_models` directory
- Select an fp8 mode (e.g. `fp8_e4m3fn`) in the UNet Loader node
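After copying the file, a quick sanity check can confirm the tensors really are stored in FP8. This assumes the `safetensors` Python package with FP8 support installed alongside PyTorch; the path below is hypothetical and should point at your own ComfyUI installation.

```python
from safetensors import safe_open

# Hypothetical path; adjust to your ComfyUI install.
path = "ComfyUI/models/diffusion_models/flux1-kontext-dev-fp8-e4m3fn.safetensors"

with safe_open(path, framework="pt") as f:
    for name in list(f.keys())[:5]:           # a few tensors are enough
        t = f.get_tensor(name)
        print(name, t.dtype, tuple(t.shape))  # expect torch.float8_e4m3fn
```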
Known Limitations
- Not compatible with standard Diffusers pipelines
- Optimal performance requires a recent (or patched) PyTorch build with native FP8 support; a quick capability check is sketched below
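A minimal check of what the local PyTorch build and GPU expose, assuming a CUDA setup (note that `torch._scaled_mm` is a private API and may change between releases):

```python
import torch

print("torch", torch.__version__)
print("fp8 dtypes:", hasattr(torch, "float8_e4m3fn"), hasattr(torch, "float8_e5m2"))
print("fused FP8 GEMM (_scaled_mm):", hasattr(torch, "_scaled_mm"))
if torch.cuda.is_available():
    # FP8 tensor-core matmuls need compute capability >= 8.9 (Ada) or 9.0 (Hopper).
    print("compute capability:", torch.cuda.get_device_capability())
```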
License
This model is distributed under the FLUX.1 [dev] Non-Commercial License. Commercial use is prohibited without authorization.
Citation
@misc{labs2025flux1kontextflowmatching,
title={FLUX.1 Kontext: Flow Matching for In-Context Image Generation and Editing in Latent Space},
author={Black Forest Labs et al.},
year={2025},
eprint={2506.15742},
archivePrefix={arXiv},
primaryClass={cs.GR},
url={https://arxiv.org/abs/2506.15742},
}