---
language:
- en
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://github.com/black-forest-labs/flux/blob/main/model_licenses/LICENSE-FLUX1-dev
base_model:
- black-forest-labs/FLUX.1-Kontext-dev
base_model_relation: quantized
tags:
- image-generation
- flux
- diffusion-single-file
pipeline_tag: image-to-image
---

# flux1-kontext-dev-fp8

FP8-quantized weights for the [FLUX.1-Kontext-dev](https://huggingface.co/black-forest-labs/FLUX.1-Kontext-dev) diffusion model. Both E4M3FN and E5M2 formats are provided.

## Model Overview

- **Base Model**: [FLUX.1-Kontext-dev](https://huggingface.co/black-forest-labs/FLUX.1-Kontext-dev) (diffusion model component)
- **Quantization**: Per-tensor dynamic quantization to FP8 (E4M3FN/E5M2)
- **Size Reduction**: ~40% smaller than the original weights
- **Model Scope**: Contains **only the diffusion_model** component (not the full pipeline); place it in ComfyUI's `diffusion_models` directory
- **Compatibility**:
  - ✅ ComfyUI (with PyTorch 2.4+ and CUDA 12.4+)
  - ❌ Diffusers (currently unsupported)

## Available Files

- `flux1-kontext-dev-fp8-e4m3fn.safetensors` - Balanced performance/accuracy
- `flux1-kontext-dev-fp8-e5m2.safetensors` - Higher throughput on Hopper GPUs

## Usage

1. Move the `.safetensors` file into your ComfyUI `diffusion_models` directory (a download sketch is included at the end of this card).
2. In the UNet/diffusion model loader node, select the fp8 weight dtype that matches the file, e.g. `fp8_e4m3fn`.

## Known Limitations

1. Not compatible with standard Diffusers pipelines
2. Requires patched PyTorch versions for optimal performance

## License

This model is distributed under the [FLUX.1 [dev] Non-Commercial License](https://github.com/black-forest-labs/flux/blob/main/model_licenses/LICENSE-FLUX1-dev). Commercial use is prohibited without authorization.

## Citation

```bibtex
@misc{labs2025flux1kontextflowmatching,
  title={FLUX.1 Kontext: Flow Matching for In-Context Image Generation and Editing in Latent Space},
  author={Black Forest Labs et al.},
  year={2025},
  eprint={2506.15742},
  archivePrefix={arXiv},
  primaryClass={cs.GR},
  url={https://arxiv.org/abs/2506.15742},
}
```
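
## Example: Downloading the Weights

A minimal sketch of fetching one of the FP8 files with `huggingface_hub` and copying it into a ComfyUI install. The repository id and the ComfyUI path below are placeholders; adjust them to this repository's actual id and to your local setup.

```python
import shutil
from pathlib import Path

from huggingface_hub import hf_hub_download

REPO_ID = "<namespace>/flux1-kontext-dev-fp8"              # placeholder: use this repo's actual id
FILENAME = "flux1-kontext-dev-fp8-e4m3fn.safetensors"      # or the e5m2 variant
COMFYUI_MODELS = Path("ComfyUI/models/diffusion_models")   # adjust to your ComfyUI install

# Download into the local Hugging Face cache, then copy into ComfyUI's model folder.
local_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
COMFYUI_MODELS.mkdir(parents=True, exist_ok=True)
shutil.copy(local_path, COMFYUI_MODELS / FILENAME)
print(f"Placed {FILENAME} in {COMFYUI_MODELS}")
```

After the copy, the file appears in ComfyUI's diffusion model loader, where the fp8 weight dtype can be selected.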
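
## Example: Per-Tensor FP8 Quantization

For reference, a minimal sketch of what per-tensor quantization to FP8 can look like in PyTorch (2.1+ exposes `torch.float8_e4m3fn` and `torch.float8_e5m2`). This illustrates the general idea only; it is not the exact script used to produce these files, and the helper name below is hypothetical.

```python
import torch

def quantize_per_tensor_fp8(weight: torch.Tensor,
                            fp8_dtype: torch.dtype = torch.float8_e4m3fn):
    """Cast a weight tensor to FP8 using a single per-tensor scale.

    The scale maps the tensor's largest magnitude onto the FP8 format's
    representable range (~448 for E4M3FN, ~57344 for E5M2).
    """
    fp8_max = torch.finfo(fp8_dtype).max
    scale = weight.abs().max().clamp(min=1e-12) / fp8_max
    quantized = (weight / scale).clamp(-fp8_max, fp8_max).to(fp8_dtype)
    return quantized, scale

# Dequantize back to a compute dtype before matmul on GPUs without native FP8 kernels.
w = torch.randn(4096, 4096, dtype=torch.bfloat16)
q, scale = quantize_per_tensor_fp8(w)
w_restored = q.to(torch.bfloat16) * scale
```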