FLUX.1-Kontext 8-bit Quantized for MLX
This is an 8-bit quantized version of the FLUX.1-Kontext-dev model, prepared for use with MLX and flux.swift. Quantization reduces the model size from roughly 24GB to 17GB while largely preserving image-to-image generation quality.
Quantized using flux.swift, a Swift implementation of FLUX models for Apple Silicon.
Model Details
- Quantization: 8-bit with group size 128 (each group of 128 weights shares its own quantization scale and bias)
- Total Size: 17GB
- Original Model: black-forest-labs/FLUX.1-Kontext-dev
- Framework: MLX (Apple's array framework for Apple Silicon, backed by Metal)
- Components: Transformer, VAE, CLIP text encoder, T5 text encoder
Usage
This model requires the flux.swift implementation. Please refer to the repository for installation and usage instructions.
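The exact build and download steps live in the flux.swift repository; the commands below are only a minimal sketch, assuming the CLI target builds with Swift Package Manager and that huggingface-cli is installed. The Hub repo id and local directory name are placeholders, not actual values.
# Build the CLI from source (check the flux.swift README for the authoritative steps)
git clone https://github.com/mzbac/flux.swift
cd flux.swift
swift build -c release
# Download the quantized weights locally so --load-quantized-path has something to point to
# (replace <this-repo-id> with this model's Hugging Face repo id)
huggingface-cli download <this-repo-id> --local-dir ./flux1-kontext-8bit-mlx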
Quick Start
# Load and use the quantized model for image-to-image generation
flux.swift.cli \
--load-quantized-path /path/to/this/model \
--hf-token YOUR_HF_TOKEN \
--init-image-path /path/to/input/image.png \
--prompt "Your transformation prompt here" \
--output output.png
Recommended Parameters
- Steps: 30
- Guidance Scale: 3.5
- Authentication: Requires a Hugging Face token (token setup shown after this list)
- Input: Requires --init-image-path for image-to-image generation
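The token can be kept in an environment variable and passed through the documented --hf-token flag rather than pasted into each command. This is only a shell convenience; the CLI itself just receives the flag value, and the HF_TOKEN name below is an arbitrary local variable.
# Create a token at https://huggingface.co/settings/tokens and export it for the session
export HF_TOKEN="<your token>"
# Then substitute it into the commands, e.g.
flux.swift.cli \
--load-quantized-path /path/to/this/model \
--hf-token "$HF_TOKEN" \
--init-image-path ./reference.png \
--prompt "Your transformation prompt here" \
--output output.png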
Example with Parameters
flux.swift.cli \
--load-quantized-path /path/to/this/model \
--hf-token YOUR_HF_TOKEN \
--init-image-path ./reference.png \
--prompt "transform this into a watercolor painting with soft pastel tones" \
--steps 30 \
--guidance 3.5 \
--width 512 \
--height 512 \
--seed 42 \
--output watercolor_painting.png
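The documented flags compose with ordinary shell loops, which is handy for exploring variations of the same prompt. A minimal sketch using only the flags shown above (paths and the token are placeholders):
# Sweep the seed to generate four variations of the same transformation
for seed in 1 2 3 4; do
flux.swift.cli \
--load-quantized-path /path/to/this/model \
--hf-token YOUR_HF_TOKEN \
--init-image-path ./reference.png \
--prompt "transform this into a watercolor painting with soft pastel tones" \
--steps 30 \
--guidance 3.5 \
--width 512 \
--height 512 \
--seed $seed \
--output "watercolor_seed_${seed}.png"
done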
License
This model is a quantized version of FLUX.1-Kontext-dev, which is licensed under the FLUX.1 [dev] Non-Commercial License. Please review the original license terms:
- Original model: https://huggingface.co/black-forest-labs/FLUX.1-Kontext-dev
- License: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
- Commercial licensing available at: https://bfl.ai/pricing/licensing
Performance
- Memory Usage: Reduced from ~24GB to 17GB
- Quality: Image-to-image generation results remain close to the full-precision model
- Platform: Optimized for Apple Silicon Macs
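MLX holds the weights in unified memory, so it is worth confirming the machine has comfortably more RAM than the ~17GB of quantized weights before generating at larger resolutions. A quick check on macOS (standard sysctl, nothing specific to this model):
# Print total unified memory in GB
sysctl -n hw.memsize | awk '{ printf "%.0f GB\n", $1 / 1024 / 1024 / 1024 }'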
Citation
@misc{flux-kontext,
  author    = {Black Forest Labs},
  title     = {FLUX.1-Kontext-dev},
  publisher = {Black Forest Labs},
  year      = {2024},
  url       = {https://huggingface.co/black-forest-labs/FLUX.1-Kontext-dev}
}
@software{flux-swift,
  author = {mzbac},
  title  = {flux.swift: Swift implementation of FLUX models},
  url    = {https://github.com/mzbac/flux.swift},
  year   = {2024}
}