Original model: bluepen5805/blue_pencil-flux1
Notice
This is an experimental conversion made in Spaces with a homebrew script. The serverless Inference API does not currently support torch.float8_e4m3fn, so this model will not run there. The FLUX.1 conversion path in Diffusers still has many bugs, and I have not been able to confirm that the conversion works correctly. Please treat this as a test run only.
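For reference, below is a minimal sketch of what an FP8 cast with Diffusers might look like. It is not the homebrew script used for this repo; the use of FluxPipeline, the compute dtype, and the output directory are illustrative assumptions, and it assumes the base checkpoint is available in Diffusers format and a PyTorch build that provides torch.float8_e4m3fn.

```python
# Hypothetical FP8 conversion sketch; NOT the actual script used for this model.
# Assumes PyTorch >= 2.1 (for torch.float8_e4m3fn) and a recent Diffusers release.
import torch
from diffusers import FluxPipeline

# Load the base model (assumed to be published in Diffusers format).
pipe = FluxPipeline.from_pretrained(
    "bluepen5805/blue_pencil-flux1",
    torch_dtype=torch.bfloat16,  # load in a compute-friendly dtype first
)

# Cast the transformer weights to float8_e4m3fn to shrink the checkpoint.
# Most kernels cannot compute in this dtype, so it serves as a storage format;
# weights need to be upcast again before inference.
pipe.transformer.to(torch.float8_e4m3fn)

# Save the converted pipeline; the output directory name is illustrative.
pipe.save_pretrained("blue-pencil-flux1-v001-fp8-flux")
```

Because the weights are stored in a dtype the serverless Inference API cannot handle, the hosted widget cannot run this model; local inference would require upcasting the weights first.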
Model tree for John6666/blue-pencil-flux1-v001-fp8-flux
- Base model: bluepen5805/blue_pencil-flux1