Copied from https://huggingface.co/google/flan-t5-xxl.
Converted to bfloat16, with the decoder blocks removed, and packaged for InvokeAI.
In case you want to do the conversion yourself or apply it to another model, the code used to convert the original google/flan-t5-xxl model can be found in the root folder: https://huggingface.co/skunkworx/FLAN-T5xxl/blob/main/convert-bf16-enc.py
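
The gist of the conversion is a minimal sketch like the one below, assuming the `transformers` library is installed. `T5EncoderModel` loads only the encoder stack (dropping the decoder blocks), and `torch_dtype=torch.bfloat16` handles the precision conversion; the `DEST` folder name here is hypothetical, and the actual script linked above may differ in detail (e.g. InvokeAI-specific packaging).

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

SOURCE = "google/flan-t5-xxl"    # original full encoder-decoder model
DEST = "flan-t5-xxl-bf16-enc"    # hypothetical output directory

# T5EncoderModel instantiates only the encoder, so the decoder
# weights are discarded; bfloat16 halves the on-disk/in-memory size.
model = T5EncoderModel.from_pretrained(SOURCE, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(SOURCE)

# Save the encoder-only bf16 weights plus tokenizer for downstream use.
model.save_pretrained(DEST)
tokenizer.save_pretrained(DEST)
```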