---
license: apache-2.0
---

Copied from https://huggingface.co/google/flan-t5-xxl, converted to bfloat16, with the decoder blocks removed, and packaged for InvokeAI.

In case you want to run the conversion yourself or apply it to another model, the code used to convert the original google/flan-t5-xxl model can be found in the repository root: https://huggingface.co/skunkworx/FLAN-T5xxl/blob/main/convert-bf16-enc.py
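The conversion in the linked script boils down to two steps: keep only the encoder stack and cast the weights to bfloat16. Below is a minimal sketch of that idea using the `transformers` library; it builds a tiny stand-in `T5Config` so it runs without downloading the full model, whereas the real conversion would load `google/flan-t5-xxl` instead. This is an illustrative assumption, not the exact contents of `convert-bf16-enc.py`.

```python
import tempfile

import torch
from transformers import T5Config, T5EncoderModel

# Tiny stand-in config so this sketch runs quickly; for the real
# conversion you would load "google/flan-t5-xxl" here instead.
config = T5Config(
    vocab_size=128, d_model=32, d_kv=8, d_ff=64,
    num_layers=2, num_heads=2,
)

# T5EncoderModel instantiates only the encoder stack, so the decoder
# blocks are never created -- the "removed decoder" part of the conversion.
model = T5EncoderModel(config)

# Cast every parameter to bfloat16, roughly halving size vs float32.
model = model.to(torch.bfloat16)

# Save the encoder-only, bf16 checkpoint to a directory that a tool
# such as InvokeAI can then load.
with tempfile.TemporaryDirectory() as out_dir:
    model.save_pretrained(out_dir)
```

For the full-size model, passing `torch_dtype=torch.bfloat16` to `T5EncoderModel.from_pretrained(...)` avoids ever materializing the weights in float32.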