Rename t5xxl_fp8_e4m3fn.safetensors for Diffusers module loading, please?

#8
by ppbrown - opened

Right now, I think only the main full-sized version is loadable via

T5EncoderModel.from_pretrained()

We can trim it down a bit by passing torch_dtype=torch.bfloat16, I think... but I don't know of any way currently to load the fp8 version.
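A minimal sketch of the loading path described above, wrapped in a helper so nothing is downloaded at import time. The repo id is the one from this thread; note the dtype attribute is spelled torch.bfloat16 (there is no torch.bf16):

```python
import torch
from transformers import T5EncoderModel


def load_encoder_bf16(repo_id: str = "mcmonkey/google_t5-v1_1-xxl_encoderonly"):
    # Load the full-sized encoder weights, casting to bfloat16 at load
    # time to roughly halve memory versus fp32. This still downloads the
    # full-precision checkpoint; it does not use the fp8 file.
    return T5EncoderModel.from_pretrained(repo_id, torch_dtype=torch.bfloat16)
```

This halves the in-memory footprint but not the download size, which is why a loadable fp8 variant would still be useful.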

Could you please rename t5xxl_fp8_e4m3fn.safetensors to (I think)
model.fp8.safetensors

I believe that would allow loading it with Diffusers, if you then specify

T5EncoderModel.from_pretrained("mcmonkey/google_t5-v1_1-xxl_encoderonly", variant="fp8")

Is that right?
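The rename request follows the filename convention behind `variant=` loading, as I understand it: `from_pretrained(..., variant="fp8")` looks for weight files with the variant label inserted before the extension. The helper below is purely illustrative (not a library function), just to show the mapping:

```python
def variant_filename(base: str, variant: str) -> str:
    # Insert the variant label before the file extension, mirroring how
    # from_pretrained(..., variant=...) resolves variant weight files,
    # e.g. "model.safetensors" + "fp8" -> "model.fp8.safetensors".
    stem, ext = base.rsplit(".", 1)
    return f"{stem}.{variant}.{ext}"


print(variant_filename("model.safetensors", "fp8"))  # model.fp8.safetensors
```

So the fp8 checkpoint would need to be named model.fp8.safetensors for variant="fp8" to find it.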
