Chunked Encoding Error

#106
by mattjarjoura - opened

Hi there,

Just wondering if anyone else has experienced the same issue when trying to load the model. I tried running the provided sample script and am receiving the following error on this line:

model = AutoModelForCausalLM.from_pretrained("microsoft/Florence-2-large", torch_dtype=torch_dtype, trust_remote_code=True).eval().to(device)
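For reference, the surrounding setup in my script follows the sample from the model card (a sketch of what I'm running; I'm assuming the standard example, so device and torch_dtype are defined like this):

```python
import torch
from transformers import AutoModelForCausalLM

# Standard setup from the Florence-2 sample: use the GPU if one is
# available, with fp16 on GPU and fp32 on CPU.
device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32

# This is the call that fails partway through downloading the weights.
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Florence-2-large",
    torch_dtype=torch_dtype,
    trust_remote_code=True,
).eval().to(device)
```

The failure happens while the weights are being downloaded, and the error it raises is: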

Error during conversion: ChunkedEncodingError(ProtocolError('Response ended prematurely'))
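From what I can tell, ChunkedEncodingError comes from the requests library and usually means the HTTP connection dropped mid-download, so this looks like a network/CDN hiccup rather than a problem with the model itself. As a workaround I'm considering pre-fetching the repo with huggingface_hub, which can resume interrupted downloads, and then loading from the local path (a sketch, untested on my side; device and torch_dtype as above):

```python
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM

# Download (or resume downloading) all repo files into the local HF cache.
# Re-running this after a dropped connection picks up where it left off.
local_path = snapshot_download(repo_id="microsoft/Florence-2-large")

# Point from_pretrained at the already-downloaded files so no network
# access is needed during model loading.
model = AutoModelForCausalLM.from_pretrained(
    local_path,
    torch_dtype=torch_dtype,
    trust_remote_code=True,
).eval().to(device)
```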

Any help with this would be super appreciated!
