Error when loading AutoProcessor

#3
by Nitesh-95 - opened

First of all, many congratulations!

I was trying to test the model locally, and I got an error while loading the AutoProcessor object:
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)

I got this error:
TypeError: Received a PatramImageProcessor for argument image_processor, but a ImageProcessingMixin was expected.
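For reference, the minimal snippet that reproduces it is roughly this (a sketch with the imports; model_id is just the path/hub ID of the checkpoint I downloaded):

from transformers import AutoProcessor

# model_id is a placeholder for the checkpoint path / hub ID.
# trust_remote_code=True is needed because PatramImageProcessor is defined
# in the model repo itself, not inside the transformers library.
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
# The call above is the line that raises the TypeError shown earlier.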

What could be the reason for the above error?

BharatGen AI org
edited Jun 7

Hello @Nitesh-95. Thank you, it means a lot.

Have you installed the proper requirements for your machine? I have tested the code in the Model Card, and it worked for me. Could you make sure the requirements are installed correctly? If your machine supports CUDA (NVIDIA GPUs), please try installing the packages listed below and let us know; there is also a quick environment check sketched after the list.

huggingface_hub
torch
transformers
tensorflow
einops
accelerate>=0.26.0
torchvision
datasets
torchaudio
gradio
requests
Pillow 
pymupdf
pytesseract
pydantic==2.10.6
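Once those are installed, a quick sanity check like this (just a sketch, not part of the Model Card) should confirm that the versions and the GPU backend are being picked up:

# Quick environment sanity check (a sketch; adjust to your setup).
import torch
import transformers

print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())         # NVIDIA GPUs
print("MPS available:", torch.backends.mps.is_available())  # Apple silicon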

Hey @hrithiksagar-tih, it's working. It seems it was more of an environment issue.

Also, I am using an Apple M3 Max chip.

If it is possible (not required), could you share a pyproject.toml or requirements.txt file for the project, assuming an NVIDIA GPU?

BharatGen AI org

Glad it's working.

For use on an Apple silicon chip, please follow PyTorch's official installation guide, and the same for Transformers.

I have shared the requirements.txt file contents in the previous reply. Please follow that.
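If it helps, picking the right device on Apple silicon versus an NVIDIA machine can be done with a small helper like this (just a sketch, not from the Model Card):

import torch

def pick_device() -> torch.device:
    """Pick CUDA on NVIDIA GPUs, MPS on Apple silicon, otherwise CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
print("Using device:", device)
# After loading the model as shown in the Model Card, move it with: model.to(device)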

hrithiksagar-tih changed discussion status to closed
