Hugging Face card

#2
by arminygh - opened

Hi,
Is there a Hugging Face model card showing how this model can be used for inference with Python?

I recently tried running inference with Transformers but got the following error:
raise ValueError(
ValueError: The checkpoint you are trying to load has model type llava_llama but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

I used AutoModelForCausalLM to load the model.

Efficient-Large-Model org

Hi @arminygh, you will need to install the VILA repo to run inference with this model. The `llava_llama` architecture is defined there, not in the Transformers library, which is why AutoModelForCausalLM cannot resolve it. You can refer to https://github.com/NVlabs/VILA?tab=readme-ov-file#inference
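A minimal setup sketch of what "install the VILA repo" typically involves. The `pip install -e .` step is an assumption that the repo ships a standard installable package; the linked README is the authoritative source for the exact, up-to-date steps.

```shell
# Sketch: clone the VILA repo and install it so its custom
# architectures (e.g. llava_llama) are importable.
git clone https://github.com/NVlabs/VILA.git
cd VILA

# Assumed standard editable install; check the README for the
# actual setup procedure (it may use a setup script instead).
pip install -e .
```

After installation, the repo's own inference entry points (see the README section linked above) should be used instead of `AutoModelForCausalLM`.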
