Unable to load and run model using vLLM

#5
by hailexiao - opened

When I try to load and run the model using vLLM with the command

```
vllm serve "mohbattharani/LLaVA-Chef"
```

I receive the following error:

```
TypeError: Special token pad_token has to be either str or AddedToken but got: <class 'int'>
```

I noticed that in special_tokens_map.json, the value for the "pad_token" key is the integer -100. Should it be a string instead, since the error says a str or AddedToken is expected?
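For what it's worth, -100 is commonly used as the label *ignore index* in loss computation, not as a pad token, so it may have ended up in the config by mistake. One possible local workaround (a sketch, not an official fix) is to patch the cached special_tokens_map.json so "pad_token" holds a string before serving; the filename below is real, but the `"<pad>"` replacement token is an assumption that should match whatever token the model's tokenizer actually defines:

```python
import json

# Path to the file inside the locally cached model snapshot --
# adjust to wherever huggingface_hub downloaded the model.
path = "special_tokens_map.json"

# Minimal stand-in for the shipped file, where "pad_token" is
# the integer -100 instead of a string token.
tokens = {"pad_token": -100}
with open(path, "w") as f:
    json.dump(tokens, f)

# Special tokens must be a str (or an AddedToken mapping);
# replace the integer with a string pad token.
with open(path) as f:
    tokens = json.load(f)
if isinstance(tokens.get("pad_token"), int):
    tokens["pad_token"] = "<pad>"  # assumed token; verify against the tokenizer
with open(path, "w") as f:
    json.dump(tokens, f)

print(tokens["pad_token"])
```

After patching the cached copy, pointing `vllm serve` at the local snapshot directory instead of the hub ID should avoid re-downloading the broken file.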
