vllm version
#2
by baiall - opened
Which vLLM version is compatible with this model? I'm currently on 0.8.5.post1, but it doesn't work (I have run it successfully with other LLM models). It reports the following issue:
Could you try our example? (vanilla vLLM)
https://github.com/float16-cloud/examples/tree/main/official/spot/vllm-offline-inference-typhoon-ocr-7b