Does vLLM 0.8.4 support this quantized model?

#1
by traphix - opened

Does vLLM 0.8.4 support this quantized model?

Red Hat AI org

Yes, it does.
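For reference, a minimal sketch of serving the checkpoint offline with vLLM 0.8.4; the repo ID below is a placeholder for this model's actual Hugging Face ID. vLLM normally reads the quantization config from the checkpoint, so no extra quantization flag should be needed:

```python
from vllm import LLM, SamplingParams

# Placeholder repo ID -- replace with this model's actual Hugging Face ID.
# vLLM detects the quantization scheme from the checkpoint's config and
# selects the matching kernels automatically.
llm = LLM(model="RedHatAI/<this-model>")

params = SamplingParams(temperature=0.7, max_tokens=64)
outputs = llm.generate(["What is vLLM?"], params)
print(outputs[0].outputs[0].text)
```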
