
MLX / MPS users are out of luck and can't use this model with vLLM

#4 opened by kronosprime

vLLM doesn't support Apple machines; support was first requested back in October 2023: https://github.com/vllm-project/vllm/issues/1441

They closed the issue. NVIDIA is stingy about VRAM, while Apple gives us unified memory, so we can build machines with 512 GB of RAM all usable by the GPU.

Is there another way to load Pixtral without vLLM?

Thanks
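For what it's worth, one possible route on a Mac is the Hugging Face Transformers path rather than vLLM: there is a community conversion of Pixtral (`mistral-community/pixtral-12b`) that loads as a LLaVA-style model, and PyTorch's MPS backend can run it on Apple Silicon. A rough sketch follows; the repo id, the chat-template message shape, and the helper names are my assumptions, not something verified against this exact checkpoint:

```python
# Sketch: running Pixtral on Apple Silicon via Transformers + MPS,
# assuming the community conversion "mistral-community/pixtral-12b".
MODEL_ID = "mistral-community/pixtral-12b"  # assumed repo id


def build_chat(prompt: str, image_url: str) -> list:
    """Build a chat message list in the interleaved image+text format
    that multimodal chat templates in Transformers expect."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image", "url": image_url},
                {"type": "text", "text": prompt},
            ],
        }
    ]


def describe_image(prompt: str, image_url: str, model_id: str = MODEL_ID) -> str:
    """Load the model and generate a reply. Heavy: downloads the full
    12B checkpoint, so this is only called explicitly, never at import."""
    # Lazy imports so build_chat() stays usable without torch installed.
    import torch
    from transformers import AutoProcessor, LlavaForConditionalGeneration

    device = "mps" if torch.backends.mps.is_available() else "cpu"
    processor = AutoProcessor.from_pretrained(model_id)
    model = LlavaForConditionalGeneration.from_pretrained(
        model_id, torch_dtype=torch.bfloat16
    ).to(device)

    inputs = processor.apply_chat_template(
        build_chat(prompt, image_url),
        add_generation_prompt=True,
        tokenize=True,
        return_dict=True,
        return_tensors="pt",
    ).to(device)
    out = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    return processor.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

llama.cpp and the MLX ecosystem (`mlx-vlm`) are other options people have used for Pixtral-family models on Macs, though I haven't checked which quantizations of this particular model are available there.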

Any updates? Wanna run this on my Mac!
