does this work with ollama?
#4 by chovyfu - opened
How do I pull it?
No, ollama does not support MLX models. LM Studio does, but this model will not work there yet either, because it is not yet supported by llama.cpp or MLX. The developers of those inference engines must add support for this model before it can be used in LM Studio or ollama.
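For reference, once MLX support for a model lands, it can typically be run directly with the mlx-lm package rather than through LM Studio or ollama. This is only a sketch: it assumes an Apple Silicon Mac, and the repo id is a placeholder since the exact model is not named here.

```shell
# Sketch: running an MLX model directly with mlx-lm (Apple Silicon only).
# <org>/<model> is a placeholder for the actual Hugging Face repo id.
pip install mlx-lm
mlx_lm.generate --model <org>/<model> --prompt "Hello"
```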