SmolVLM2 Support

#156
by PlayAI


When will SmolVLM2 be supported?

When llama.cpp adds support, but they seem to be in no hurry: they recently added support for Phi-4 but still not for SmolVLM.

Two related issues on GitHub:
https://github.com/ggerganov/llama.cpp/issues/10877
https://github.com/ggerganov/llama.cpp/issues/11682

There is also a PR on GitHub with fixes to enable running LLaVA, MobileVLM, MiniCPM-V 2.6, SmolVLM, and eventually Llama 3.2 Vision. (You're waiting for this PR to be merged into llama.cpp, eventually.)
https://github.com/ggml-org/llama.cpp/pull/11292
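
In the meantime, SmolVLM2 can be run directly with transformers instead of llama.cpp. Here is a minimal sketch, assuming a recent transformers release with SmolVLM2 support and a CUDA GPU; the checkpoint name follows the published SmolVLM2 repos and the image URL is a placeholder:

```python
import torch
from transformers import AutoProcessor, AutoModelForImageTextToText

# One of the SmolVLM2 checkpoints on the Hub.
model_id = "HuggingFaceTB/SmolVLM2-2.2B-Instruct"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(
    model_id, torch_dtype=torch.bfloat16
).to("cuda")

# Chat-style input: an image plus a text prompt.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/image.png"},  # placeholder URL
            {"type": "text", "text": "Describe this image."},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device, dtype=torch.bfloat16)

generated_ids = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```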
