Error: llama runner process has terminated: error loading model: missing tensor 'output.weight'

#1
by dbmuller - opened

Error: llama runner process has terminated: error loading model: missing tensor 'output.weight'
llama_load_model_from_file: failed to load model

I had the same problem when I tried converting it.

Also, I take it this might need a newer build of llama.cpp, and maybe won't work in Ollama yet?

DevQuasar org

I've tested it with a llama.cpp build from the branch I've shared on the model card:

[Screenshot: 2025-02-27 at 9.53.19 PM]

At the model page https://ollama.com/library/phi4-mini:3.8b there is the following note:

Note: this model requires Ollama 0.5.13 which is currently in pre-release.

I can confirm that updating to the latest version of Ollama fixes this issue :)
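For anyone hitting this, a quick way to tell whether your install is old enough to be affected is to compare your `ollama --version` output against the required 0.5.13 using `sort -V`. This is just a sketch; substitute your actual installed version for the example value:

```shell
required="0.5.13"
installed="0.5.7"   # example value; replace with the output of `ollama --version`

# sort -V orders version strings numerically (so 0.5.7 < 0.5.13);
# if the installed version sorts first and differs, an upgrade is needed
lowest=$(printf '%s\n%s\n' "$installed" "$required" | sort -V | head -n1)
if [ "$lowest" = "$installed" ] && [ "$installed" != "$required" ]; then
  echo "upgrade needed"
else
  echo "ok"
fi
```

Note that plain lexicographic comparison would get this wrong ("0.5.13" < "0.5.7" as strings), which is why `sort -V` is used.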
