error loading model: done_getting_tensors: wrong number of tensors; expected 292, got 291

#5
by quantumalchemy - opened

error loading model: done_getting_tensors: wrong number of tensors; expected 292, got 291

You need to update llama.cpp, or the frontend you use to interact with it.
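For anyone building llama.cpp from source, updating is a pull and a rebuild. A minimal sketch (paths and CMake flags are illustrative, assuming a source checkout):

```shell
# Update a local llama.cpp source checkout and rebuild (illustrative paths)
cd llama.cpp
git pull origin master

# llama.cpp builds with CMake; Release config for usable performance
cmake -B build
cmake --build build --config Release
```

If you use a frontend (LM Studio, llamafile, ollama, etc.) instead of a local build, you are waiting on that project to ship a release with the updated llama.cpp bundled.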

same problem here, using OpenWebUI

@raphaelfontes Try to update OpenWebUI. I had the same issue with LM Studio but they released a new version yesterday with an updated llama.cpp that fixed it.

Sadly, the backend used in OpenWebUI is ollama, which has not been updated yet. The original Llama 3.1 works, but not this one.

@Isaak-Carter
There shouldn't be any problem with the latest ollama v0.3.0 or the pre-release v0.3.1; at least they load the model correctly.

The model is available here:
https://ollama.com/mannix/llama3.1-8b-abliterated
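For reference, fetching and running that build through ollama is just (commands assume ollama is installed and the daemon is running):

```shell
# Pull the community build from the ollama registry and start an interactive session
ollama pull mannix/llama3.1-8b-abliterated
ollama run mannix/llama3.1-8b-abliterated
```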

Yeah, it's OK with the latest ollama, but I need llamafile (which uses llama.cpp). Hope they update soon; they usually do.

Still facing the same error: "wrong number of tensors; expected 292, got 291".

@Neonvarun Hey, I am also facing the same error. Were you able to resolve it? I fine-tuned the llama3.1-70B-Instruct model, merged it with the base, converted it to GGUF format using llama.cpp/convert_hf_to_gguf.py, and hosted it on ollama. I got this error: Error: llama runner process has terminated: error loading model: done_getting_tensors: wrong number of tensors; expected 724, got 723
llama_load_model_from_file: exception loading model.
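The conversion and hosting steps looked roughly like this (all paths and names are placeholders, and `--outtype f16` is an assumption; the key point is that convert_hf_to_gguf.py must come from an up-to-date llama.cpp checkout that knows about the Llama 3.1 tensor layout):

```shell
# Convert the merged HF checkpoint to GGUF (placeholder paths)
python llama.cpp/convert_hf_to_gguf.py ./merged-llama3.1-70b \
    --outfile llama3.1-70b-ft.gguf \
    --outtype f16

# Register the GGUF with ollama via a minimal Modelfile
printf 'FROM ./llama3.1-70b-ft.gguf\n' > Modelfile
ollama create my-llama3.1-70b-ft -f Modelfile
```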

Any help is appreciated. Thanks!

@Rmote6603 Sorry, I couldn't resolve it. Some are saying it's an issue with llama.cpp not being updated, but it still wasn't working for me.
I gave up and used Mistral and Llama 3.0 for my project.
