Serving in Ollama

#1 by vadimkantorov - opened

Hi!

Could you please advise how to write a Modelfile for serving this model in Ollama?

Any specific TEMPLATE needed? Any other configs?

Thank you! :)

So far this is what I do:

# Clone without downloading the LFS blobs, then fetch them and move the objects into place of the pointer files
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/AI-MO/Kimina-Prover-Preview-Distill-7B /home/user/Kimina-Prover-Preview-Distill-7B
cd /home/user/Kimina-Prover-Preview-Distill-7B
git lfs fetch
git lfs ls-files -l | while read -r SHA DASH FILEPATH; do mv ".git/lfs/objects/${SHA:0:2}/${SHA:2:2}/$SHA" "$FILEPATH"; done
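
FWIW, if moving the objects out of .git/lfs by hand isn't essential, I think git lfs pull (fetch plus checkout) does the same job in one step, though it keeps a second copy of the weights under .git/lfs:

# Alternative sketch: let git-lfs download the blobs and replace the pointer files itself
cd /home/user/Kimina-Prover-Preview-Distill-7B
git lfs pull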

echo "FROM /home/user/Kimina-Prover-Preview-Distill-7B" > Modelfile
echo "PARAMETER use_mmap true" >> Modelfile

ollama create Kimina-Prover-Preview-Distill-7B -f Modelfile
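
And then to sanity-check the served model (the prompt is only an example), either via the CLI or the HTTP API on the default port 11434:

ollama run Kimina-Prover-Preview-Distill-7B "Prove in Lean 4 that for all natural numbers n, n + 0 = n."

curl http://localhost:11434/api/generate -d '{
  "model": "Kimina-Prover-Preview-Distill-7B",
  "prompt": "Prove in Lean 4 that for all natural numbers n, n + 0 = n."
}'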
