Loading with AutoModelForCausalLM.from_pretrained
#2
by
AKQuestSage
Hi, great work! I tried to load it using the following code:

```python
AutoModelForCausalLM.from_pretrained(
    model_path_or_repo_id="bartowski",
    model_file="Mistral-7B-Instruct-v0.3-IQ4_XS.gguf",
    model_type="mistral",
    gpu_layers=20,
    hf=True,
    context_length=4096,
)
```

but it returns the following error:

```
RuntimeError: Failed to create LLM 'mistral' from 'Mistral-7B-Instruct-v0.3-IQ4_XS.gguf'
```
I'd appreciate your help if you have any ideas about this error. Thank you.
What library are you using? You're probably best off using llama-cpp-python.
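For what it's worth, the keyword arguments in the snippet above (`model_path_or_repo_id`, `model_file`, `gpu_layers`, `hf`) match the ctransformers API, which is no longer maintained and does not recognize newer llama.cpp quantization types such as IQ4_XS, which may explain the `Failed to create LLM` error. A rough llama-cpp-python equivalent might look like the sketch below; the repo id is an assumption (the original snippet passed only the `bartowski` namespace), so double-check it against the actual model page:

```python
# Sketch: loading the same GGUF with llama-cpp-python instead of ctransformers.
# Assumes `pip install llama-cpp-python` (built with GPU support if you want
# layer offloading) and `pip install huggingface_hub` for the download helper.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    # Assumed full repo id -- verify on the Hub before using.
    repo_id="bartowski/Mistral-7B-Instruct-v0.3-GGUF",
    filename="Mistral-7B-Instruct-v0.3-IQ4_XS.gguf",
    n_gpu_layers=20,  # counterpart of ctransformers' gpu_layers
    n_ctx=4096,       # counterpart of ctransformers' context_length
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

`Llama.from_pretrained` downloads the file from the Hub and caches it; if you already have the `.gguf` locally, construct `Llama(model_path="...", n_gpu_layers=20, n_ctx=4096)` directly instead.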