runtime error
Exit code: 1. Reason: the tokenizer and model files (tokenizer_config.json, vocab.txt, tokenizer.json, added_tokens.json, special_tokens_map.json, 1_Pooling/config.json, model.safetensors) all downloaded to 100%; the failure occurs afterwards:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 36, in <module>
    llm = LlamaCpp(
  File "/usr/local/lib/python3.10/site-packages/langchain_core/load/serializable.py", line 125, in __init__
    super().__init__(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/pydantic/main.py", line 214, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for LlamaCpp
  Value error, Could not load Llama model from path: /models/MaziyarPanahi/BioMistral-7B-GGUF. Received error Model path does not exist: /models/MaziyarPanahi/BioMistral-7B-GGUF [type=value_error, input_value={'model_path': '/models/M...: None, 'grammar': None}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.10/v/value_error
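The validation error says the path passed as `model_path` does not exist on disk: `/models/MaziyarPanahi/BioMistral-7B-GGUF` looks like a Hugging Face repo id, but LangChain's `LlamaCpp` wrapper expects a path to an existing local `.gguf` file. A minimal sketch of one way to fix this, assuming `huggingface_hub` is installed and that the repo contains a quantized file (the filename below is an assumption, not taken from the log):

```python
import os

def check_model_path(model_path: str) -> str:
    """Validate the model path up front, mirroring the check LlamaCpp performs."""
    if not os.path.exists(model_path):
        raise FileNotFoundError(f"Model path does not exist: {model_path}")
    return model_path

def resolve_gguf_path(repo_id: str, filename: str) -> str:
    """Download a GGUF file from the Hub (cached locally) and return its path."""
    # Lazy import so the helper above works even without huggingface_hub.
    from huggingface_hub import hf_hub_download
    local_path = hf_hub_download(repo_id=repo_id, filename=filename)
    return check_model_path(local_path)

# Hypothetical usage -- the quantization filename is an assumption:
# model_path = resolve_gguf_path(
#     "MaziyarPanahi/BioMistral-7B-GGUF",   # repo id from the traceback
#     "BioMistral-7B.Q4_K_M.gguf",          # assumed GGUF filename in that repo
# )
# llm = LlamaCpp(model_path=model_path, n_ctx=2048)
```

Passing the resolved local file path instead of the bare repo id should avoid the pydantic `ValidationError` raised in `app.py` line 36.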