Runtime error

Exit code: 1. Reason:

[download progress bars elided: a 9.09M file, special_tokens_map.json, config.json, and model.safetensors (1.18G) all finished downloading]

You don't have a GPU available to load the model, the inference will be slow because of weight unpacking

Traceback (most recent call last):
  File "/home/user/app/app.py", line 21, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 282, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4470, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4909, in _load_pretrained_model
    disk_offload_index, cpu_offload_index = _load_state_dict_into_meta_model(
  File "/usr/local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 735, in _load_state_dict_into_meta_model
    file_pointer = safe_open(shard_file, framework="pt", device=tensor_device)
safetensors_rust.SafetensorError: device disk is invalid
