runtime error

Exit code: 1. Reason:

model.safetensors: 100%|█████████▉| 11.8G/11.8G [00:38<00:00, 303MB/s]
/home/user/.local/lib/python3.10/site-packages/psutil/__init__.py:2017: RuntimeWarning: available memory stats couldn't be determined and was set to 0
  ret = _psplatform.virtual_memory()
generation_config.json: 100%|██████████| 142/142 [00:00<00:00, 60.8kB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 6, in <module>
    model = T5ForConditionalGeneration.from_pretrained(model_name, device_map="auto")
  File "/home/user/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3917, in from_pretrained
    dispatch_model(model, **device_map_kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/accelerate/big_modeling.py", line 447, in dispatch_model
    raise ValueError(
ValueError: You are trying to offload the whole model to the disk. Please use the `disk_offload` function instead.
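
The psutil RuntimeWarning above is the likely trigger: with available memory reported as 0, accelerate's device_map="auto" plans to place every weight on disk, and dispatch_model refuses that layout. Below is a minimal sketch of two possible workarounds, assuming the Space actually has usable CPU RAM; model_name stands for whatever app.py assigns, and the 16GiB budget and "offload" folder name are illustrative values, not taken from the log.

import torch
from transformers import T5ForConditionalGeneration

model_name = "..."  # the checkpoint app.py uses; its value is not shown in the log

# Option 1: drop device_map="auto" and load the model directly onto the CPU.
# torch_dtype=torch.float16 (optional) roughly halves the memory needed.
model = T5ForConditionalGeneration.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
)

# Option 2: keep device_map="auto" but give accelerate an explicit memory
# budget so it does not rely on the zeroed psutil reading, plus a folder
# for anything that still has to spill to disk.
model = T5ForConditionalGeneration.from_pretrained(
    model_name,
    device_map="auto",
    max_memory={"cpu": "16GiB"},   # assumed budget; match the actual hardware
    offload_folder="offload",
)

The error message's own suggestion, accelerate's disk_offload, is also an option, but it keeps all weights on disk and reloads them layer by layer at each forward pass, which is typically too slow for serving.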
