Runtime error

Exit code: 1. Reason:

    tokenizer_config.json: 100%|██████████| 1.34k/1.34k [00:00<00:00, 6.45MB/s]
    tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 137MB/s]
    tokenizer.json: 100%|██████████| 1.80M/1.80M [00:00<00:00, 13.6MB/s]
    special_tokens_map.json: 100%|██████████| 552/552 [00:00<00:00, 3.24MB/s]

    Traceback (most recent call last):
      File "/home/user/app/app.py", line 8, in <module>
        tokenizer = AutoTokenizer.from_pretrained(model_name)
      File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 1013, in from_pretrained
        return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
      File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2025, in from_pretrained
        return cls._from_pretrained(
      File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2278, in _from_pretrained
        tokenizer = cls(*init_inputs, **init_kwargs)
      File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 154, in __init__
        super().__init__(
      File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 108, in __init__
        raise ValueError(
    ValueError: Cannot instantiate this tokenizer from a slow version. If it's based on sentencepiece, make sure you have sentencepiece installed.
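The tokenizer files all downloaded fine; the failure comes from LlamaTokenizerFast, which in this code path builds the fast tokenizer by converting the slow sentencepiece one, and the sentencepiece package is not installed in the container. The usual fix on a Hugging Face Space, assuming the app declares its dependencies in a requirements.txt, is to add the package there (protobuf is a common companion on this slow-to-fast conversion path) and let the Space rebuild:

```
# requirements.txt -- sentencepiece is the package the error asks for;
# protobuf is frequently needed alongside it for the conversion.
transformers
sentencepiece
protobuf
```

With the dependency installed, the load at app.py line 8 should succeed. As a minimal sketch of a friendlier failure mode (model_name below is a hypothetical placeholder; its real value never appears in the log), the app can check for sentencepiece up front instead of surfacing the opaque ValueError from deep inside transformers:

```python
import importlib.util

from transformers import AutoTokenizer

# Hypothetical placeholder -- the actual model name is not shown in the log.
model_name = "meta-llama/Llama-2-7b-hf"

# Llama tokenizers are converted from a slow sentencepiece model when the
# fast tokenizer cannot be loaded directly, so fail early and clearly if
# the dependency is missing.
if importlib.util.find_spec("sentencepiece") is None:
    raise RuntimeError(
        "sentencepiece is not installed; add it to requirements.txt "
        "and rebuild the Space."
    )

tokenizer = AutoTokenizer.from_pretrained(model_name)
```

Upgrading transformers/tokenizers is sometimes suggested as well, since newer versions can load the prebuilt tokenizer.json directly, but installing sentencepiece is the fix the error message itself points to.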
