Runtime Error
#2 opened by nviraj
Just a heads-up that the app is failing with this message:
```
Downloading shards:   0%|          | 0/4 [00:00<?, ?it/s]
Downloading shards:  25%|███       | 1/4 [00:03<00:11, 3.72s/it]
Downloading shards:  50%|█████     | 2/4 [00:07<00:07, 3.83s/it]
Downloading shards:  75%|████████  | 3/4 [00:11<00:03, 3.79s/it]
Downloading shards: 100%|██████████| 4/4 [00:12<00:00, 2.82s/it]
Downloading shards: 100%|██████████| 4/4 [00:12<00:00, 3.18s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 57, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3775, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 1066, in __init__
    self.model = LlamaModel(config)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 845, in __init__
    [LlamaDecoderLayer(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 845, in <listcomp>
    [LlamaDecoderLayer(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 632, in __init__
    self.self_attn = LLAMA_ATTENTION_CLASSES[config._attn_implementation](config=config, layer_idx=layer_idx)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 306, in __init__
    self.rotary_emb = LlamaRotaryEmbedding(config=self.config)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 110, in __init__
    self.rope_type = config.rope_scaling.get("rope_type", config.rope_scaling["type"])
KeyError: 'type'
```
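For context on why this raises: the failing line reads the checkpoint's `rope_scaling` config, and newer Llama 3.1-style configs apparently carry a `rope_type` key but no legacy `type` key. Since Python evaluates `dict.get()`'s default argument eagerly, the `config.rope_scaling["type"]` fallback is looked up even when `rope_type` is present, which is exactly the `KeyError` above. A minimal sketch of that failure mode, with hypothetical config values:

```python
# Sketch of the failure mode, not the transformers source: dict.get()
# evaluates its default argument eagerly, so the fallback lookup runs
# even when the primary key exists.
rope_scaling = {"rope_type": "llama3", "factor": 8.0}  # hypothetical config values

try:
    # Same shape as the line in modeling_llama.py that raised above.
    rope_type = rope_scaling.get("rope_type", rope_scaling["type"])
except KeyError as err:
    print(f"KeyError: {err}")  # KeyError: 'type', despite 'rope_type' being present

# A lazy fallback sidesteps the eager lookup:
rope_type = rope_scaling.get("rope_type") or rope_scaling.get("type", "default")
print(rope_type)  # llama3
```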
This has been fixed by upgrading transformers to release 4.43.1. Thanks for reporting.
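If you hit the same error in your own Space, requiring a patched release in `requirements.txt` (e.g. `transformers>=4.43.1`) should resolve it. As a sketch (assuming 4.43.1 is the first patched version, per the note above), a fail-fast guard at the top of `app.py` makes the cause obvious instead of crashing mid-load:

```python
# Fail fast with a clear message if the environment still has a
# pre-fix transformers install (assumes the fix shipped in 4.43.1).
from packaging import version
import transformers

assert version.parse(transformers.__version__) >= version.parse("4.43.1"), (
    f"transformers {transformers.__version__} predates the rope_scaling fix; "
    "pin transformers>=4.43.1 in requirements.txt"
)
```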
ysharma changed discussion status to closed