
cannot load model

#4
by g1a5535 - opened

{
"name": "ValueError",
"message": "The checkpoint you are trying to load has model type cohere2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.",
"stack": "---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
File ~/.local/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py:1034, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1033 try:
-> 1034 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1035 except KeyError:

File ~/.local/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py:736, in _LazyConfigMapping.__getitem__(self, key)
735 if key not in self._mapping:
--> 736 raise KeyError(key)
737 value = self._mapping[key]

KeyError: 'cohere2'

During handling of the above exception, another exception occurred:

ValueError Traceback (most recent call last)
Cell In[5], line 3
1 model_id = "CohereForAI/c4ai-command-r7b-arabic-02-2025"
2 tokenizer = AutoTokenizer.from_pretrained(model_id)
----> 3 model = AutoModelForCausalLM.from_pretrained(model_id)

File ~/.local/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py:526, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
523 if kwargs.get("quantization_config", None) is not None:
524 _ = kwargs.pop("quantization_config")
--> 526 config, kwargs = AutoConfig.from_pretrained(
527 pretrained_model_name_or_path,
528 return_unused_kwargs=True,
529 trust_remote_code=trust_remote_code,
530 code_revision=code_revision,
531 _commit_hash=commit_hash,
532 **hub_kwargs,
533 **kwargs,
534 )
536 # if torch_dtype=auto was passed here, ensure to pass it on
537 if kwargs_orig.get("torch_dtype", None) == "auto":

File ~/.local/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py:1036, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1034 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1035 except KeyError:
-> 1036 raise ValueError(
1037 f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
1038 "but Transformers does not recognize this architecture. This could be because of an "
1039 "issue with the checkpoint, or because your version of Transformers is out of date."
1040 )
1041 return config_class.from_dict(config_dict, **unused_kwargs)
1042 else:
1043 # Fallback: use pattern matching on the string.
1044 # We go from longer names to shorter names to catch roberta before bert (for instance)

ValueError: The checkpoint you are trying to load has model type cohere2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date."
}
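The traceback shows two chained exceptions: the registry lookup `CONFIG_MAPPING[config_dict["model_type"]]` raises a `KeyError` because `cohere2` is not registered, and `AutoConfig.from_pretrained` catches it and re-raises the user-facing `ValueError`. A minimal sketch of that mechanism, using a toy registry rather than the real `CONFIG_MAPPING`:

```python
# Toy stand-in for transformers' architecture registry; "cohere2" is
# deliberately absent, as it would be in an older transformers build.
CONFIG_MAPPING = {"cohere": object, "llama": object}

def resolve(model_type: str):
    """Mimic AutoConfig's lookup: KeyError is caught and re-raised
    as a ValueError, so both appear in the traceback."""
    try:
        return CONFIG_MAPPING[model_type]
    except KeyError:
        raise ValueError(
            f"The checkpoint you are trying to load has model type {model_type} "
            "but Transformers does not recognize this architecture."
        )

try:
    resolve("cohere2")
except ValueError as e:
    # Implicit exception chaining: the original KeyError survives as __context__.
    print(type(e.__context__).__name__)  # → KeyError
```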

The transformers library is already on the latest version.
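Note that the traceback paths point at a Python 3.8 environment (`~/.local/lib/python3.8/...`). A plausible explanation, sketched under two assumptions not stated in the thread: (a) `cohere2` support first shipped in transformers v4.48.0, and (b) recent transformers releases no longer support Python 3.8, so on a 3.8 interpreter `pip install -U transformers` resolves "latest" to an older release that predates `cohere2`. The helper below is hypothetical, not a transformers API:

```python
# Hypothetical version gate, assuming v4.48.0 is the first transformers
# release that registers the "cohere2" architecture.
def supports_cohere2(version: str) -> bool:
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) >= (4, 48)

print(supports_cohere2("4.46.3"))  # → False: such a build cannot load cohere2
print(supports_cohere2("4.48.0"))  # → True
```

If this is the cause, upgrading the interpreter to Python 3.9+ and then reinstalling transformers should let pip pick up a release that recognizes `cohere2`.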
