Could you please help me fix this error? With huggingface transformers (4.45) I get this ValueError:

ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.minlik.docllm-yi-6b.3c185962853128319a5b66f579a07d733827645c.configuration_llama.LlamaConfig'> and you passed <class 'transformers.models.llama.configuration_llama.LlamaConfig'>. Fix one of those so they match!

If I downgrade to 4.28/4.3, I get a transformers.cache_utils not found error instead. I'm using a venv as well.
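For reference, a hypothetical minimal sketch of the kind of loading code that can raise this mismatch (the original call isn't shown in the thread): the config is built from the library's own LlamaConfig, while the model class is resolved from the repo's remote code, so the two config classes disagree.

```python
from transformers import AutoModelForCausalLM, LlamaConfig

model_id = "minlik/docllm-yi-6b"  # repo name taken from the error message

# Hypothetical reproduction: the config below is the built-in transformers
# class, while trust_remote_code=True resolves the model class from the
# repo's own modeling code, whose config_class is the remote LlamaConfig.
config = LlamaConfig.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,  # built-in config vs. remote model class -> ValueError
    trust_remote_code=True,
)
```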
Try setting trust_remote_code=True when loading the model.
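A minimal sketch of what that looks like, assuming the repo's auto_map maps both AutoConfig and AutoModelForCausalLM to its custom classes (if it only maps the model class, the mismatch can persist):

```python
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "minlik/docllm-yi-6b"

# Loading the config with trust_remote_code=True lets it resolve to the
# repo's own configuration_llama.LlamaConfig, matching the remote model
# class's config_class, so the consistency check passes.
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    trust_remote_code=True,
)
```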
It's already set to True, but I still get the same error.
The issue is fixed now; you can try it again.