AttributeError: 'InternVLChatConfig' object has no attribute 'llm_config'
#14, opened by soniajoseph
Hello,
When I try to load the model as in the demo, I get the following error. Any ideas? Thanks a lot!
Model loading:
import torch
from transformers import AutoModel, AutoTokenizer

model_path = 'OpenGVLab/InternVideo2_5_Chat_8B'
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda().to(torch.bfloat16)
Error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[2], line 15
12 model_path = 'OpenGVLab/InternVideo2_5_Chat_8B'
14 tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
---> 15 model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda().to(torch.bfloat16)
18 model = AutoModel.from_pretrained("OpenGVLab/InternVideo2_5_Chat_8B", trust_remote_code=True)
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:547, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
544 if kwargs.get("quantization_config", None) is not None:
545 _ = kwargs.pop("quantization_config")
--> 547 config, kwargs = AutoConfig.from_pretrained(
548 pretrained_model_name_or_path,
549 return_unused_kwargs=True,
550 code_revision=code_revision,
551 _commit_hash=commit_hash,
552 **hub_kwargs,
553 **kwargs,
554 )
556 # if torch_dtype=auto was passed here, ensure to pass it on
557 if kwargs_orig.get("torch_dtype", None) == "auto":
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1187, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1183 config_class = get_class_from_dynamic_module(
1184 class_ref, pretrained_model_name_or_path, code_revision=code_revision, **kwargs
1185 )
1186 config_class.register_for_auto_class()
-> 1187 return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
1188 elif "model_type" in config_dict:
1189 try:
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/configuration_utils.py:587, in PretrainedConfig.from_pretrained(cls, pretrained_model_name_or_path, cache_dir, force_download, local_files_only, token, revision, **kwargs)
581 if config_dict["model_type"] != cls.model_type:
582 logger.warning(
583 f"You are using a model of type {config_dict['model_type']} to instantiate a model of type "
584 f"{cls.model_type}. This is not supported for all configurations of models and can yield errors."
585 )
--> 587 return cls.from_dict(config_dict, **kwargs)
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/configuration_utils.py:775, in PretrainedConfig.from_dict(cls, config_dict, **kwargs)
772 for key in to_remove:
773 kwargs.pop(key, None)
--> 775 logger.info(f"Model config {config}")
776 if return_unused_kwargs:
777 return config, kwargs
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/configuration_utils.py:807, in PretrainedConfig.__repr__(self)
806 def __repr__(self):
--> 807 return f"{self.__class__.__name__} {self.to_json_string()}"
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/configuration_utils.py:919, in PretrainedConfig.to_json_string(self, use_diff)
907 """
908 Serializes this instance to a JSON string.
909
(...)
916 `str`: String containing all the attributes that make up this configuration instance in JSON format.
917 """
918 if use_diff is True:
--> 919 config_dict = self.to_diff_dict()
920 else:
921 config_dict = self.to_dict()
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/configuration_utils.py:827, in PretrainedConfig.to_diff_dict(self)
824 default_config_dict = PretrainedConfig().to_dict()
826 # get class specific config dict
--> 827 class_config_dict = self.__class__().to_dict() if not self.has_no_defaults_at_init else {}
829 serializable_config_dict = {}
831 # Only serialize values that differ from the default config,
832 # except always keep the 'config' attribute.
File ~/.cache/huggingface/modules/transformers_modules/OpenGVLab/InternVideo2_5_Chat_8B/bff14a1dd7647d9e81414bb91c59fdc983104f34/configuration_internvl_chat.py:85, in InternVLChatConfig.to_dict(self)
83 output = copy.deepcopy(self.__dict__)
84 output['vision_config'] = self.vision_config.to_dict()
---> 85 output['llm_config'] = self.llm_config.to_dict()
86 output['model_type'] = self.__class__.model_type
87 output['use_backbone_lora'] = self.use_backbone_lora
File ~/my_conda_envs/myenv/lib/python3.10/site-packages/transformers/configuration_utils.py:209, in PretrainedConfig.__getattribute__(self, key)
207 if key != "attribute_map" and key in super().__getattribute__("attribute_map"):
208 key = super().__getattribute__("attribute_map")[key]
--> 209 return super().__getattribute__(key)
AttributeError: 'InternVLChatConfig' object has no attribute 'llm_config'
same issue
bump - same issue
@soniajoseph @hurayarah @3thn
Hey guys, there is a simple way to fix this.
Simple Solution for InternVL Configuration Issue
(Tested with transformers v4.52.4)
BTW, I've tested inference with this model after making the modifications below, and everything works fine. Don't worry :D
Required Modifications
Add Initialization (configuration_internvl_chat.py:49)
self.vision_config = InternVisionConfig(**vision_config)
self.llm_config = None  # Initialize llm_config to prevent AttributeError
Add Null Check (configuration_internvl_chat.py:85)
output['llm_config'] = self.llm_config.to_dict() if self.llm_config is not None else {}
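If you'd rather not hand-edit the cached module, here is a minimal sketch of applying both edits to a local copy of the repo and then loading from it. The two replacement targets are taken verbatim from the traceback and snippet above; everything else (snapshot_download, the local_dir name, the assumed 8-space indentation inside __init__) is my own scaffolding, so check the file if a replacement doesn't take effect.

```python
from pathlib import Path

import torch
from huggingface_hub import snapshot_download
from transformers import AutoModel, AutoTokenizer

# Download a real (non-symlinked) copy of the repo so the file can be edited safely.
local_dir = Path(snapshot_download("OpenGVLab/InternVideo2_5_Chat_8B", local_dir="InternVideo2_5_Chat_8B"))
cfg_file = local_dir / "configuration_internvl_chat.py"
src = cfg_file.read_text()

# Edit 1 (around line 49): always define self.llm_config right after vision_config.
# Assumes the standard 8-space indentation inside __init__.
src = src.replace(
    "self.vision_config = InternVisionConfig(**vision_config)",
    "self.vision_config = InternVisionConfig(**vision_config)\n"
    "        self.llm_config = None  # Initialize llm_config to prevent AttributeError",
)
# Edit 2 (line 85): guard the .to_dict() call against None.
src = src.replace(
    "output['llm_config'] = self.llm_config.to_dict()",
    "output['llm_config'] = self.llm_config.to_dict() if self.llm_config is not None else {}",
)
cfg_file.write_text(src)

# Load from the patched local copy instead of the Hub id.
tokenizer = AutoTokenizer.from_pretrained(str(local_dir), trust_remote_code=True)
model = AutoModel.from_pretrained(str(local_dir), trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()
```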
Root Cause Analysis
When executing:
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda().to(torch.bfloat16)
The following occurs:
- The Hugging Face framework downloads and parses configuration_internvl_chat.py
- During config serialization in to_diff_dict (transformers/configuration_utils.py:816-822):

  config_dict = self.to_dict()
  # Get the default config dict (from a fresh PretrainedConfig instance)
  default_config_dict = PretrainedConfig().to_dict()
  # get class specific config dict
  class_config_dict = self.__class__().to_dict() if not self.has_no_defaults_at_init else {}
- Key issue: self.llm_config is never set while class_config_dict is being generated, because the freshly constructed config is created with llm_config=None
- Without explicit initialization, this triggers an AttributeError when .to_dict() is called (a minimal self-contained reproduction is sketched below)
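To see the mechanism in isolation, here is a small toy (a hypothetical ToyChatConfig, not the real InternVL code) that reproduces the same failure with a recent transformers install: an attribute that is only assigned when a sub-config is passed breaks to_diff_dict(), because to_diff_dict() re-instantiates the class with no arguments.

```python
import copy

from transformers import LlamaConfig, PretrainedConfig


class ToyChatConfig(PretrainedConfig):
    """Hypothetical config that mimics the problematic pattern."""
    model_type = "toy_chat"

    def __init__(self, llm_config=None, **kwargs):
        super().__init__(**kwargs)
        if llm_config is not None:
            # Only assigned when a sub-config is actually provided...
            self.llm_config = LlamaConfig(**llm_config)
        # ...so a default-constructed ToyChatConfig() has no llm_config at all.

    def to_dict(self):
        output = copy.deepcopy(self.__dict__)
        output["llm_config"] = self.llm_config.to_dict()  # fails for the default-constructed instance
        output["model_type"] = self.__class__.model_type
        return output


cfg = ToyChatConfig(llm_config={"hidden_size": 64})
# repr() -> to_json_string() -> to_diff_dict() -> self.__class__().to_dict()
print(cfg)  # AttributeError: 'ToyChatConfig' object has no attribute 'llm_config'
```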
Why the Fix Works
- The initialization ensures self.llm_config always exists (even as None)
- The null check prevents method calls on None while maintaining the expected dictionary structure (see the sketch after this list)
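Applying the same two changes to the toy class above shows the effect: the no-argument instance that to_diff_dict() creates now carries llm_config = None and serializes it as an empty dict instead of raising.

```python
import copy

from transformers import LlamaConfig, PretrainedConfig


class FixedToyChatConfig(PretrainedConfig):
    """Hypothetical config with the two fixes applied."""
    model_type = "toy_chat_fixed"

    def __init__(self, llm_config=None, **kwargs):
        super().__init__(**kwargs)
        self.llm_config = None  # Fix 1: the attribute always exists, even with default arguments
        if llm_config is not None:
            self.llm_config = LlamaConfig(**llm_config)

    def to_dict(self):
        output = copy.deepcopy(self.__dict__)
        # Fix 2: never call .to_dict() on None; fall back to an empty dict
        output["llm_config"] = self.llm_config.to_dict() if self.llm_config is not None else {}
        output["model_type"] = self.__class__.model_type
        return output


cfg = FixedToyChatConfig(llm_config={"hidden_size": 64})
print(cfg)  # serializes cleanly: the default-constructed copy just contributes an empty llm_config
```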