Model run failed with torch._dynamo error

#53
by pb-1 - opened

Loading checkpoint shards: 100%|██████████| 2/2 [00:39<00:00, 19.87s/it]
The following generation flags are not valid and may be ignored: ['top_p', 'top_k']. Set TRANSFORMERS_VERBOSITY=info for more details.
Traceback (most recent call last):
File "/root/data/vjuicefs_hz_cv_enhance_v1/72162227/multimodal_data/script/gemma_model_test/gemma_test.py", line 61, in
generation = model.generate(**inputs, max_new_tokens=100, do_sample=False)
File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 2597, in generate
result = self._sample(
File "/usr/local/lib/python3.10/dist-packages/torch/_dynamo/exc.py", line 172, in unimplemented
raise Unsupported(msg)
torch._dynamo.exc.Unsupported: call_method GetAttrVariable(UserDefinedObjectVariable(AttentionInterface), _global_mapping) getitem (ConstantVariable(str),) {}

from user code:
File "/usr/local/lib/python3.10/dist-packages/transformers/models/gemma3/modeling_gemma3.py", line 1345, in forward
outputs = self.model(
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forwa

Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information

You can suppress this exception and fall back to eager by setting:
import torch._dynamo
torch._dynamo.config.suppress_errors = True

Hi @pb-1 ,

Welcome to the Google Gemma family of open models, and thanks for reporting the issue. I have reproduced it locally. It is caused by a version incompatibility between the installed libraries/packages. To avoid it, pin triton==3.2.0, use the latest release of the transformers library, and add the following code:

!pip install -q -U transformers accelerate bitsandbytes
!pip install -U triton==3.2.0

import torch._dynamo
torch._dynamo.config.suppress_errors = True
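Since the fix hinges on which library versions are installed, it can help to print them before rerunning the script. A minimal sketch using only the standard library (the package list is just a suggestion; adjust it to your environment):

```python
import importlib.metadata as md

def installed_versions(pkgs):
    """Return a dict mapping each package name to its installed version, or None if missing."""
    out = {}
    for pkg in pkgs:
        try:
            out[pkg] = md.version(pkg)
        except md.PackageNotFoundError:
            out[pkg] = None
    return out

if __name__ == "__main__":
    for name, ver in installed_versions(
        ["transformers", "torch", "triton", "accelerate", "bitsandbytes"]
    ).items():
        print(f"{name}: {ver or 'not installed'}")
```

After pinning triton==3.2.0 as above, the output should show that version for triton; if a package prints "not installed", rerun the corresponding pip command.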

Please refer to the following gist file. If you need any further assistance, please reach out; I'm more than happy to help.

Thanks.
