TypeError: argument of type 'NoneType' is not iterable when running model
#9
opened by jafthab
Hey!
I used the code snippet below to test out the model, but I am met with the error at the bottom. I would kindly like to seek your assistance in case I am doing something wrong. Thanks!
Code snippet
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "sarvamai/sarvam-translate"

# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_name, cache_dir='/app/Models')
model = AutoModelForCausalLM.from_pretrained(model_name, cache_dir='/app/Models').to('cuda:0')

# Translation task
tgt_lang = "tamil"
input_txt = "Be the change you wish to see in the world."

# Chat-style message prompt
messages = [
    {"role": "system", "content": f"Translate the text below to {tgt_lang}."},
    {"role": "user", "content": input_txt}
]

# Apply chat template to structure the conversation
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

# Tokenize and move input to model device
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Generate the output
generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=1024,
    do_sample=True,
    temperature=0.01,
    num_return_sequences=1
)

output_ids = generated_ids[0][len(model_inputs.input_ids[0]):].tolist()
output_text = tokenizer.decode(output_ids, skip_special_tokens=True)

print("Input:", input_txt)
print("Translation:", output_text)
Transformers version
transformers==4.52.4
Error
Traceback (most recent call last):
File "/app/./model_scripts/sarvam.py", line 7, in <module>
model = AutoModelForCausalLM.from_pretrained(model_name, cache_dir='/app/Models').to('cuda:0')
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/modeling_utils.py", line 309, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/modeling_utils.py", line 4508, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 1246, in __init__
self.model = Gemma3Model(config)
^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 1003, in __init__
language_model = AutoModel.from_config(config=config.text_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 440, in from_config
return model_class._from_config(config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/modeling_utils.py", line 309, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2077, in _from_config
model = cls(config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 556, in __init__
self.post_init()
File "/opt/conda/envs/jaw/lib/python3.11/site-packages/transformers/modeling_utils.py", line 1969, in post_init
if v not in ALL_PARALLEL_STYLES:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable
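For context on what the last frame means: the check `if v not in ALL_PARALLEL_STYLES` raises this TypeError when `ALL_PARALLEL_STYLES` itself is `None`, because the right-hand side of the `in` operator must be a container. A minimal reproduction of that Python behavior (the variable name here just mirrors the traceback; that it actually ends up `None` in your environment is an assumption):

```python
# Minimal reproduction of the TypeError in the last traceback frame.
# `x in container` requires `container` to be iterable or to implement
# __contains__; if the right-hand side is None, Python raises this error.
ALL_PARALLEL_STYLES = None  # hypothetical: mirrors the name in the traceback

try:
    "colwise" in ALL_PARALLEL_STYLES
except TypeError as e:
    print(e)  # argument of type 'NoneType' is not iterable
```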
Hey @jafthab
I just ran the same code you pasted above, with the same transformers version (4.52.4) and the same Python version (3.11) you're using.
I don't face any issues. I get this output:
Input: Be the change you wish to see in the world.
Translation: உலகில் நீங்கள் காண விரும்பும் மாற்றமாக நீங்களே இருங்கள்.
You can try the suggestion reported by someone who faced the same error:
https://github.com/huggingface/transformers/issues/38340#issuecomment-2907735645
If that still doesn't solve your problem, please try vLLM.
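If the linked suggestion doesn't apply directly, it may also help to report your environment versions; a quick way to check them (the upgrade step is a guess based on similar reports, not a confirmed fix):

```shell
# Inspect the installed versions relevant to this traceback
python -c "import torch, transformers; print(torch.__version__, transformers.__version__)"

# Untested sketch: upgrading the stack may resolve it, per similar reports
pip install --upgrade torch transformers
```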
GokulNC changed discussion status to closed
GokulNC changed discussion status to open