fail model load

#2
by byeolcardi - opened

I got this error. How can I fix it?

Traceback (most recent call last):
  File "/data/aidt_dev/aidt_diag_llm/aidt_diag_llm_courseware/courseware_unsloth_gemma3_4b_finetunning.py", line 84, in
    model, tokenizer = FastLanguageModel.from_pretrained(
  File "/ssd_data/anaconda3/envs/aidt_diag_llm/lib/python3.10/site-packages/unsloth/models/loader.py", line 193, in from_pretrained
    raise RuntimeError(autoconfig_error or peft_error)
RuntimeError: The checkpoint you are trying to load has model type gemma3_text but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command pip install --upgrade transformers. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command pip install git+https://github.com/huggingface/transformers.git
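Before retrying the load, it can help to confirm that the installed Transformers is actually new enough for this architecture. The sketch below compares the installed version against a minimum; the 4.50.0 threshold is an assumption about when gemma3_text was registered, so check the release notes for the exact version.

```python
from importlib.metadata import PackageNotFoundError, version

# Assumption: gemma3_text was registered around Transformers 4.50.0.
MIN_GEMMA3 = (4, 50, 0)

def parse_version(v: str) -> tuple:
    """Turn '4.50.0.dev0' into (4, 50, 0), ignoring non-numeric suffixes."""
    parts = []
    for piece in v.split(".")[:3]:
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def supports_gemma3() -> bool:
    """True if the installed transformers looks new enough for gemma3_text."""
    try:
        return parse_version(version("transformers")) >= MIN_GEMMA3
    except PackageNotFoundError:
        return False  # transformers is not installed at all
```

If this returns False even after `pip install --upgrade transformers`, the environment may be picking up a stale install, which is worth ruling out before installing from source.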

Unsloth AI org


What are you using to run the model?

I ran pip install git+https://github.com/huggingface/transformers.git, and the following error message appeared. Does this mean it's not supported yet?

Traceback (most recent call last):
  File "/data/aidt_dev/aidt_diag_llm/gemma3_api/courseware_unsloth_gemma3_4b_finetunning.py", line 88, in
    model, tokenizer = FastLanguageModel.from_pretrained(
  File "/ssd_data/anaconda3/envs/aidt_diag_llm/lib/python3.10/site-packages/unsloth/models/loader.py", line 273, in from_pretrained
    raise NotImplementedError(
NotImplementedError: Unsloth: unsloth/gemma-3-1b-it-GGUF not supported yet!
Maybe you're doing vision finetuning? Please use FastVisionModel instead!
Otherwise, make an issue to https://github.com/unslothai/unsloth!
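This second traceback points at the repo name rather than the Transformers version: unsloth/gemma-3-1b-it-GGUF is a GGUF export for llama.cpp-style runtimes, while FastLanguageModel.from_pretrained expects a regular safetensors checkpoint. A minimal guard, under the assumption that the matching safetensors repo is just the name without the -GGUF suffix:

```python
def is_gguf_repo(model_name: str) -> bool:
    # GGUF repos are quantized exports for llama.cpp / Ollama,
    # not finetunable checkpoints for Unsloth's FastLanguageModel.
    return model_name.endswith("-GGUF") or model_name.endswith(".gguf")

def safetensors_name(model_name: str) -> str:
    # Assumption: the matching safetensors repo drops the "-GGUF" suffix,
    # e.g. unsloth/gemma-3-1b-it-GGUF -> unsloth/gemma-3-1b-it.
    return model_name.removesuffix("-GGUF")

model_name = "unsloth/gemma-3-1b-it-GGUF"
if is_gguf_repo(model_name):
    model_name = safetensors_name(model_name)
print(model_name)  # -> unsloth/gemma-3-1b-it
```

Passing the safetensors repo name to FastLanguageModel.from_pretrained should avoid this NotImplementedError; for vision finetuning, the error message itself points to FastVisionModel instead.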

Unsloth AI org

Hi there @byeolcardi, we re-uploaded all of them, so hopefully the issue is resolved.

We also released a Q8XL version, which upcasts some layers to bf16, if you're interested.
