Could not find GemmaForCausalLM neither in <module 'transformers.models.gemma'
Why am I getting this error?
Update your transformers to 4.38 or 4.38.1 (for PyTorch 2.1).
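Since the fix hinges on running at least transformers 4.38, a quick way to sanity-check the installed version is to compare it against that minimum. This is a minimal sketch; the helper names and the 4.38.0 threshold are assumptions taken from the advice above, not part of any library API.

```python
# Sketch: check whether an installed transformers version string is new
# enough for Gemma support (>= 4.38.0, per the advice in this thread).

def version_tuple(version: str) -> tuple:
    """Turn a dotted version string like '4.38.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def supports_gemma(installed: str, minimum: str = "4.38.0") -> bool:
    """Return True if the installed version meets the assumed minimum."""
    return version_tuple(installed) >= version_tuple(minimum)

print(supports_gemma("4.37.2"))  # → False (too old for Gemma)
print(supports_gemma("4.38.1"))  # → True
```

In practice you would pass `transformers.__version__` (or the `Version:` field from `pip show transformers`) as the `installed` argument.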
Name: transformers
Version: 4.38.1
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: [email protected]
License: Apache 2.0 License
Location: C:\Users\Administrator\AppData\Local\NVIDIA\MiniConda\envs\gcn\Lib\site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
I have already tried all the specified versions, but it still doesn't work.
Why doesn't it work no matter how I update? Is anyone else experiencing the same issue?
Make sure to run pip install -U transformers
and restart your environment. If you're in Colab, it will keep using the older version if you don't restart.
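After restarting, it can help to confirm which install the running interpreter is actually resolving, since a stale copy on the path causes exactly this kind of "upgraded but still failing" symptom. A hedged sketch using the standard library (the demo probes stdlib and nonexistent names so it runs anywhere; in your environment you would probe "transformers"):

```python
import importlib.util

def module_location(name: str):
    """Return the filesystem location a module resolves to, or None if not importable."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# Demo with names available everywhere; substitute "transformers" to see
# which site-packages directory the interpreter actually picked up.
print(module_location("json"))          # path to the stdlib json module
print(module_location("not_a_module"))  # → None (nothing installed under that name)
```

If the printed location does not match the `Location:` shown by `pip show transformers`, the interpreter is loading a different copy than the one you upgraded.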
Hi @chenwei1984, could you please try again after updating to the latest transformers version (!pip install -U transformers) and let us know if the issue still persists? Thank you.