How to fix "ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported." when my tokenizer_config.json already has "tokenizer_class": "LlamaTokenizer"?
#6 opened by lishuangxiu-nuannuan
How can I fix the error `ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.` when my tokenizer_config.json already contains `"tokenizer_class": "LlamaTokenizer"`? I'm loading the tokenizer with:
```python
from transformers import LlamaTokenizer

tokenizer = LlamaTokenizer.from_pretrained(
    "fireballoon/baichuan-vicuna-chinese-7b", use_fast=False
)
```
I fixed this problem. It happens because the installed version of transformers is too old to include `LlamaTokenizer`. Run `pip install --upgrade transformers` and the error goes away.
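A quick way to confirm this is the cause before reinstalling is to compare the installed transformers version against the release that introduced LLaMA support. The sketch below assumes 4.28.0 as that minimum (check the transformers release notes for your setup); the version-parsing helper is a simplified, hypothetical one that only handles plain `x.y.z` strings.

```python
def needs_upgrade(installed: str, minimum: str = "4.28.0") -> bool:
    """Return True when `installed` predates the (assumed) minimum
    transformers release that ships LlamaTokenizer.

    Simplified parser: handles plain "x.y.z" versions only, not
    pre-release or dev suffixes.
    """
    parse = lambda v: tuple(int(part) for part in v.split(".")[:3])
    return parse(installed) < parse(minimum)


# In practice you would pass transformers.__version__ here:
print(needs_upgrade("4.26.1"))  # old enough to raise the ValueError
print(needs_upgrade("4.31.0"))
```

If `needs_upgrade(...)` returns True for your environment, upgrading transformers as described above should resolve the error.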
fireballoon changed discussion status to closed