Getting a JSONDecodeError: Invalid \escape when calling the AutoProcessor.
It works with the stable transformers release 4.42.0, so it isn't blocking, but the same call raises JSONDecodeError: Invalid \escape with transformers 4.43.0.dev0. I believe it's related to the chat_template.json that was added to the model repo recently.
The AutoProcessor seems to grab the LlavaNextImageProcessor correctly.
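For context on where the error comes from: JSON only allows a backslash to be followed by ", \, /, b, f, n, r, t, or uXXXX, so any other backslash sequence in chat_template.json makes json.loads fail exactly like this. A minimal sketch of the failure mode (the \d sequence below is just an illustrative assumption, not the actual content of the template):

import json

# A template string containing a bare backslash followed by "d", which is
# not a valid JSON escape sequence, so decoding fails.
bad_json = '{"chat_template": "tokens like \\d are not valid JSON escapes"}'

try:
    json.loads(bad_json)
except json.JSONDecodeError as err:
    # Prints something like: Invalid \escape: line 1 column 32 (char 31)
    print(err)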
CODE:
from transformers import LlavaForConditionalGeneration, AutoTokenizer, AutoProcessor

model_id = "llava-hf/llava-v1.6-mistral-7b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)  # fails on 4.43.0.dev0
ERROR:
JSONDecodeError Traceback (most recent call last)
in <cell line: 17>()
15
16 tokenizer = AutoTokenizer.from_pretrained(model_id)
---> 17 processor = AutoProcessor.from_pretrained(model_id)
5 frames
/usr/lib/python3.10/json/decoder.py in raw_decode(self, s, idx)
351 """
352 try:
--> 353 obj, end = self.scan_once(s, idx)
354 except StopIteration as err:
355 raise JSONDecodeError("Expecting value", s, err.value) from None
JSONDecodeError: Invalid \escape: line 2 column 474 (char 475)
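In case it helps anyone debugging this before the fix lands, here is a sketch of how to inspect the raw file at the offset the error reports (hf_hub_download is the standard huggingface_hub helper; the slice bounds are just chosen around char 475 from the traceback):

from huggingface_hub import hf_hub_download

# Download chat_template.json from the model repo and print the characters
# around the offset mentioned in the error (char 475).
path = hf_hub_download("llava-hf/llava-v1.6-mistral-7b-hf", "chat_template.json")
with open(path, encoding="utf-8") as f:
    text = f.read()

print(repr(text[460:490]))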
Hi,
I'm unable to reproduce your issue. Could you retry?
>>> from transformers import AutoProcessor
>>> processor = AutoProcessor.from_pretrained("llava-hf/llava-v1.6-mistral-7b-hf")
chat_template.json: 100%|████████████████████████████████████████| 588/588 [00:00<00:00, 242kB/s]
Thanks nielsr, but RaushanTurganbay just fixed it.