ModuleNotFoundError: No module named 'transformers_modules.BytedanceDouyinContent.SAIL-VL-1'
I'm encountering a problem when trying to use your model for inference. I created a fresh conda env (python=3.10.16) and installed the required packages (pip3 install einops transformers timm). But when I run:
import torch
from transformers import AutoModel, AutoTokenizer

path = "BytedanceDouyinContent/SAIL-VL-1.5-2B"
model = AutoModel.from_pretrained(
    path,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True, use_fast=False)
I get the following error:
ModuleNotFoundError: No module named 'transformers_modules.BytedanceDouyinContent.SAIL-VL-1'
Any idea why this might be happening?
Yes, I am also having the same problem.
Sorry for the late reply. This is caused by the dots in the model folder name: transformers turns the folder name into a Python module path, so each dot is treated as a package separator. Renaming the folder to remove any dot (.), e.g., from 1.5 -> 1d5, solves the issue.
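For context, here is a minimal sketch of why the dot breaks the import. It assumes transformers' usual dynamic-module naming (the local folder name becomes part of a module path under transformers_modules); it is an illustration of the mapping, not the actual transformers internals.

```python
# transformers loads remote code by converting the repo/folder name into a
# Python module path. Python treats every dot as a package separator.
repo = "BytedanceDouyinContent/SAIL-VL-1.5-2B"

module_path = "transformers_modules." + repo.replace("/", ".")
print(module_path)
# -> transformers_modules.BytedanceDouyinContent.SAIL-VL-1.5-2B
# Python stops at the first dot inside the folder name and looks for a
# package named 'transformers_modules.BytedanceDouyinContent.SAIL-VL-1',
# which is exactly the missing module in the error above.

# Workaround: strip dots from the local folder name, e.g. 1.5 -> 1d5.
safe_name = repo.split("/")[-1].replace(".", "d")
print(safe_name)
# -> SAIL-VL-1d5-2B
```

If you download the weights yourself (e.g. with huggingface_hub's snapshot_download and its local_dir argument), pointing local_dir at a dot-free directory name and loading from that path should avoid the problem entirely.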
After renaming, I now only see this warning: "You are using a model of type internvl_chat to instantiate a model of type sailvl. This is not supported for all configurations of models and can yield errors."