issue loading the model

#1
by adrienchaton - opened

Hi and thanks for sharing your model!

I have successfully installed Evo's dependencies from the github source.
Everything seems to import fine (e.g. from evo import Evo), but when I try to load the model, I get this error:

model = AutoModelForCausalLM.from_pretrained("togethercomputer/evo-1-131k-base", trust_remote_code=True)
--> ModuleNotFoundError: No module named 'transformers_modules.togethercomputer.evo-1-131k-base.9562f3fdc38f09b92594864c5e98264f1bfbca33.tokenizer'

It points to the last commit hash from the evo-1-131k-base model repository.

I have tried different versions of transformers, followed the different instructions (e.g. using the evo package instead of transformers), cached the checkpoint files locally, ... but I still get the same error when I try to load the model.

Do you have any idea how to fix this, please?

Together org

This is strange, what version of transformers are you using?

Thanks for your reply; let's follow up here: https://github.com/evo-design/evo/issues/53

I tried both transformers==4.36.2 and transformers==4.39.3, which resulted in the same error (full trace posted on the GitHub issue).

This looks similar to an error I ran into; it helped to load the tokenizer with trust_remote_code=True first, before loading the model.
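
For reference, a minimal sketch of that workaround (using the standard transformers auto classes; untested against this particular checkpoint, so treat it as an assumption):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "togethercomputer/evo-1-131k-base"

# Loading the tokenizer first (with trust_remote_code=True) downloads and registers
# the repo's remote code before the model is instantiated.
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)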

same issue here
