Official model split demo code cannot run

#11
by HarrytheOrange - opened

At the top, the demo only imports: `from transformers import AutoModel, AutoTokenizer`

However, the model split function calls `config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)`, so `AutoConfig` also needs to be imported. Did you test the code before releasing it?

Furthermore, in
```python
def split_model(model_name):
    device_map = {}
    world_size = torch.cuda.device_count()
    config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
```

Where is `model_path` passed into the function? The parameter is named `model_name`, but the body references `model_path`, which is undefined inside the function.
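For reference, here is a minimal sketch of how the function could take the path as its parameter and build a layer-to-GPU map. This is illustrative only, not the official logic: the even split, the stubbed `num_layers` and `world_size` values, and the `path/to/model` placeholder are all assumptions. In the real demo, `num_layers` would come from `AutoConfig.from_pretrained(model_path, trust_remote_code=True)` and `world_size` from `torch.cuda.device_count()`.

```python
def split_model(model_path, num_layers=32, world_size=4):
    """Evenly assign transformer layers to GPUs (illustrative sketch only).

    In the real demo, num_layers and world_size would be derived from the
    model config and torch.cuda.device_count(); they are stubbed here so
    the sketch runs without GPUs or a model download.
    """
    device_map = {}
    # Ceiling division so all layers are covered even when the split is uneven.
    layers_per_gpu = (num_layers + world_size - 1) // world_size
    for layer_idx in range(num_layers):
        gpu = min(layer_idx // layers_per_gpu, world_size - 1)
        device_map[f"model.layers.{layer_idx}"] = gpu
    return device_map

# Example: 8 layers over 2 GPUs -> layers 0-3 on GPU 0, layers 4-7 on GPU 1.
device_map = split_model("path/to/model", num_layers=8, world_size=2)
```

The key fix is that the path is an explicit parameter, so the name used inside the body matches the function signature.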

Thanks
