runtime error
Exit code: 1. Reason: .weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py:2025: UserWarning: for decoder.lm_head.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
Loading checkpoint shards:   0%|          | 0/7 [00:00<?, ?it/s]
Loading checkpoint shards:  14%|█▍        | 1/7 [00:05<00:31, 5.28s/it]
Loading checkpoint shards: 100%|██████████| 7/7 [00:05<00:00, 1.33it/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 47, in <module>
    tokenizer, model, image_processor, context_len = load_pretrained_model(args.model_path, args.model_base, model_name, args.load_8bit, args.load_4bit)
  File "/home/user/app/eagle/model/builder.py", line 99, in load_pretrained_model
    model = EagleLlamaForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3917, in from_pretrained
    dispatch_model(model, **device_map_kwargs)
  File "/usr/local/lib/python3.10/site-packages/accelerate/big_modeling.py", line 364, in dispatch_model
    weights_map = OffloadedWeightsLoader(
  File "/usr/local/lib/python3.10/site-packages/accelerate/utils/offload.py", line 150, in __init__
    raise ValueError("Need either a `state_dict` or a `save_folder` containing offloaded weights.")
ValueError: Need either a `state_dict` or a `save_folder` containing offloaded weights.