runtime error
Downloading (…)l-00002-of-00002.bin: 100%|██████████| 4.48G/4.48G [01:35<00:00, 47.1MB/s]
Downloading shards: 100%|██████████| 2/2 [05:06<00:00, 153.39s/it]
The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 13, in <module>
    text_gen_pipeline = pipeline(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model tiiuae/falcon-7b-instruct with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.falcon.modeling_falcon.FalconForCausalLM'>).