Error when running python model.py:
Traceback (most recent call last):
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\utils\_errors.py", line 304, in hf_raise_for_status
response.raise_for_status()
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\hub.py", line 398, in cached_file
resolved_file = hf_hub_download(
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\utils\_validators.py", line 119, in _inner_fn
return fn(*args, **kwargs)
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\file_download.py", line 1403, in hf_hub_download
raise head_call_error
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\file_download.py", line 1261, in hf_hub_download
metadata = get_hf_file_metadata(
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\utils\_validators.py", line 119, in _inner_fn
return fn(*args, **kwargs)
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\file_download.py", line 1674, in get_hf_file_metadata
r = _request_wrapper(
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\file_download.py", line 369, in _request_wrapper
response = _request_wrapper(
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\file_download.py", line 393, in _request_wrapper
hf_raise_for_status(response)
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\utils\_errors.py", line 321, in hf_raise_for_status
raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-66330066-698e93b82b9a2d16194c2067;4dd45065-a957-45a4-bb8b-41e1205dba7f)
Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\bsssu\OneDrive\Desktop\t_t\t_t\model.py", line 7, in <module>
model = AutoModelForCausalLM.from_pretrained(model_name)
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 928, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 631, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 686, in _get_config_dict
resolved_config_file = cached_file(
File "C:\Users\bsssu\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\hub.py", line 416, in cached_file
raise EnvironmentError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
401 Client Error. (Request ID: Root=1-66330066-698e93b82b9a2d16194c2067;4dd45065-a957-45a4-bb8b-41e1205dba7f)
Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
It seems like you have not requested access to Llama 3 via Hugging Face yet.
On the model card there is a form you need to fill out; use the same email address as on your Facebook account in order to get it approved.
I received the same error but I was already granted access.
Same issue here
same issue, can anyone help
Same issue. At least, glad that I am not the only one with this problem.
same issue here
Same here. I received an email however it isn't letting me download it?
Hey all! Please make sure to log in to your environment using huggingface-cli login. You can use huggingface-cli whoami to verify your user. Make sure you're logged in as the right user.
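For what it's worth, both huggingface-cli login and a token environment variable end up attaching the same Authorization header to every Hub request; a missing header is exactly what produces the 401 on a gated repo. A minimal sketch of that resolution order (the helper name and the hf_xxx token value are made up for illustration, not huggingface_hub's actual API):

```python
import os

def build_auth_header(token=None):
    """Resolve a token (explicit argument first, then the HF_TOKEN
    environment variable) and return the Authorization header the Hub
    expects, or an empty dict if no token is available."""
    token = token or os.environ.get("HF_TOKEN")
    if not token:
        return {}
    return {"Authorization": f"Bearer {token}"}

# With no token anywhere, no header is sent, so gated repos answer 401.
os.environ.pop("HF_TOKEN", None)
print(build_auth_header())          # {}
# Once a token is present, the request is authenticated.
print(build_auth_header("hf_xxx"))  # {'Authorization': 'Bearer hf_xxx'}
```

The point is that being *approved* on the model card and being *authenticated* in your environment are two separate requirements; the 401 means the second one is missing.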
This solved my issue. Thank you!
sameeeeeee please grant me access
Facing same issue here...
Same issue.
'huggingface-cli login' can't help me.
Does anyone have any other methods?
I ran huggingface-cli login but got the same error.
@chad0714
Thank you, fixed.
The edit permission on the token was the issue.
It's frustrating, I'm still facing this issue.
- I have the edit permission
- I have used huggingface-cli whoami to verify my user name
- I have set HUGGINGFACE_HUB_TOKEN to my token
I am still having the same issue. I have created a write token.
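One thing worth checking when the variable route fails: the token has to be set in the same shell session that runs Python, or the process never sees it. A minimal sketch with a placeholder token value:

```shell
# Export in the same shell that will run the script (POSIX shells).
# On Windows cmd the equivalent is:  set HF_TOKEN=hf_xxx
export HF_TOKEN=hf_xxx
# python model.py   # would now pick the token up from the environment
echo "$HF_TOKEN"
```

Setting the variable in one terminal and running the script in another (or in an IDE launched earlier) leaves the script unauthenticated.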
huggingface_hub.utils._errors.EntryNotFoundError: 404 Client Error. (Request ID: Root=1-669e1898-13f5755e41da289c55219976;1218340f-1902-41ee-86ae-d5ae02d1473e)
Entry Not Found for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/pytorch_model.bin.
I ran into the same 401 client error. The model access is approved. Token is provided, but still 401. It seems many people have this issue.
I have the same issue, have the access token, and model access is approved, but still 401.
Use this command: !huggingface-cli login
Enter your fine-grained or write token and you'll be good to go.
Note: this is for Colab.
not working for me either, followed instructions above. Any other ideas?
What errors did you get?
Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B-Instruct is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct to ask for access.
I received approval and I'm authenticated.
Hi guys.
You need to grant the access token permission to all repositories, and set the environment variable before use:
import os
os.environ['HF_TOKEN'] = "YOUR_HUGGING_FACE_ACCESS_TOKEN"
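Building on that, a small fail-fast check before loading the model turns the confusing 401 deep inside from_pretrained into a clear error message. This is only a sketch: the function name is made up, and HUGGINGFACE_HUB_TOKEN is the older variable name mentioned earlier in the thread.

```python
import os

def require_hf_token():
    """Return the Hub token from the environment, or fail with a clear
    message instead of a 401 later inside from_pretrained()."""
    token = os.environ.get("HF_TOKEN") or os.environ.get("HUGGINGFACE_HUB_TOKEN")
    if not token:
        raise RuntimeError(
            "No Hugging Face token found: set HF_TOKEN (or run "
            "huggingface-cli login) before loading a gated model such as "
            "meta-llama/Meta-Llama-3-8B."
        )
    return token

# Placeholder value for illustration; use your real access token.
os.environ["HF_TOKEN"] = "YOUR_HUGGING_FACE_ACCESS_TOKEN"
print(require_hf_token())
```

Calling this at the top of model.py means a missing or mis-scoped token is reported before any download is attempted.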
I already have permission to access the model, but I still see the 401 error shown below.
File "/home/canh/anaconda3/envs/llama/lib/python3.11/site-packages/huggingface_hub/utils/_http.py", line 406, in hf_raise_for_status
response.raise_for_status()
File "/home/canh/anaconda3/envs/llama/lib/python3.11/site-packages/requests/models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/8cde5ca8380496c9a6cc7ef3a8b46a0372a1d920/model-00001-of-00004.safetensors
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/canh/llama_download.py", line 7, in <module>
snapshot_download(repo_id=repo_id, local_dir=local_dir, cache_dir=cache_dir)
File "/home/canh/anaconda3/envs/llama/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/canh/anaconda3/envs/llama/lib/python3.11/site-packages/huggingface_hub/_snapshot_download.py", line 290, in snapshot_download
thread_map(
File "/home/canh/anaconda3/envs/llama/lib/python3.11/site-packages/tqdm/contrib/concurrent.py", line 69, in thread_map
return _executor_map(ThreadPoolExecutor, fn, *iterables, **tqdm_kwargs)
I have the same issue, have the access token, and model access is approved, but still 401