Trying to download meta-llama/Llama-2-7b results in an access restricted error. I'm running this from within a Docker container on a remote server. An optional parameter to explicitly specify hf_token would be helpful.
$python3 ggify.py meta-llama/Llama-2-7b
Repo meta-llama/Llama-2-7b does not seem to contain a PyTorch model, but continuing anyway
Responsible-Use-Guide.pdf (1 MB): 30%|███████████████████████████████████████▌ | 3/10 [00:00<00:00, 20.13file/s]
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
response.raise_for_status()
File "/usr/local/lib/python3.10/dist-packages/requests/models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-2-7b/resolve/main/Responsible-Use-Guide.pdf
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/root/ggify/ggify.py", line 313, in<module>main()
File "/root/ggify/ggify.py", line 296, in main
download_repo(repo, dirname)
File "/root/ggify/ggify.py", line 237, in download_repo
huggingface_hub.hf_hub_download(
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1202, in hf_hub_download
return _hf_hub_download_to_local_dir(
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1440, in _hf_hub_download_to_local_dir
_raise_on_head_call_error(head_call_error, force_download, local_files_only)
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1823, in _raise_on_head_call_error
raise head_call_error
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1722, in _get_metadata_or_catch_error
metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1645, in get_hf_file_metadata
r = _request_wrapper(
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 372, in _request_wrapper
response = _request_wrapper(
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 396, in _request_wrapper
hf_raise_for_status(response)
File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py", line 321, in hf_raise_for_status
raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-66fce491-20ec7153448a39ae4964d49c;5af058dd-42f5-47a8-a3b9-ae7de0c9921a)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-7b/resolve/main/Responsible-Use-Guide.pdf.
Access to model meta-llama/Llama-2-7b is restricted. You must have access to it and be authenticated to access it. Please log in.
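For reference, here is a minimal sketch (not the actual ggify.py code; the flag name and helper are illustrative assumptions) of how an optional `--hf-token` argument could be threaded through to `huggingface_hub.hf_hub_download`, which already accepts a `token` parameter. Defaulting to the `HF_TOKEN` environment variable keeps the current behavior when no flag is given.

```python
# Sketch only: shows one way an --hf-token flag could be passed down
# to hf_hub_download. Names here are not taken from ggify.py.
import argparse
import os

import huggingface_hub


def download_file(repo_id: str, filename: str, dirname: str, hf_token):
    # hf_hub_download accepts a `token` argument; passing it explicitly
    # avoids relying on a cached `huggingface-cli login` inside the container.
    return huggingface_hub.hf_hub_download(
        repo_id=repo_id,
        filename=filename,
        local_dir=dirname,
        token=hf_token,
    )


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("repo", help="Hugging Face repo, e.g. meta-llama/Llama-2-7b")
    parser.add_argument(
        "--hf-token",
        default=os.environ.get("HF_TOKEN"),
        help="Hugging Face access token for gated repos (defaults to $HF_TOKEN)",
    )
    args = parser.parse_args()
    # The rest of the tool would pass args.hf_token down to its
    # download logic, e.g. download_file(args.repo, fname, dirname, args.hf_token)


if __name__ == "__main__":
    main()
```

As a workaround until such a flag exists, setting `HF_TOKEN` in the container environment (or running `huggingface-cli login` inside it) should let huggingface_hub authenticate for the gated repo.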