
Cannot download the checkpoints from huggingface

GeorgeMarica opened this issue • 0 comments

Hello, I tried to download the Open-Llama V2 checkpoints from https://huggingface.co/s-JoL/Open-Llama-V2, but the link is no longer available. I tried the same from Python:

tokenizer = AutoTokenizer.from_pretrained("s-JoL/Open-Llama-V2", use_fast=False)

Traceback (most recent call last):
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\huggingface_hub\utils\_errors.py", line 264, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\requests\models.py", line 960, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/s-JoL/Open-Llama-V2/resolve/main/tokenizer_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\transformers\utils\hub.py", line 409, in cached_file
    resolved_file = hf_hub_download(
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\huggingface_hub\utils\_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\huggingface_hub\file_download.py", line 1105, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\huggingface_hub\utils\_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\huggingface_hub\file_download.py", line 1440, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\huggingface_hub\utils\_errors.py", line 306, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-64663c34-6d0e4375452da99b5a32f9d6)
Repository Not Found for url: https://huggingface.co/s-JoL/Open-Llama-V2/resolve/main/tokenizer_config.json.
Please make sure you specified the correct repo_id and repo_type. If you are trying to access a private or gated repo, make sure you are authenticated. Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\IPython\core\interactiveshell.py", line 3361, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "", line 1, in <cell line: 1>
    tokenizer = AutoTokenizer.from_pretrained("s-JoL/Open-Llama-V2", use_fast=False)
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 619, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 463, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "C:\Users\z0047d2j\Miniconda3\lib\site-packages\transformers\utils\hub.py", line 424, in cached_file
    raise EnvironmentError(
OSError: s-JoL/Open-Llama-V2 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.
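As the traceback shows, transformers fetches files via `https://huggingface.co/{repo_id}/resolve/{revision}/{filename}` URLs. A minimal sketch (the `hub_file_url` helper is hypothetical, not part of any library) that rebuilds that URL so the 401/404 can be reproduced directly in a browser or with curl, independent of transformers:

```python
# Hypothetical helper: rebuild the Hub "resolve" URL that transformers
# requests under the hood, so repo availability can be checked directly.
def hub_file_url(repo_id: str, filename: str, revision: str = "main") -> str:
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

# The exact URL from the traceback above:
print(hub_file_url("s-JoL/Open-Llama-V2", "tokenizer_config.json"))
# → https://huggingface.co/s-JoL/Open-Llama-V2/resolve/main/tokenizer_config.json
```

A 401 on this URL (as in the traceback) means the repo was deleted, renamed, or made private/gated, rather than a client-side bug.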

GeorgeMarica · May 18 '23 14:05