transformers
Cannot load "ybelkada/opt-350m-lora" model from PEFT documentation example
This repository is focused on the Hub experience and documentation. If you're facing an issue with a specific library, please open an issue in the corresponding GitHub repo. If you're facing an issue with a specific model or dataset, please open an issue in the corresponding HF repo.
Bug description: Cannot load the "ybelkada/opt-350m-lora" model with the snippet from the documentation example:
from transformers import AutoModelForCausalLM, AutoTokenizer
peft_model_id = "ybelkada/opt-350m-lora"
model = AutoModelForCausalLM.from_pretrained(peft_model_id)
Error
---------------------------------------------------------------------------
HTTPError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py in hf_raise_for_status(response, endpoint_name)
303 try:
--> 304 response.raise_for_status()
305 except HTTPError as e:
(16 intermediate frames hidden)
HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/ybelkada/opt-350m-lora/resolve/main/config.json
The above exception was the direct cause of the following exception:
EntryNotFoundError Traceback (most recent call last)
EntryNotFoundError: 404 Client Error. (Request ID: Root=1-665951ec-3185ad8a46ad858d5db0a021;e7a16693-27a1-4417-8033-76220c7ba041)
Entry Not Found for url: https://huggingface.co/ybelkada/opt-350m-lora/resolve/main/config.json.
The above exception was the direct cause of the following exception:
OSError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
451 if revision is None:
452 revision = "main"
--> 453 raise EnvironmentError(
454 f"{path_or_repo_id} does not appear to have a file named {full_filename}. Checkout "
455 f"'[https://huggingface.co/{path_or_repo_id}/tree/{revision}'](https://huggingface.co/%7Bpath_or_repo_id%7D/tree/%7Brevision%7D') for available files."
OSError: ybelkada/opt-350m-lora does not appear to have a file named config.json. Checkout 'https://huggingface.co/ybelkada/opt-350m-lora/tree/main' for available files.
Describe the expected behaviour: The model is expected to load successfully so the example can be run.
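For context, a minimal sketch of the flow I expect to work, based on the linked documentation section (the prompt and the use of the facebook/opt-350m tokenizer are my assumptions, not part of the documented snippet):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

peft_model_id = "ybelkada/opt-350m-lora"

# Expected: from_pretrained resolves the adapter repo, reads its adapter config,
# and loads the base model with the LoRA weights attached (requires `pip install peft`).
model = AutoModelForCausalLM.from_pretrained(peft_model_id)

# Assumption: the adapter was trained on top of facebook/opt-350m, so its tokenizer applies.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-350m")

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```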
Additional context
- peft package installed successfully
- Code run on Google Colab
- Documentation section URL: https://huggingface.co/docs/transformers/en/peft#load-a-peft-adapter
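A possible cross-check would be the two-step load via the transformers PEFT integration (load the base model first, then attach the adapter); a minimal sketch, assuming facebook/opt-350m is the adapter's base model:

```python
from transformers import AutoModelForCausalLM

base_model_id = "facebook/opt-350m"      # assumed base model for this LoRA adapter
peft_model_id = "ybelkada/opt-350m-lora"

# Load the base model, then attach the adapter weights with the
# transformers PEFT integration (requires `pip install peft`).
model = AutoModelForCausalLM.from_pretrained(base_model_id)
model.load_adapter(peft_model_id)
```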