
How to add a Hugging Face API token

Open ClementCheng0217 opened this issue 9 months ago • 1 comment

Reminder

  • [X] I have read the README and searched the existing issues.

Reproduction

When fine-tuning LLaMA, the model cannot be fetched and an error is raised. Is it possible to supply my own Hugging Face API token, or can models only be loaded from a local path?

```
Traceback (most recent call last):
  File "/root/miniconda3/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/root/miniconda3/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/root/autodl-tmp/LLaMA-Factory-main/src/llmtuner/train/tuner.py", line 33, in run_exp
    run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
  File "/root/autodl-tmp/LLaMA-Factory-main/src/llmtuner/train/sft/workflow.py", line 31, in run_sft
    tokenizer_module = load_tokenizer(model_args)
  File "/root/autodl-tmp/LLaMA-Factory-main/src/llmtuner/model/loader.py", line 53, in load_tokenizer
    tokenizer = AutoTokenizer.from_pretrained(
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 819, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 928, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/utils/hub.py", line 416, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B. 401 Client Error. (Request ID: Root=1-66305bca-61baffb77fda36c2473f3c55;29db056c-bf8f-4bf6-9bab-d4177a8675cc)
```

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json. Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
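For context, the 401 error means `transformers` sent the request without a valid Hugging Face access token. Assuming you have already been granted access to the gated repo on huggingface.co, a minimal sketch of authenticating is to expose a token (the value below is a placeholder, not a real token) via the `HF_TOKEN` environment variable, which recent versions of `transformers`/`huggingface_hub` pick up automatically; the token can also be passed explicitly:

```python
import os

# Placeholder token: replace with your own from
# https://huggingface.co/settings/tokens (requires granted access to the gated repo).
os.environ["HF_TOKEN"] = "hf_your_token_here"

# With HF_TOKEN set, this call authenticates automatically; alternatively,
# pass the token explicitly:
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(
#     "meta-llama/Meta-Llama-3-8B", token=os.environ["HF_TOKEN"]
# )
print(os.environ["HF_TOKEN"])
```

Running `huggingface-cli login` once achieves the same thing by storing the token on disk.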

Expected behavior

No response

System Info

No response

Others

No response

ClementCheng0217 avatar Apr 30 '24 02:04 ClementCheng0217

Llama 3 is gated: you must apply for access before you can download it. Alternatively, you can download it from ModelScope: https://www.modelscope.cn/models/LLM-Research/Meta-Llama-3-8B/summary
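As a sketch of the ModelScope route: the LLaMA-Factory README describes a switch that makes it pull models from the ModelScope hub instead of Hugging Face, so no Hugging Face token is needed (exact behavior may vary by version; check the README for your release):

```shell
# Tell LLaMA-Factory to resolve model IDs against ModelScope instead of
# huggingface.co; then use the ModelScope model ID, e.g.
# --model_name_or_path LLM-Research/Meta-Llama-3-8B
export USE_MODELSCOPE_HUB=1
echo "$USE_MODELSCOPE_HUB"
```

Downloading the model manually and pointing `--model_name_or_path` at the local directory also works.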

codemayq avatar Apr 30 '24 03:04 codemayq