starcoder
How do I use the chat feature?
- Is there a saved checkpoint that I can use to load the chat version of the model?
- Also, I tried fine-tuning the model by following the instructions in the chat folder's README.md. I am getting the error below; any help would be appreciated.
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 259, in hf_raise_for_status
response.raise_for_status()
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/requests/models.py", line 941, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/api/repos/create
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/hf_api.py", line 2185, in create_repo
hf_raise_for_status(r)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 301, in hf_raise_for_status
raise HfHubHTTPError(str(e), response=response) from e
huggingface_hub.utils._errors.HfHubHTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/api/repos/create (Request ID: Root=1-647a7100-1e66ef850974653b0ad5470f)
You don't have the rights to create a model under this namespace
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 259, in hf_raise_for_status
response.raise_for_status()
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/requests/models.py", line 941, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/api/models/lewtun/starchat-alpha
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "train.py", line 348, in <module>
main()
File "train.py", line 257, in main
trainer = Trainer(
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/transformers/trainer.py", line 551, in __init__
self.init_git_repo(at_init=True)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/transformers/trainer.py", line 3532, in init_git_repo
create_repo(repo_name, token=self.args.hub_token, private=self.args.hub_private_repo, exist_ok=True)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 120, in _inner_fn
return fn(*args, **kwargs)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/hf_api.py", line 2193, in create_repo
self.repo_info(repo_id=repo_id, repo_type=repo_type, token=token)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 120, in _inner_fn
return fn(*args, **kwargs)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/hf_api.py", line 1794, in repo_info
return method(
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 120, in _inner_fn
return fn(*args, **kwargs)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/hf_api.py", line 1604, in model_info
hf_raise_for_status(r)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 291, in hf_raise_for_status
raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 404 Client Error. (Request ID: Root=1-647a7100-127bed5713d651f01e5d2b19)
Repository Not Found for url: https://huggingface.co/api/models/lewtun/starchat-alpha.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635862 closing signal SIGTERM
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635863 closing signal SIGTERM
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635864 closing signal SIGTERM
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635865 closing signal SIGTERM
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635866 closing signal SIGTERM
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635867 closing signal SIGTERM
WARNING:torch.distributed.elastic.multiprocessing.api:Sending process 635870 closing signal SIGTERM
ERROR:torch.distributed.elastic.multiprocessing.api:failed (exitcode: 1) local_rank: 0 (pid: 635861) of binary: /home/ec2-user/anaconda3/bin/python
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/bin/torchrun", line 8, in <module>
sys.exit(main())
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 346, in wrapper
return f(*args, **kwargs)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/torch/distributed/run.py", line 794, in main
run(args)
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/torch/distributed/run.py", line 785, in run
elastic_launch(
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/torch/distributed/launcher/api.py", line 134, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
File "/home/ec2-user/anaconda3/lib/python3.8/site-packages/torch/distributed/launcher/api.py", line 250, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Hi @GurpreetSingh97, I think this is a login error. Can you please try running `huggingface-cli login` and then try again?
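In case it helps, here is a minimal sketch (assuming a reasonably recent `huggingface_hub`, and a placeholder token) of doing the same login from Python and checking which namespace your token actually maps to, since the 403 above says the token has no rights to create a repo under that namespace:

```python
# Hedged sketch: log in from Python and inspect the token's namespace.
# "hf_..." is a placeholder for a token with write access.
from huggingface_hub import login, HfApi

login(token="hf_...")             # same effect as `huggingface-cli login`
print(HfApi().whoami()["name"])   # should print your own username, not someone else's
```

If the printed name is not the namespace the trainer is trying to push to, the repo creation will still fail even after logging in.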
Hi @lewtun, I tried that, even with a write token, and it's still giving me the same error.
Maybe you can try setting `push_to_hub` to `False` in config.yaml.
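The traceback shows the Trainer trying to create https://huggingface.co/lewtun/starchat-alpha, which your token cannot write to. The config values ultimately map onto `transformers.TrainingArguments`, so roughly speaking the relevant flags look like the sketch below (the exact keys in the repo's config.yaml may differ; `your-username` is a placeholder):

```python
# Hedged sketch of the relevant TrainingArguments flags, not the repo's actual config.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="starchat-alpha-finetune",
    push_to_hub=False,                      # skip Hub repo creation entirely, or...
    # push_to_hub=True,
    # hub_model_id="your-username/starchat-alpha",  # ...push under your own namespace
)
```

Either disabling `push_to_hub` or pointing `hub_model_id` at a namespace your token owns should avoid the 403 during `init_git_repo`.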