ChatGLM-6B

[BUG/Help] 404 Client Error: Not Found for url: https://huggingface.co/THUDM/chatglm-6b/resolve/main/added_tokens.json

Open tommyjex opened this issue 1 year ago • 0 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
```
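For reference, a minimal sketch of the same call with an explicit revision pinned, as the warning in the log below recommends. The `revision` value here is only a placeholder (a specific commit hash from the model page would normally be used), and pinning it is not confirmed to avoid the 404 reported in this issue.

```python
from transformers import AutoTokenizer

# Sketch only: pass `revision` explicitly, as the transformers warning suggests.
# "main" is a placeholder; a specific commit hash would normally be pinned instead.
tokenizer = AutoTokenizer.from_pretrained(
    "THUDM/chatglm-6b",
    trust_remote_code=True,
    revision="main",
)
```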

Expected Behavior

No response

Steps To Reproduce

```
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Downloading ice_text.model: 100%|██████████| 2.70M/2.70M [01:41<00:00, 26.6kB/s]

HTTPError                                 Traceback (most recent call last)
File ~/anaconda3/envs/ml/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py:259, in hf_raise_for_status(response, endpoint_name)
    258 try:
--> 259     response.raise_for_status()
    260 except HTTPError as e:

File ~/anaconda3/envs/ml/lib/python3.10/site-packages/requests/models.py:1021, in Response.raise_for_status(self)
   1020 if http_error_msg:
->  1021     raise HTTPError(http_error_msg, response=self)

HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/THUDM/chatglm-6b/resolve/main/added_tokens.json
```
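The 404 points at `added_tokens.json`, which may simply not exist in the repository. A minimal sketch (assuming network access and `huggingface_hub` installed) for checking which files the repo actually serves:

```python
from huggingface_hub import list_repo_files

# List the files present in the THUDM/chatglm-6b repo to confirm whether
# added_tokens.json exists at all; a 404 for an absent optional file would
# be consistent with the traceback above.
files = list_repo_files("THUDM/chatglm-6b")
print("added_tokens.json present:", "added_tokens.json" in files)
print(sorted(files))
```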

Environment

- OS: Ubuntu 20.04
- Python: 3.10
- Transformers: 4.26.1
- PyTorch: 2.0.0
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): True

Anything else?

No response

tommyjex · Mar 25, 2023