ChatGLM2-6B

cli_demo.py raises a TypeError after several rounds of conversation

gekrnwmb1864 opened this issue 2 years ago · 2 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

Traceback (most recent call last):
  File "/root/ChatGLM-6B-main/cli_demo.py", line 58, in <module>
    main()
  File "/root/ChatGLM-6B-main/cli_demo.py", line 43, in main
    for response, history in model.stream_chat(tokenizer, query, history=history):
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/root/.cache/huggingface/modules/transformers_modules/chatglm2-6b/modeling_chatglm.py", line 964, in stream_chat
    inputs = self.build_inputs(tokenizer, query, history=history)
  File "/root/.cache/huggingface/modules/transformers_modules/chatglm2-6b/modeling_chatglm.py", line 917, in build_inputs
    inputs = tokenizer([prompt], return_tensors="pt")
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2548, in __call__
    encodings = self._call_one(text=text, text_pair=text_pair, **all_kwargs)
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2634, in _call_one
    return self.batch_encode_plus(
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2825, in batch_encode_plus
    return self._batch_encode_plus(
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 733, in _batch_encode_plus
    first_ids = get_input_ids(ids)
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 700, in get_input_ids
    tokens = self.tokenize(text, **kwargs)
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 547, in tokenize
    tokenized_text.extend(self._tokenize(token))
  File "/root/.cache/huggingface/modules/transformers_modules/chatglm2-6b/tokenization_chatglm.py", line 104, in _tokenize
    return self.tokenizer.tokenize(text)
  File "/root/.cache/huggingface/modules/transformers_modules/chatglm2-6b/tokenization_chatglm.py", line 32, in tokenize
    return self.sp_model.EncodeAsPieces(s)
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/sentencepiece/__init__.py", line 545, in EncodeAsPieces
    return self.Encode(input=input, out_type=str, **kwargs)
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/sentencepiece/__init__.py", line 531, in Encode
    return self._EncodeAsPieces(input, enable_sampling, nbest_size,
  File "/root/miniconda3/envs/myconda/lib/python3.9/site-packages/sentencepiece/__init__.py", line 316, in _EncodeAsPieces
    return _sentencepiece.SentencePieceProcessor__EncodeAsPieces(self, text, enable_sampling, nbest_size, alpha, add_bos, add_eos, reverse, emit_unk_piece)
TypeError: not a string
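
The last frame shows SentencePiece rejecting its input because it is not a `str`. A minimal sketch of a defensive check that could be dropped into the loop in cli_demo.py before `model.stream_chat(...)` is called; the helper name `check_chat_inputs` is hypothetical (not part of the repo), and it assumes the ChatGLM2-6B convention of `history` being a list of `(query, response)` string pairs:

```python
# Hypothetical helper (not part of ChatGLM2-6B): confirm that the query and
# every history pair are plain strings before they reach the tokenizer,
# since SentencePiece raises "TypeError: not a string" for anything else.
def check_chat_inputs(query, history):
    assert isinstance(query, str), f"query is {type(query)}, expected str"
    for i, (old_query, old_response) in enumerate(history):
        assert isinstance(old_query, str), f"history[{i}] query is {type(old_query)}"
        assert isinstance(old_response, str), f"history[{i}] response is {type(old_response)}"
```

Calling `check_chat_inputs(query, history)` right before `model.stream_chat(tokenizer, query, history=history)` should at least pin down which round introduces the non-string value.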

Expected Behavior

No response

Steps To Reproduce

Ran cli_demo.py directly; the first two rounds of conversation work fine, but a TypeError appears in later rounds.
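
For reference, a minimal sketch of the conversation loop being run, assuming the stock cli_demo.py flow and the Hugging Face model id `THUDM/chatglm2-6b` (adjust the model path and the half-precision/CUDA calls to your setup):

```python
# Minimal reproduction sketch of the cli_demo.py loop: history grows with
# each round, and the error reportedly appears after a few rounds.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True).half().cuda()
model = model.eval()

history = []
while True:
    query = input("用户: ")
    if query.strip() == "stop":
        break
    # stream_chat yields (response, history); history is a list of (query, response) pairs
    for response, history in model.stream_chat(tokenizer, query, history=history):
        pass
    print("ChatGLM:", response)
```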

Environment

- OS:
- Python: 3.9.16
- Transformers: 4.29.1
- PyTorch: 2.0
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`):
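
A short snippet for filling in the blank OS and CUDA fields above, using only the standard library plus the packages already listed:

```python
# Prints the values asked for in the Environment section.
import platform
import torch
import transformers

print("OS:", platform.platform())
print("Python:", platform.python_version())
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("CUDA Support:", torch.cuda.is_available())
```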

Anything else?

No response

gekrnwmb1864 · Jun 26 '23 02:06

Our DB-GPT project already supports the chatglm2-6b model seamlessly; see the documentation at https://github.com/csunny/DB-GPT/blob/main/README.zh.md for details. You can chat with it through the web UI 😄

csunny · Jun 26 '23 02:06

Same problem here: not a string

zhangyunming · Jun 27 '23 06:06