
[BUG] Conversation Error when using the chatyuan model: has no attribute 'stream_chat'

Open cocomany opened this issue 1 year ago • 1 comment

Problem Description / Describe the problem in a clear and concise manner: when using the chatyuan model, the conversation returns Error, with AttributeError: 'T5ForConditionalGeneration' object has no attribute 'stream_chat'.

Steps to Reproduce

  1. Select chatyuan in the model configuration
  2. Wait for the log to print "Model reloaded successfully; you can start chatting, or select a mode on the right and then start"
  3. Choose the conversation mode
  4. Enter a question and click Submit

Expected Result / Describe the expected result: the conversation continues and accepts further input.

Actual Result / Describe the actual result: Error is displayed under the chat box.

Environment Information

  • langchain-ChatGLM version/commit number (e.g. v1.0.0 or commit 123456): 3605057
  • Is Docker deployment used (yes/no): yes
  • Model used (ChatGLM-6B / ClueAI/ChatYuan-large-v2, etc.): chatyuan
  • Embedding model used (GanymedeNil/text2vec-large-chinese, etc.): GanymedeNil/text2vec-large-chinese
  • Operating system and version: Ubuntu 20
  • Python version: python3.8
  • Other relevant environment information:

Additional Information / Add any other information related to the issue. Error log:

Model reloaded successfully; you can start chatting, or select a mode on the right and then start
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/gradio/routes.py", line 412, in run_predict
    output = await app.get_blocks().process_api(
  File "/usr/local/lib/python3.8/dist-packages/gradio/blocks.py", line 1299, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.8/dist-packages/gradio/blocks.py", line 1035, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.8/dist-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.8/dist-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.8/dist-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.8/dist-packages/gradio/utils.py", line 491, in async_iteration
    return next(iterator)
  File "webui.py", line 52, in get_answer
    for resp, history in local_doc_qa.llm._call(query, history,
  File "/chatGLM/models/chatglm_llm.py", line 65, in _call
    for inum, (stream_resp, _) in enumerate(self.model.stream_chat(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'T5ForConditionalGeneration' object has no attribute 'stream_chat'

cocomany avatar May 08 '23 12:05 cocomany

T5 does not support stream_chat. Just set streaming = False in the config; that should fix it.
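The suggestion above can also be made robust in code. `stream_chat` is a method specific to ChatGLM-style models; T5-based models such as ChatYuan (`T5ForConditionalGeneration`) never define it, so the calling code can branch on `hasattr` instead of assuming it exists. The following is a minimal sketch, not the project's actual `_call` implementation; the `answer` function and the model/tokenizer interfaces it assumes are stand-ins for illustration:

```python
# Hypothetical sketch: prefer the ChatGLM-style streaming interface when the
# model provides it, otherwise fall back to a single non-streaming reply
# (the same effect as setting streaming = False in the config).

def answer(model, tokenizer, query, history, streaming=True):
    """Yield (response, history) pairs from whichever chat interface exists."""
    if streaming and hasattr(model, "stream_chat"):
        # ChatGLM-style streaming: yields partial responses as they are generated
        for resp, hist in model.stream_chat(tokenizer, query, history=history):
            yield resp, hist
    else:
        # Non-streaming fallback for models without stream_chat
        resp, hist = model.chat(tokenizer, query, history=history)
        yield resp, hist
```

With this guard in place, selecting a T5-based model no longer raises `AttributeError`; it simply produces the answer in one piece instead of token by token.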

xx-zhang avatar May 09 '23 10:05 xx-zhang