
Deploying an online model: asking a question after startup raises an error

qijianzhange opened this issue 1 year ago

OS: Windows-10-10.0.19045-SP0
Python version: 3.11.8 (tags/v3.11.8:db85d51, Feb 6 2024, 22:03:32) [MSC v.1937 64 bit (AMD64)]
Project version: v0.2.10
langchain version: 0.0.354
fastchat version: 0.2.35

Current text splitter: ChineseRecursiveTextSplitter
Currently running LLM model: ['zhipu-api'] @ cpu {'api_key': '4346dc527aaf5985394c9563xxxxxxxxxxxxxxxxx', 'device': 'auto', 'host': '127.0.0.1', 'infer_turbo': False, 'online_api': True, 'port': 21001, 'provider': 'ChatGLMWorker', 'version': 'chatglm3-6b', 'worker_class': <class 'server.model_workers.zhipu.ChatGLMWorker'>}
Current Embeddings model: zhipu-api @ cpu

Server runtime info:
OpenAI API Server: http://127.0.0.1:20000/v1
Chatchat API Server: http://127.0.0.1:7861
Chatchat WEBUI Server: http://127.0.0.1:8501
==============================Langchain-Chatchat Configuration==============================

You can now view your Streamlit app in your browser.

URL: http://127.0.0.1:8501

```
2024-02-26 16:00:09,532 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56814 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 16:00:09,535 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-02-26 16:00:09,539 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56813 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 16:00:09,541 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-02-26 16:00:09,963 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56814 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 16:00:09,965 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56814 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-26 16:00:09,979 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
2024-02-26 16:00:09,980 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56813 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 16:00:09,982 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56813 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-26 16:00:10,077 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
2024-02-26 16:00:16,405 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56821 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 16:00:16,407 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-02-26 16:00:16,660 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56821 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 16:00:16,661 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56821 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-26 16:00:16,683 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56821 - "POST /chat/chat HTTP/1.1" 200 OK
2024-02-26 16:00:16,692 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/chat/chat "HTTP/1.1 200 OK"
E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_core\_api\deprecation.py:117: LangChainDeprecationWarning: The class `langchain_community.chat_models.openai.ChatOpenAI` was deprecated in langchain-community 0.0.10 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run `pip install -U langchain-openai` and import as `from langchain_openai import ChatOpenAI`.
  warn_deprecated(
2024-02-26 16:00:17 | INFO | stdout | INFO:     127.0.0.1:56824 - "POST /v1/chat/completions HTTP/1.1" 200 OK
2024-02-26 16:00:17,201 - _client.py[line:1758] - INFO: HTTP Request: POST http://127.0.0.1:20000/v1/chat/completions "HTTP/1.1 200 OK"
2024-02-26 16:00:17 | INFO | httpx | HTTP Request: POST http://127.0.0.1:21001/worker_generate_stream "HTTP/1.1 200 OK"
2024-02-26 16:00:18 | INFO | httpx | HTTP Request: POST https://open.bigmodel.cn/api/paas/v4/chat/completions "HTTP/1.1 500 "
2024-02-26 16:00:18,132 - utils.py[line:38] - ERROR: object of type 'NoneType' has no len()
Traceback (most recent call last):
  File "E:\pycharm_project\Langchain-Chatchat-master\Langchain-Chatchat-master\server\utils.py", line 36, in wrap_done
    await fn
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain\chains\base.py", line 385, in acall
    raise e
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain\chains\base.py", line 379, in acall
    await self._acall(inputs, run_manager=run_manager)
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain\chains\llm.py", line 275, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain\chains\llm.py", line 142, in agenerate
    return await self.llm.agenerate_prompt(
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 554, in agenerate_prompt
    return await self.agenerate(
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 514, in agenerate
    raise exceptions[0]
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 617, in _agenerate_with_cache
    return await self._agenerate(
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_community\chat_models\openai.py", line 522, in _agenerate
    return await agenerate_from_stream(stream_iter)
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 87, in agenerate_from_stream
    async for chunk in stream:
  File "E:\pycharm_project\Langchain-Chatchat-master\venv\Lib\site-packages\langchain_community\chat_models\openai.py", line 493, in _astream
    if len(chunk["choices"]) == 0:
TypeError: object of type 'NoneType' has no len()
2024-02-26 16:00:18,135 - utils.py[line:40] - ERROR: TypeError: Caught exception: object of type 'NoneType' has no len()
```
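Note the sequence in the log: the zhipu endpoint at https://open.bigmodel.cn returns HTTP 500, after which the streamed chunk carries `"choices": None`, and `len(chunk["choices"])` at openai.py line 493 raises the TypeError. The failing check can be reproduced in isolation (the chunk literal below is illustrative, not the actual worker payload):

```python
# Illustrative chunk mimicking what the worker may yield after an upstream 500
chunk = {"choices": None}

try:
    # Same check as langchain_community/chat_models/openai.py line 493
    if len(chunk["choices"]) == 0:
        pass
except TypeError as exc:
    print(exc)  # object of type 'NoneType' has no len()
```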

qijianzhange · Feb 26 '24

🤖

Hi @qijianzhange! Nice to see you again. 🌟

Your error occurs when trying to take the length of a None object. In Python, None has no length, so calling len() on it raises exactly this error.

From the log you provided, the error occurs at line 493 of langchain_community\chat_models\openai.py, which takes the length of chunk["choices"]. If chunk["choices"] is None, you get the error you are seeing.

I found some similar resolved issues in the Langchain-Chatchat issue tracker that may help:

  1. Error when asking a question in chat: ERROR: object of type 'NoneType' has no len()
  2. ERROR: TypeError: Caught exception: object of type 'NoneType' has no len()
  3. minimax-api agent-chat error: ERROR: TypeError: Caught exception: object of type 'NoneType' has no len()

The fix in those issues is to check whether chunk["choices"] is None before taking its length. For example:

```python
if chunk["choices"] is not None:
    if len(chunk["choices"]) == 0:
        # ...
```

Or:

```python
# Use `or []` rather than .get("choices", []): the default only applies
# when the key is missing, not when it is present with the value None.
chunk["choices"] = chunk.get("choices") or []
if len(chunk["choices"]) == 0:
    # ...
```
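Either guard can be wrapped in a small helper (a sketch; `choices_or_empty` is a hypothetical name, not a langchain API) that also covers the case where the chunk itself is None:

```python
def choices_or_empty(chunk) -> list:
    """Return chunk["choices"] as a list, treating a None chunk,
    a missing key, or a present-but-None value all as empty."""
    if not chunk:
        return []
    return chunk.get("choices") or []
```

With a guard like this, an upstream HTTP 500 degrades to an empty choices list instead of a TypeError deep inside the streaming loop.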

These fixes are based on similar issues resolved in the Langchain-Chatchat repository. If they don't solve it, I'd suggest investigating the online model API's response (zhipu-api in your case) and how None values are handled in the code.

I hope this helps. If you have any other questions or need further clarification, feel free to let me know.


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] · Feb 26 '24

This means the call to the online API simply didn't succeed. Check your API key, and check that the dependencies for the corresponding online model API are installed.
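Both points can be checked locally before re-running. A sketch (the zhipu-api worker depends on the `zhipuai` package; the key heuristic is just an assumption based on the masked `xxxx...` key in this report):

```python
import importlib.util


def zhipu_sdk_installed() -> bool:
    # The online zhipu-api worker needs the zhipuai package (pip install zhipuai)
    return importlib.util.find_spec("zhipuai") is not None


def key_looks_like_placeholder(api_key: str) -> bool:
    # The key in this report ends in a run of "x" characters - a masked or
    # placeholder value, which would make open.bigmodel.cn return an error
    return not api_key or "xxxx" in api_key
```

If either check fails, fix it in `model_config.py` (or the environment) before restarting the stack; otherwise the worker will keep hitting the HTTP 500 seen in the log above.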

zRzRzRzRzRzRzR · Mar 02 '24