
Running python startup.py -a with qwen-api raises an error

Open Adsryen opened this issue 1 year ago • 4 comments

==============================Langchain-Chatchat Configuration==============================
操作系统:Linux-6.5.0-1014-oem-x86_64-with-glibc2.35.
python版本:3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0]
项目版本:v0.2.10
langchain版本:0.0.354. fastchat版本:0.2.35

当前使用的分词器:ChineseRecursiveTextSplitter
当前启动的LLM模型:['qwen-api'] @ cuda
{'api_key': 'sk-c07ddab24904', 'device': 'auto', 'embed_model': 'text-embedding-v1', 'host': '0.0.0.0', 'infer_turbo': False, 'online_api': True, 'port': 21006, 'provider': 'QwenWorker', 'version': 'qwen-max', 'worker_class': <class 'server.model_workers.qwen.QwenWorker'>}
当前Embbedings模型: bge-large-zh-v1.5 @ cuda
==============================Langchain-Chatchat Configuration==============================

2024-02-26 07:23:34,921 - startup.py[line:655] - INFO: 正在启动服务:
2024-02-26 07:23:34,921 - startup.py[line:656] - INFO: 如需查看 llm_api 日志,请前往 /home/daren/Langchain-Chatchat/logs
/home/daren/.local/lib/python3.10/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: 模型启动功能将于 Langchain-Chatchat 0.3.x重写,支持更多模式和加速启动,0.2.x中相关功能将废弃
  warn_deprecated(
2024-02-26 07:23:37 | ERROR | stderr | INFO: Started server process [144631]
2024-02-26 07:23:37 | ERROR | stderr | INFO: Waiting for application startup.
2024-02-26 07:23:37 | ERROR | stderr | INFO: Application startup complete.
2024-02-26 07:23:37 | INFO | model_worker | Register to controller
2024-02-26 07:23:37 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:20000 (Press CTRL+C to quit)
Process api_worker - qwen-api:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/daren/Langchain-Chatchat/startup.py", line 389, in run_model_worker
    app = create_model_worker_app(log_level=log_level, **kwargs)
  File "/home/daren/Langchain-Chatchat/startup.py", line 100, in create_model_worker_app
    worker = worker_class(model_names=args.model_names,
  File "/home/daren/Langchain-Chatchat/server/model_workers/qwen.py", line 29, in __init__
    super().__init__(**kwargs)
  File "/home/daren/Langchain-Chatchat/server/model_workers/base.py", line 124, in __init__
    self.init_heart_beat()
  File "/home/daren/.local/lib/python3.10/site-packages/fastchat/serve/base_model_worker.py", line 79, in init_heart_beat
    self.register_to_controller()
  File "/home/daren/.local/lib/python3.10/site-packages/fastchat/serve/base_model_worker.py", line 97, in register_to_controller
    assert r.status_code == 200
AssertionError

Adsryen avatar Feb 26 '24 06:02 Adsryen

🤖

Hey @Adsryen! Great to see you again. I hope all is well!

Based on the error log you provided, the failure occurs in the register_to_controller function, which registers the model worker with the controller. During registration it sends a POST request to the controller's "/register" endpoint and expects a 200 status code in return. That assertion failed, meaning the status code was not 200, most likely because the controller did not handle the request correctly.
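As a first check, you could confirm that the FastChat controller is reachable at all before the qwen-api worker tries to register. The snippet below is only a diagnostic sketch, not project code; the controller address (127.0.0.1:20001) is assumed from the default FSCHAT_CONTROLLER settings and the request logs later in this thread, so adjust it if your configuration differs.

```python
# Hypothetical diagnostic, not part of Langchain-Chatchat: ask the FastChat
# controller which models it currently knows about. If this request fails or
# returns a non-200 status, the worker's register_to_controller() assertion
# (assert r.status_code == 200) fails exactly as in the traceback above.
import requests

CONTROLLER_ADDR = "http://127.0.0.1:20001"  # assumed default controller address; adjust to your config

resp = requests.post(f"{CONTROLLER_ADDR}/list_models", timeout=5)
print(resp.status_code, resp.text)
```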

In the Langchain-Chatchat issue tracker I found two similar, already-resolved issues:

  1. [BUG] 简洁阐述问题 / Concise description of the issue — the suggested fix there was to check for dependency errors; it was not a framework problem.

  2. [BUG] 报错AssertionError: Only one of "bf16", "fp16", "fp32" can be true — the suggested fix there was to edit the Qwen model's config.json, setting fp16 to true and both bf16 and fp32 to false.

Based on this, I suggest first checking that your dependencies are installed correctly. If the problem persists, you can try editing the Qwen model's config.json as described above and see whether that resolves it.
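If you do go down the config.json route, here is a small sanity-check sketch. It is only relevant if you run a local Qwen checkpoint (rather than the qwen-api online worker), and the model path is hypothetical:

```python
# Sketch of the check suggested above (not project code). A local Qwen
# checkpoint's config.json carries fp16/bf16/fp32 flags, and exactly one of
# them should be true according to the referenced issue.
import json

with open("/path/to/local/Qwen/config.json") as f:  # hypothetical path
    cfg = json.load(f)

flags = {k: bool(cfg.get(k, False)) for k in ("fp16", "bf16", "fp32")}
print(flags)
if sum(flags.values()) != 1:
    print("Set fp16 to true and bf16/fp32 to false, then restart the service.")
```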

If none of the above resolves your issue, more information will be needed to pin down the cause. Could you share more details about your environment, such as your operating system version, your Python version, and whether you are running inside a virtual environment? If you have modified any code, please include that as well. This information will help us understand and resolve the problem.


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] avatar Feb 26 '24 06:02 dosubot[bot]

However, an API communication error then occurred during knowledge-base Q&A. I am using the online (API) version of the Qwen model.

2024-02-26 08:00:19,327 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:46290 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 08:00:19,328 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-02-26 08:00:19,461 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:46290 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 08:00:19,462 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:46290 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-26 08:00:19,466 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
2024-02-26 08:00:23,128 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:52320 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 08:00:23,129 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-02-26 08:00:23,160 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:52320 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 08:00:23,161 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:52320 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-26 08:00:23,169 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:52320 - "GET /knowledge_base/list_knowledge_bases HTTP/1.1" 200 OK
2024-02-26 08:00:23,173 - _client.py[line:1027] - INFO: HTTP Request: GET http://127.0.0.1:7861/knowledge_base/list_knowledge_bases "HTTP/1.1 200 OK"
2024-02-26 08:00:31,910 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:50922 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 08:00:31,911 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-02-26 08:00:31,942 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:50922 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-26 08:00:31,943 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:50922 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-26 08:00:31,948 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:50922 - "GET /knowledge_base/list_knowledge_bases HTTP/1.1" 200 OK
2024-02-26 08:00:31,951 - _client.py[line:1027] - INFO: HTTP Request: GET http://127.0.0.1:7861/knowledge_base/list_knowledge_bases "HTTP/1.1 200 OK"
INFO: 127.0.0.1:50922 - "POST /chat/knowledge_base_chat HTTP/1.1" 200 OK
2024-02-26 08:00:32,066 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/chat/knowledge_base_chat "HTTP/1.1 200 OK"
Batches: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 32.16it/s]
-----------------model path------------------
BAAI/bge-reranker-large
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/home/daren/.local/lib/python3.10/site-packages/sse_starlette/sse.py", line 269, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/daren/.local/lib/python3.10/site-packages/sse_starlette/sse.py", line 258, in wrap
    await func()
  File "/home/daren/.local/lib/python3.10/site-packages/sse_starlette/sse.py", line 215, in listen_for_disconnect
    message = await receive()
  File "/home/daren/.local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 580, in receive
    await self.message_event.wait()
  File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7f1a76533a90

During handling of the above exception, another exception occurred:

+ Exception Group Traceback (most recent call last):
|   File "/home/daren/.local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 419, in run_asgi
|     result = await app(  # type: ignore[func-returns-value]
|   File "/home/daren/.local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
|     return await self.app(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
|     await super().__call__(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/applications.py", line 119, in __call__
|     await self.middleware_stack(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
|     raise exc
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
|     await self.app(scope, receive, _send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
|     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
|     raise exc
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
|     await app(scope, receive, sender)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/routing.py", line 762, in __call__
|     await self.middleware_stack(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/routing.py", line 782, in app
|     await route.handle(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
|     await self.app(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
|     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
|     raise exc
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
|     await app(scope, receive, sender)
|   File "/home/daren/.local/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
|     await response(scope, receive, send)
|   File "/home/daren/.local/lib/python3.10/site-packages/sse_starlette/sse.py", line 255, in __call__
|     async with anyio.create_task_group() as task_group:
|   File "/home/daren/.local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
|     raise BaseExceptionGroup(
| exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
  | Traceback (most recent call last):
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connection.py", line 198, in _new_conn
  |     sock = connection.create_connection(
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
  |     raise err
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/util/connection.py", line 73, in create_connection
  |     sock.connect(sa)
  | OSError: [Errno 101] Network is unreachable
  |
  | The above exception was the direct cause of the following exception:
  |
  | Traceback (most recent call last):
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 793, in urlopen
  |     response = self._make_request(
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 491, in _make_request
  |     raise new_e
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 467, in _make_request
  |     self._validate_conn(conn)
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1099, in _validate_conn
  |     conn.connect()
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connection.py", line 616, in connect
  |     self.sock = sock = self._new_conn()
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connection.py", line 213, in _new_conn
  |     raise NewConnectionError(
  | urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f19f5ac1570>: Failed to establish a new connection: [Errno 101] Network is unreachable
  |
  | The above exception was the direct cause of the following exception:
  |
  | Traceback (most recent call last):
  |   File "/home/daren/.local/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
  |     resp = conn.urlopen(
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 847, in urlopen
  |     retries = retries.increment(
  |   File "/home/daren/.local/lib/python3.10/site-packages/urllib3/util/retry.py", line 515, in increment
  |     raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
  | urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /BAAI/bge-reranker-large/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f19f5ac1570>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
  |
  | During handling of the above exception, another exception occurred:
  |
  | Traceback (most recent call last):
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
  |     metadata = get_hf_file_metadata(
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
  |     return fn(*args, **kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
  |     r = _request_wrapper(
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
  |     response = _request_wrapper(
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 408, in _request_wrapper
  |     response = get_session().request(method=method, url=url, **params)
  |   File "/home/daren/.local/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
  |     resp = self.send(prep, **send_kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
  |     r = adapter.send(request, **kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 67, in send
  |     return super().send(request, *args, **kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/requests/adapters.py", line 519, in send
  |     raise ConnectionError(e, request=request)
  | requests.exceptions.ConnectionError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /BAAI/bge-reranker-large/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f19f5ac1570>: Failed to establish a new connection: [Errno 101] Network is unreachable'))"), '(Request ID: ca6862f9-93f4-4700-835b-469339037086)')
  |
  | The above exception was the direct cause of the following exception:
  |
  | Traceback (most recent call last):
  |   File "/home/daren/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 385, in cached_file
  |     resolved_file = hf_hub_download(
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
  |     return fn(*args, **kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1371, in hf_hub_download
  |     raise LocalEntryNotFoundError(
  | huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
  |
  | The above exception was the direct cause of the following exception:
  |
  | Traceback (most recent call last):
  |   File "/home/daren/.local/lib/python3.10/site-packages/sse_starlette/sse.py", line 258, in wrap
  |     await func()
  |   File "/home/daren/.local/lib/python3.10/site-packages/sse_starlette/sse.py", line 245, in stream_response
  |     async for data in self.body_iterator:
  |   File "/home/daren/Langchain-Chatchat/server/chat/knowledge_base_chat.py", line 92, in knowledge_base_chat_iterator
  |     reranker_model = LangchainReranker(top_n=top_k,
  |   File "/home/daren/Langchain-Chatchat/server/reranker/reranker.py", line 49, in __init__
  |     self._model = CrossEncoder(model_name=model_name_or_path, max_length=1024, device=device)
  |   File "/home/daren/.local/lib/python3.10/site-packages/sentence_transformers/cross_encoder/CrossEncoder.py", line 39, in __init__
  |     self.config = AutoConfig.from_pretrained(model_name)
  |   File "/home/daren/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1100, in from_pretrained
  |     config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 634, in get_config_dict
  |     config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  |   File "/home/daren/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
  |     resolved_config_file = cached_file(
  |   File "/home/daren/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 425, in cached_file
  |     raise EnvironmentError(
  | OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like BAAI/bge-reranker-large is not the path to a directory containing a file named config.json.
  | Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
  +------------------------------------
2024-02-26 08:00:42,148 - utils.py[line:197] - ERROR: RemoteProtocolError: API通信遇到错误:peer closed connection without sending complete message body (incomplete chunked read)
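The failing step in this traceback is the reranker construction in knowledge_base_chat: LangchainReranker passes the model name straight to sentence-transformers' CrossEncoder, which then asks huggingface.co for BAAI/bge-reranker-large's config.json. A minimal sketch of that step (assuming sentence-transformers is installed) reproduces the same error on a machine without outbound internet access:

```python
# Sketch reproducing the failing step in isolation (not project code): the
# reranker is built from the model *name*, so sentence-transformers asks
# huggingface.co for BAAI/bge-reranker-large's config.json. Without internet
# access (and with no cached copy) this raises the same OSError as above.
from sentence_transformers import CrossEncoder

reranker = CrossEncoder("BAAI/bge-reranker-large", max_length=1024)
```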

Adsryen avatar Feb 26 '24 07:02 Adsryen

🤖

@imClumsyPanda, I've hit a roadblock with this one, your insights would be highly valuable.

dosubot[bot] avatar Feb 26 '24 07:02 dosubot[bot]

It looks like the cause is that https://huggingface.co cannot be reached. Is it possible to download bge-reranker-large and run it locally?

Adsryen avatar Feb 26 '24 07:02 Adsryen

Yes, of course; using a local copy is what we recommend anyway.
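One way to do this is to fetch the model once from a machine that can reach huggingface.co and then point the reranker at the local directory. This is only a sketch; the target path is made up, and the exact setting name for the reranker path can differ between 0.2.x versions, so check your configs/model_config.py.

```python
# Sketch, assuming huggingface_hub is installed and this machine can reach
# huggingface.co: download BAAI/bge-reranker-large once into a local directory
# so the reranker never needs to contact the Hub at query time.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="BAAI/bge-reranker-large",
    local_dir="/home/daren/models/bge-reranker-large",  # hypothetical target path
    local_dir_use_symlinks=False,
)
print("model saved to:", local_dir)
```

After downloading, set the reranker path in configs/model_config.py to that directory instead of the repo id BAAI/bge-reranker-large, then restart the service.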

zRzRzRzRzRzRzR avatar Feb 28 '24 04:02 zRzRzRzRzRzRzR