DB-GPT
[Bug] Source-code startup hangs at begin run _add_app_startup_event
Search before asking
- [X] I had searched in the issues and found no similar issues.
Operating system information
Linux
Python version information
3.10
DB-GPT version
main
Related scenes
- [ ] Chat Data
- [ ] Chat Excel
- [ ] Chat DB
- [ ] Chat Knowledge
- [ ] Model Management
- [ ] Dashboard
- [ ] Plugins
Installation Information
- [ ] AutoDL Image
- [ ] Other
Device information
CPU Count: 4, CPU Memory: 32GB
Models information
LLM: proxyllm, Embedding Model: m3e-base
What happened
I followed the steps in the documentation; after starting, the process hangs at begin run _add_app_startup_event and does not progress. I have already updated to the latest code. I tested on both macOS and Ubuntu, and both show the same problem.
What you expected to happen
INFO [pilot.model.cluster.worker.default_worker] Begin load model, model params:
=========================== ProxyModelParameters ===========================
model_name: proxyllm
model_path: chatgpt_proxyllm
proxy_server_url: https://api.openai.com/v1/chat/completions
proxy_api_key: s******7
proxy_api_base: None
proxy_api_app_id: None
proxy_api_secret: None
proxy_api_type: None
proxy_api_version: None
http_proxy: None
proxyllm_backend: None
model_type: proxy
device: cpu
prompt_template: None
max_context_size: 4096
======================================================================
INFO [pilot.model.loader] Load proxyllm
INFO: 127.0.0.1:54596 - "POST /api/controller/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:54598 - "POST /api/controller/models HTTP/1.1" 200 OK
begin run _add_app_startup_event
How to reproduce
Deploy following the documentation and choose OPENAI + M3E-BASE; that should reproduce it.
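Before starting the server, it can help to confirm that the proxy and embedding settings were actually picked up. Below is a minimal sketch; the environment variable names (LLM_MODEL, PROXY_API_KEY, PROXY_SERVER_URL, EMBEDDING_MODEL) are assumptions inferred from the parameter names in the startup log and should be checked against your own .env.

```python
# Hedged sanity check: print the env vars the proxyllm setup appears to read.
# The variable names below are assumptions; verify them against your .env.
import os

for key in ("LLM_MODEL", "PROXY_API_KEY", "PROXY_SERVER_URL", "EMBEDDING_MODEL"):
    value = os.getenv(key)
    if key == "PROXY_API_KEY" and value:
        value = value[:1] + "******" + value[-1:]  # avoid printing the full key
    print(f"{key} = {value if value is not None else '<not set>'}")
```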
Additional context
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
Has anyone solved this?
Has anyone solved this?
Do you have the same problem too? I switched back to an older version and it works fine now.
So startup still hasn't finished at this step, right? If you open the web frontend at this point, do you get an Uncaught SyntaxError: Unexpected token '<' error?
Same here:
INFO: 127.0.0.1:53394 - "POST /api/controller/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:53420 - "POST /api/controller/models HTTP/1.1" 200 OK
begin run _add_app_startup_event
I found my problem. In my case, being stuck at this line is actually not an issue. Because I connect through a jump host, the port was not exposed to my local machine, so I simply couldn't reach the service.
When I connected to the server directly with VS Code, I could access it normally.
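If you suspect the same jump-host situation, a quick way to tell the two cases apart is to test TCP reachability both on the server itself and from your local machine. This is only a sketch; the port (5000) is assumed from the Uvicorn log in this thread.

```python
# Minimal TCP reachability check; port 5000 is assumed from the Uvicorn log.
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run once on the server (expected True) and once on your local machine.
# True on the server but False locally means the webserver started normally
# and only the port is not reachable through the jump host.
print(port_open("127.0.0.1", 5000))
```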
This issue has been marked as stale, because it has been over 30 days without any activity.
Has anyone solved this problem? If so, how? Please share some guidance.
Try visiting http://127.0.0.1:5000/ in a browser.
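On a headless server without a browser, the same check can be done from Python with only the standard library. This is a sketch under the assumption that the webserver listens on port 5000, as in the logs above.

```python
# Fetch the frontend page using only the standard library; the port (5000)
# and path ("/") are assumptions based on the Uvicorn log in this thread.
from urllib.request import urlopen

with urlopen("http://127.0.0.1:5000/", timeout=5) as resp:
    body = resp.read(200)
    print(resp.status, body[:80])
# An HTTP 200 with HTML content suggests the server finished starting even
# though the console still shows "begin run _add_app_startup_event".
```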
This issue has been marked as stale, because it has been over 30 days without any activity.
This issue has been closed, because it has been marked as stale and there has been no activity for over 7 days.
I have encountered the same problem, how can I solve it?
INFO: Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)
INFO: 127.0.0.1:55758 - "POST /api/controller/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:55780 - "POST /api/controller/models HTTP/1.1" 200 OK
begin run _add_app_startup_event
Visiting http://127.0.0.1:5000/ works.