Langchain-Chatchat
🔥 0.3.0 Improvements and Planned New Features
🐞 Improvements
✅ Released
- Fixed missing-dependency errors during `chatchat-kb -r` caused by an incomplete dependency package
🕑 Completed, pending release
- Besides the command-line method, configuration items can once again be modified via local files
- Streamlined model-inference framework integration to require less configuration; started models of all types in the xinference framework, except audio models, are now auto-detected
- Support aborting `chatchat-kb -r` with Ctrl+C on Windows
- Fixed `max_token` not taking effect
🏗️ In development
- Fix errors that may occur when integrating ollama
- Fix knowledge-base creation failing because model names started by ollama contain a colon
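A minimal sketch of one possible fix for the colon issue (the helper name is hypothetical, not from the project): ollama tags models like `qwen2:7b`, and the colon is invalid in many vector-store collection and folder names, so it can be sanitized before creating the knowledge base.

```python
# Hypothetical helper: replace characters that are invalid in
# vector-store collection/folder names (assumption: only the colon
# from ollama-style "model:tag" names causes the failure).
def safe_kb_name(model_name: str) -> str:
    return model_name.replace(":", "_")

print(safe_kb_name("qwen2:7b"))  # qwen2_7b
```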
💡 New features
✅ Released
🕑 Completed, pending release
🏗️ In development
- Add a RAG test page for testing and comparing search results and Q&A quality
- Frontend based on lobe-chat
- Support multimodal chat
- Support multimodal RAG
The "max_token not taking effect" issue has been fixed.
Will multimodal chat and RAG ship in 0.3.0?
Yes. The current plan alternates releases: one for bug fixes, then one for new features.
Will 0.3.0 support concurrency?
Running langchain 0.3 on the ollama framework, knowledge-base initialization with `chatchat-kb -r` still errors out:
```
(langchain) D:\other\Langchain-Chatchat-master>chatchat-kb -r
recreating all vector stores
C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain\_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
  warnings.warn(
2024-07-03 10:22:16,321 - utils.py[line:260] - ERROR: failed to create Embeddings for model: bge-large-zh-v1.5.
Traceback (most recent call last):
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\utils.py", line 258, in get_Embeddings
    return LocalAIEmbeddings(**params)
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\pydantic\v1\main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for LocalAIEmbeddings
__root__
  Did not find openai_api_key, please add an environment variable OPENAI_API_KEY which contains it, or pass openai_api_key as a named parameter. (type=value_error)
2024-07-03 10:22:16,322 - faiss_cache.py[line:140] - ERROR: 'NoneType' object has no attribute 'embed_documents'
Traceback (most recent call last):
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
    vector_store = self.new_vector_store(
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 63, in new_vector_store
    vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_core\vectorstores.py", line 550, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_community\vectorstores\faiss.py", line 930, in from_texts
    embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'
2024-07-03 10:22:16,323 - init_database.py[line:150] - ERROR: 向量库 samples 加载失败。
Traceback (most recent call last):
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
    vector_store = self.new_vector_store(
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 63, in new_vector_store
    vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_core\vectorstores.py", line 550, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_community\vectorstores\faiss.py", line 930, in from_texts
    embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\init_database.py", line 129, in main
    folder2db(
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\migrate.py", line 152, in folder2db
    kb.create_kb()
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\base.py", line 102, in create_kb
    self.do_create_kb()
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 57, in do_create_kb
    self.load_vector_store()
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 32, in load_vector_store
    return kb_faiss_pool.load_vector_store(
  File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 141, in load_vector_store
    raise RuntimeError(f"向量库 {kb_name} 加载失败。")
RuntimeError: 向量库 samples 加载失败。
2024-07-03 10:22:16,325 - init_database.py[line:151] - WARNING: Caught KeyboardInterrupt! Setting stop event...
```
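One possible workaround suggested by the ValidationError in the log: `LocalAIEmbeddings` refuses to construct without an OpenAI-style API key even when the backing server (e.g. ollama) ignores authentication. Assumption, not confirmed by the project: any placeholder value passes the pydantic check, so setting the variable before running `chatchat-kb -r` may get past this error.

```python
# Hedged workaround sketch: give pydantic a placeholder key so
# LocalAIEmbeddings(**params) can construct. A local server such as
# ollama typically ignores the key's value (assumption).
import os

os.environ.setdefault("OPENAI_API_KEY", "EMPTY")  # placeholder, not a real key
```

Equivalently, export `OPENAI_API_KEY` in the shell before invoking the command.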
Suggestion: improve the frontend following mainstream designs, e.g. "conversations" should occupy the entire left sidebar, with knowledge-base management, model configuration, etc. consolidated into a settings panel at the bottom-left or top-right.
Add a text2promql feature
Will 0.3.0 support high concurrency?
Will Microsoft's GraphRAG be supported?
Will 0.3.0 support concurrency? 0.2.0's concurrency support was poor.
Can it run on CPU only, without a GPU?
@HaKLMTT Yes, use ollama
Great, thanks for the reply. Does it support domestic ARM platforms?
Can 0.3 support online models? It looks like online LLMs can be configured via oneapi, but online embedding models cannot be loaded, and 0.3 also removed 0.2's local embedding model feature.
Will Xinchuang (domestic IT innovation) environments be supported?
When will ollama be supported?
@lizhenkai5008 It is already supported
Does it support self-RAG or agentic RAG? For example, multi-hop questions like "What did Roosevelt do in the year the War of Resistance was won?"
Does it support speech recognition?
Also, the WeChat QR code has expired; please update it. @imClumsyPanda
@ClementeGao The QR code has been updated
Getting agents right would solve many problems and is the clearest path to better intelligence. But agent Q&A currently has these issues:
- Agent responses are slow: around 30 seconds from question to answer, which ordinary users will not accept;
- Each model needs a custom agent pipeline: for example, after the chatglm3 agent in this project was tuned and deployed to production, upgrading to glm4 meant writing a new one.
Thanks to the chatchat team for such a great open-source project. Any suggestions from the maintainers on the agent issues above would be appreciated.
The project has had no major updates in recent months and I would like to learn about its latest direction. The WeChat QR code has expired, please update it. Thanks. @imClumsyPanda
Suggestion: during RAG chat, add an AI question-completion step before retrieval, so that context-dependent follow-up questions in a multi-turn conversation do not retrieve inaccurately.
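The suggested question-completion step can be sketched as follows (all names are hypothetical, not from the project): rewrite the follow-up into a standalone question using chat history, then retrieve with the rewritten question.

```python
# Sketch of a pre-retrieval "question completion" prompt builder.
# The returned prompt would be sent to the LLM; the LLM's rewritten
# standalone question then feeds the knowledge-base search.
def build_rewrite_prompt(history: list[tuple[str, str]], question: str) -> str:
    lines = [f"{role}: {text}" for role, text in history]
    lines.append("Rewrite the final follow-up as a standalone question.")
    lines.append(f"Follow-up: {question}")
    return "\n".join(lines)

prompt = build_rewrite_prompt(
    [("user", "Who wrote LangChain?"), ("assistant", "Harrison Chase.")],
    "When was it released?",
)
```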
@ClementeGao The QR code has been updated
The QR code has expired again
@ClementeGao The QR code has been updated
The QR code has expired again
I just tested it; it still works
Hey, the best improvement would be to replace the AgentExecutor with the langgraph framework, which should fix the problems you are having now.
Refer to: https://python.langchain.com/docs/concepts/agents/#legacy-agent-concept-agentexecutor
In 0.3.0 with the Milvus vector store, when I delete a knowledge base, that knowledge base's folder under ./chatchat_data/data/knowledge_base is not removed. How can this be fixed?
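Until this is fixed, a hedged manual-cleanup sketch (the base path comes from the question above; the helper name and KB name are hypothetical):

```python
# Remove the leftover knowledge-base folder after deleting a KB.
# Returns True if a folder was removed, False if none existed.
import shutil
from pathlib import Path


def remove_kb_folder(base: str, kb_name: str) -> bool:
    kb_dir = Path(base) / kb_name
    if kb_dir.is_dir():
        shutil.rmtree(kb_dir)
        return True
    return False


# Example (hypothetical KB name "my_kb"):
# remove_kb_folder("./chatchat_data/data/knowledge_base", "my_kb")
```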