Langchain-Chatchat
Error after clicking "Knowledge Base Management" (知识库管理): TypeError: string indices must be integers, not 'str'
==============================Langchain-Chatchat Configuration==============================
Operating system: Windows-10-10.0.22631-SP0
Python version: 3.11.7 | packaged by conda-forge | (main, Dec 23 2023, 14:27:59) [MSC v.1937 64 bit (AMD64)]
Project version: v0.2.10
langchain version: 0.1.12  fastchat version: 0.2.35
Text splitter in use: ChineseRecursiveTextSplitter
Currently running LLM models: ['chatglm2-6b'] @ cuda {'device': 'cuda', 'host': '127.0.0.1', 'infer_turbo': False, 'model_path': 'E:\models\chatglm2-6b', 'model_path_exists': True, 'port': 20002}
Current Embeddings model: bge-large-zh-v1.5 @ cuda
Server runtime info:
OpenAI API Server: http://127.0.0.1:20000/v1
Chatchat API Server: http://127.0.0.1:7861
Chatchat WEBUI Server: http://127.0.0.1:8501
==============================Langchain-Chatchat Configuration==============================
You can now view your Streamlit app in your browser.
URL: http://127.0.0.1:8501
D:\Anaconda\envs\langchain\Lib\site-packages\langchain\chat_models\__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:
from langchain_community.chat_models import ChatOpenAI
To install langchain-community run pip install -U langchain-community.
warnings.warn(
D:\Anaconda\envs\langchain\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:
from langchain_community.llms import OpenAI
To install langchain-community run pip install -U langchain-community.
warnings.warn(
D:\Anaconda\envs\langchain\Lib\site-packages\langchain\document_loaders\__init__.py:36: LangChainDeprecationWarning: Importing document loaders from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:
from langchain_community.document_loaders import JSONLoader
To install langchain-community run pip install -U langchain-community.
warnings.warn(
2024-03-22 11:59:23,010 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57555 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-03-22 11:59:23,012 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-03-22 11:59:23,246 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57555 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-03-22 11:59:23,247 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57555 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-03-22 11:59:23,261 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
2024-03-22 11:59:36,464 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57592 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-03-22 11:59:36,466 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-03-22 11:59:36,591 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57592 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-03-22 11:59:36,593 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57592 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-03-22 11:59:36,606 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:57592 - "POST /chat/chat HTTP/1.1" 200 OK
2024-03-22 11:59:36,611 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/chat/chat "HTTP/1.1 200 OK"
D:\Anaconda\envs\langchain\Lib\site-packages\langchain_core\_api\deprecation.py:117: LangChainDeprecationWarning: The class langchain_community.chat_models.openai.ChatOpenAI was deprecated in langchain-community 0.0.10 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run pip install -U langchain-openai and import as from langchain_openai import ChatOpenAI.
warn_deprecated(
D:\Anaconda\envs\langchain\Lib\site-packages\langchain_core\_api\deprecation.py:117: LangChainDeprecationWarning: The function acall was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use ainvoke instead.
warn_deprecated(
2024-03-22 11:59:36 | INFO | stdout | INFO: 127.0.0.1:57595 - "POST /v1/chat/completions HTTP/1.1" 200 OK
2024-03-22 11:59:36,970 - client.py[line:1758] - INFO: HTTP Request: POST http://127.0.0.1:20000/v1/chat/completions "HTTP/1.1 200 OK"
2024-03-22 11:59:37 | INFO | httpx | HTTP Request: POST http://127.0.0.1:20002/worker_generate_stream "HTTP/1.1 200 OK"
2024-03-22 11:59:48,075 - faiss_cache.py[line:92] - INFO: loading vector store in 'samples/vector_store/bge-large-zh-v1.5' from disk.
D:\Anaconda\envs\langchain\Lib\site-packages\langchain\embeddings\__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:
from langchain_community.embeddings import HuggingFaceBgeEmbeddings
To install langchain-community run pip install -U langchain-community.
warnings.warn(
2024-03-22 11:59:48,158 - SentenceTransformer.py[line:66] - INFO: Load pretrained SentenceTransformer: E:\models\Embedding model\bge-large-zh-v1.5
INFO: 127.0.0.1:57602 - "POST /knowledge_base/search_docs HTTP/1.1" 500 Internal Server Error
2024-03-22 11:59:50,774 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/knowledge_base/search_docs "HTTP/1.1 500 Internal Server Error"
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "D:\Anaconda\envs\langchain\Lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 69, in call
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\fastapi\applications.py", line 1054, in call
await super().call(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\applications.py", line 119, in call
await self.middleware_stack(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\middleware\errors.py", line 186, in call
raise exc
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\middleware\errors.py", line 164, in call
await self.app(scope, receive, _send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette_exception_handler.py", line 64, in wrapped_app
raise exc
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\routing.py", line 762, in call
await self.middleware_stack(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\routing.py", line 782, in app
await route.handle(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\routing.py", line 297, in handle
await self.app(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette_exception_handler.py", line 64, in wrapped_app
raise exc
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\routing.py", line 72, in app
response = await func(request)
^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\fastapi\routing.py", line 299, in app
raise e
File "D:\Anaconda\envs\langchain\Lib\site-packages\fastapi\routing.py", line 294, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\fastapi\routing.py", line 193, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\starlette\concurrency.py", line 40, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\anyio_backends_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\anyio_backends_asyncio.py", line 851, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\Langchain-Chatchat-master\server\knowledge_base\kb_doc_api.py", line 41, in search_docs
data = kb.list_docs(file_name=file_name, metadata=metadata)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\Langchain-Chatchat-master\server\knowledge_base\kb_service\base.py", line 214, in list_docs
doc_info = self.get_doc_by_ids([x["id"]])[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\Langchain-Chatchat-master\server\knowledge_base\kb_service\faiss_kb_service.py", line 36, in get_doc_by_ids
with self.load_vector_store().acquire() as vs:
^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\Langchain-Chatchat-master\server\knowledge_base\kb_service\faiss_kb_service.py", line 28, in load_vector_store
return kb_faiss_pool.load_vector_store(kb_name=self.kb_name,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\Langchain-Chatchat-master\server\knowledge_base\kb_cache\faiss_cache.py", line 97, in load_vector_store
vector_store = FAISS.load_local(vs_path, embeddings, normalize_L2=True,distance_strategy="METRIC_INNER_PRODUCT")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Anaconda\envs\langchain\Lib\site-packages\langchain_community\vectorstores\faiss.py", line 1078, in load_local
raise ValueError(
ValueError: The de-serialization relies loading a pickle file. Pickle files can be modified to deliver a malicious payload that results in execution of arbitrary code on your machine. You will need to set allow_dangerous_deserialization to True to enable deserialization. If you do this, make sure that you trust the source of the data. For example, if you are loading a file that you created, and know that no one else has modified the file, then this is safe to do. Do not set this to True if you are loading a file from an untrusted source (e.g., some random site on the internet.).
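The ValueError above names its own fix: pass allow_dangerous_deserialization=True where the FAISS index is loaded. A minimal sketch of the load call in server/knowledge_base/kb_cache/faiss_cache.py (line 97 per the traceback), assuming the vector store was generated locally and is trusted; the exact surrounding code may differ between versions:
# Sketch only: add allow_dangerous_deserialization=True to the existing call.
# Only do this for index files you created yourself, because load_local
# unpickles the docstore (index.pkl) from disk.
vector_store = FAISS.load_local(vs_path, embeddings,
                                normalize_L2=True,
                                distance_strategy="METRIC_INNER_PRODUCT",
                                allow_dangerous_deserialization=True)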
2024-03-22 11:59:50.785 Uncaught app exception
Traceback (most recent call last):
File "D:\Anaconda\envs\langchain\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 535, in _run_script
exec(code, module.__dict__)
File "E:\Langchain-Chatchat-master\webui.py", line 64, in <module>
Hey @winer3086! Great to see you back. Hope you're doing well. 😊
Based on the information you've provided, it seems like you're encountering a TypeError: string indices must be integers, not 'str' when clicking on "Knowledge Base Management". This error occurs when you index a string with a string key instead of an integer. In the context of Langchain-Chatchat v0.2.10, it usually means the code expected a dictionary (or JSON object) but received a string and tried to access it as if it were a dictionary.
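For instance, the error is easy to reproduce in two lines (illustrative only; the key name is made up):
s = '{"id": "abc"}'  # a JSON string, not a dict
s["id"]  # raises TypeError: string indices must be integers, not 'str'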
To resolve this issue, ensure that any JSON data you're working with is properly deserialized (converted from a JSON string to a Python dictionary) before trying to access its elements. In Python, you can use the json.loads() function from the json module to deserialize a JSON string. Here's an example:
import json
# Assuming json_string is a string containing JSON data
json_string = '{"key": "value"}'
# Deserialize the JSON string to a Python dictionary
data = json.loads(json_string)
# Now you can access data using string indices
print(data["key"])
Regarding the ValueError related to the de-serialization of a pickle file, this error can occur if the pickle file is corrupted, was created with a different version of Python, or if there's a mismatch between the Python objects used during pickling and unpickling. To resolve this issue, ensure that:
- The pickle file is not corrupted and is accessible.
- You're using the same version of Python and the same environment (libraries, dependencies) for both pickling and unpickling.
- If the pickle file was created with a different version of Python, consider regenerating the pickle file with the current version you're using.
Here's a basic example of how to safely unpickle data, assuming you have a valid and compatible pickle file:
import pickle
# Replace 'your_pickle_file.pkl' with the path to your pickle file
pickle_file_path = 'your_pickle_file.pkl'
# Safely unpickle the data
try:
    with open(pickle_file_path, 'rb') as file:
        data = pickle.load(file)
except Exception as e:
    print(f"An error occurred: {e}")
Make sure to handle exceptions and errors gracefully to avoid crashes and to provide more informative error messages.
In the Langchain-Chatchat codebase, the function that is likely throwing the TypeError is search_knowledge_base_iter in the file server/agent/tools/search_knowledgebase_once.py. This function assumes data is a dictionary, but if data is actually a string at that point, using string indices ("answer" and "docs") would raise the mentioned TypeError.
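A small sketch of that failure mode with a defensive guard (the helper name extract_answer_and_docs is made up for illustration; only the keys "answer" and "docs" come from the description above, and the project's real function body may differ):
import json

def extract_answer_and_docs(data):
    # If data arrives as a JSON string instead of a dict, data["answer"]
    # raises "TypeError: string indices must be integers, not 'str'".
    if isinstance(data, str):
        data = json.loads(data)  # parse the JSON string into a dict first
    return data["answer"], data["docs"]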
The function responsible for handling the "Knowledge Base Management" click event, in webui_pages/knowledge_base/knowledge_base.py, is knowledge_base_page. It manages the knowledge-base workflows: creating a new knowledge base, uploading documents, updating knowledge base information, and deleting documents or the entire knowledge base.
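For context, the tracebacks later in this thread show the crash inside knowledge_base_page, where the /knowledge_base/search_docs response is turned into table rows. A guarded sketch of that step (the wrapper method api.search_kb_docs and the variable names are assumptions, not the project's actual code; the dict keys come from the tracebacks below):
import streamlit as st

def list_kb_docs_safely(api, kb_name: str, file_name: str):
    # On a 500 or timeout the API wrapper can hand back an error string;
    # indexing x["id"] on that string is exactly the reported TypeError.
    docs = api.search_kb_docs(knowledge_base_name=kb_name, file_name=file_name)
    if not isinstance(docs, list):
        st.error(f"search_docs failed: {docs}")  # surface the server error instead of crashing
        docs = []
    return [
        {"seq": i + 1,
         "id": x["id"],
         "page_content": x["page_content"],
         "source": x["metadata"].get("source")}
        for i, x in enumerate(docs)
    ]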
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
Relevant documentation and code:
Sources
- TypeError: string indices must be integers
- knowledge_base/samples/content/test_files/langchain-ChatGLM_open.jsonl
- server/agent/tools/search_knowledgebase_once.py
- webui_pages/knowledge_base/knowledge_base.py
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
This issue has been marked as stale because it has had no activity for more than 30 days.
Has this been solved? I'm running into the same problem.
TypeError: string indices must be integers
Traceback:
File "/opt/conda/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 535, in _run_script
exec(code, module.__dict__)
File "/usr/local/serving/serving/webui.py", line 87, in <module>
pages[selected_page]["func"](api=api, is_lite=is_lite)
File "/usr/local/serving/serving/webui_pages/knowledge_base/knowledge_base.py", line 338, in knowledge_base_page
data = [
File "/usr/local/serving/serving/webui_pages/knowledge_base/knowledge_base.py", line 339, in <listcomp>
{"seq": i + 1, "id": x["id"], "page_content": x["page_content"], "source": x["metadata"].get("source"),
Have you solved this yet? I'm running into the same problem.
A 128k-context model that endlessly asks and answers itself seems to have to be shelved for now. langchain reports this error when loading my personal knowledge base; does anyone know how to fix it?
2024-05-27 16:32:22,181 - utils.py[line:95] - ERROR: ReadTimeout: error when post /knowledge_base/create_knowledge_base: timed out
2024-05-27 16:36:43,277 - utils.py[line:95] - ERROR: ReadTimeout: error when post /knowledge_base/search_docs: timed out
2024-05-27 16:36:43.277 Uncaught app exception
Traceback (most recent call last):
File "/root/langchain_pip/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 535, in _run_script
exec(code, module.__dict__)
File "/sdb/chatgpt/Langchain-Chatchat/webui.py", line 64, in <module>
pages[selected_page]["func"](api=api, is_lite=is_lite)
File "/sdb/chatgpt/Langchain-Chatchat/webui_pages/knowledge_base/knowledge_base.py", line 330, in knowledge_base_page
data = [
^
File "/sdb/chatgpt/Langchain-Chatchat/webui_pages/knowledge_base/knowledge_base.py", line 331, in <listcomp>
{"seq": i + 1, "id": x["id"], "page_content": x["page_content"], "source": x["metadata"].get("source"),
~^^^^^^
TypeError: string indices must be integers, not 'str'
Has anyone managed to solve this? If so, could you please share how? @aben1900 @winer3086 @nilin1998 @AIdrinkhotWater