chat-ollama
Error: fetch failed. If a proxy is enabled, please make sure the proxy is working properly. Adding content to the knowledge base succeeds, but the response fails when chatting.
Embedding model: nomic-embed-text:latest  Chat model: llama/llama3:latest
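For reference, a minimal reproduction outside the app can help narrow down whether the failure is inside chat-ollama or in reaching Ollama itself. This is only a sketch, assuming the same LangChain.js `ChatOllama` class that appears in the stack trace below, with the host and model taken from this report:

```ts
// Minimal sketch (not chat-ollama's actual code): call the same ChatOllama
// class shown in the stack trace, with the host and model from this report.
import { ChatOllama } from "@langchain/community/chat_models/ollama";

const model = new ChatOllama({
  baseUrl: "http://127.0.0.1:11434", // host shown in the server log below
  model: "llama3:latest",            // chat model from this report
});

async function main() {
  // Streaming goes through the same createOllamaChatStream path that fails below.
  const stream = await model.stream("白鲸里面的船长是谁");
  for await (const chunk of stream) {
    process.stdout.write(chunk.content as string);
  }
}

main().catch((err) => {
  // With Node's built-in (undici) fetch, the underlying network error is on `cause`.
  console.error(err, (err as any)?.cause);
});
```

If this script fails with the same `fetch failed`, the problem is between the Node process and the Ollama server rather than in the knowledge-base/RAG code path.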
URL: /api/models/chat
User: null
Chat with knowledge base with id: 23
Knowledge base aaa with embedding "nomic-embed-text:latest"
Creating embeddings for Ollama served model: nomic-embed-text:latest
Creating Chroma vector store
Initializing vector store retriever
Chat with Ollama, host: http://127.0.0.1:11434
User query: 白鲸里面的船长是谁
Relevant documents: [
  Document {
    pageContent: '1.海与鲸的诱惑\r\n\r\n\n\t很多年以前,那时我的钱包瘪瘪的,陆地上看来没什么好混得了,干脆下海吧,去在我们这个世界上占 ...... 漉的,很沉。\r\n\n\r\n\n\t 很难想像,那个标枪手穿上这样一件奇怪的衣服招摇过市!\r\n\n\r\n\n\t 我迫不及待地往下脱这毯子,情急之中扭了一下头,酸疼酸疼的'... 76512 more characters,
    metadata: { blobType: 'text/plain', source: '《白鲸》赫尔曼·麦尔维尔 2.txt' }
  }
]
ERROR [nuxt] [request error] [unhandled] [500] fetch failed
at Object.fetch (node:internal/deps/undici/undici:11731:11)
at async createOllamaStream (./node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected]__agb6zier7q7j3juxcraj2tnm3e/node_modules/@langchain/community/dist/utils/ollama.js:9:22)
at async createOllamaChatStream (./node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected]__agb6zier7q7j3juxcraj2tnm3e/node_modules/@langchain/community/dist/utils/ollama.js:57:5)
at async ChatOllama.streamResponseChunks (./node_modules/.pnpm/@[email protected]@[email protected][email protected][email protected]__agb6zier7q7j3juxcraj2tnm3e/node_modules/@langchain/community/dist/chat_models/ollama.js:396:30)
at async ChatOllama._streamIterator (./node_modules/.pnpm/@[email protected]/node_modules/@langchain/core/dist/language_models/chat_models.js:78:34)
at async ChatOllama.transform (./node_modules/.pnpm/@[email protected]/node_modules/@langchain/core/dist/runnables/base.js:375:9)
at async RunnableSequence._streamIterator (./node_modules/.pnpm/@[email protected]/node_modules/@langchain/core/dist/runnables/base.js:1116:30)
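Since the error surfaces as a bare `fetch failed` from undici, it may help to log the underlying cause and confirm that the Nuxt server process can actually reach the Ollama host. A rough diagnostic sketch (not part of chat-ollama), assuming Ollama's standard `/api/tags` endpoint:

```ts
// Diagnostic sketch: verify the server process can reach Ollama at the
// configured host. A `fetch failed` from undici usually carries the real
// network error (ECONNREFUSED, proxy errors, etc.) on `err.cause`.
const OLLAMA_HOST = "http://127.0.0.1:11434"; // host from the report

async function checkOllama() {
  try {
    const res = await fetch(`${OLLAMA_HOST}/api/tags`); // lists installed models
    console.log("Ollama reachable:", res.status, await res.json());
  } catch (err) {
    console.error("Ollama unreachable:", err, (err as Error & { cause?: unknown }).cause);
  }
}

checkOllama();
```

If an HTTP proxy is configured in the environment, it is also worth confirming that requests to 127.0.0.1 are excluded (e.g. via NO_PROXY); note that Node's built-in fetch only routes through a proxy when a dispatcher is explicitly configured, so the logged `cause` is the more reliable clue here.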