
Where in the code can you change the inference from Ollama to something else?

zxcvxzcv-johndoe opened this issue 1 year ago · 2 comments

I have managed to get this running on Windows with WSL, but the last step fails with the error below.

Where in the code can I change it from using Ollama to something else that speaks the OpenAI API (koboldcpp in practice)?

Thanks!

```
[llm/error] [1:llm:Ollama] [6ms] LLM run errored with error: "Unexpected end of JSON input"

error SyntaxError: Unexpected end of JSON input
    at JSON.parse ()
    at parseJSONFromBytes (node:internal/deps/undici/undici:4553:19)
    at successSteps (node:internal/deps/undici/undici:4527:27)
    at fullyReadBody (node:internal/deps/undici/undici:1307:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async specConsumeBody (node:internal/deps/undici/undici:4536:7)
    at async createOllamaStream (webpack-internal:///(sc_server)/./node_modules/langchain/dist/util/ollama.js:26:21)
    at async Ollama._call (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/ollama.js:313:26)
    at async Promise.all (index 0)
    at async Ollama._generate (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/base.js:311:29)
    at async Ollama._generateUncached (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/base.js:174:22)
    at async Ollama.call (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/base.js:242:34)
    at async POST (webpack-internal:///(sc_server)/./src/app/api/qa-pg-vector/route.ts:61:24)
    at async eval (webpack-internal:///(sc_server)/./node_modules/next/dist/server/future/route-modules/app-route/module.js:242:37)
```
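The trace bottoms out at `src/app/api/qa-pg-vector/route.ts:61`, so that route is presumably where the LLM is constructed and called. As a minimal sketch of the swap (not the repo's actual code): replace LangChain's `Ollama` wrapper with its `OpenAI` wrapper pointed at a koboldcpp instance running its OpenAI-compatible API. The model name, temperature, and the endpoint URL `http://localhost:5001/v1` below are assumptions; older langchain versions take `basePath` rather than `baseURL` in the second constructor argument.

```ts
import { OpenAI } from "langchain/llms/openai";

// Before (assumed shape), the route built LangChain's native Ollama wrapper:
//   import { Ollama } from "langchain/llms/ollama";
//   const model = new Ollama({ baseUrl: "http://localhost:11434", model: "llama2" });

// After: an OpenAI-compatible server (koboldcpp with its OpenAI-style API
// enabled) stands in for Ollama behind the same LangChain LLM interface.
const model = new OpenAI(
  {
    openAIApiKey: "sk-no-key-needed", // koboldcpp ignores the key, but the client requires one
    modelName: "koboldcpp",           // placeholder; koboldcpp serves whatever model it loaded
    temperature: 0.7,                 // assumed setting, tune to taste
  },
  {
    baseURL: "http://localhost:5001/v1", // assumed koboldcpp OpenAI-compatible endpoint
  }
);

// The rest of the route (chains, retrievers, model.call(...)) should be able
// to stay unchanged, since both classes implement LangChain's LLM interface.
```

In principle, any backend exposing an OpenAI-compatible API could be dropped in the same way.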

zxcvxzcv-johndoe · Nov 02 '23 21:11