Jakob Hoeg Mørk

10 comments by Jakob Hoeg Mørk

> ECONNREFUSED indicates Ollama server isn't running. Can you check it is running and accessible on localhost:11434?

It is running and accessible.
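By accessible I mean a check along these lines succeeds (a rough sketch; run as an ES module, e.g. with `npx tsx`; `/api/tags` is Ollama's model-listing endpoint):

```
// Quick check that Ollama is reachable on its default port (sketch)
// GET /api/tags lists the locally available models
const res = await fetch("http://localhost:11434/api/tags");
console.log(res.status); // expect 200
console.log(await res.text()); // JSON list of local models
```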

> @jakobhoeg Did you manage to get this working? @mxyng's point is correct; maybe you were running it on a different port or a different host?

Hey. No, still...

> Can you clarify where everything is deployed? You mentioned something is deployed on Vercel, but the wording is vague. I assume it's the NextJS app you're calling Ollama from. If...

> @jakobhoeg looks like this could be an issue with host resolution when using the langchain library rather than ollama, could you try using `127.0.0.1` rather than `localhost`?

...
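In the LangChain setup, that change would look roughly like this (a sketch, assuming LangChain JS's `ChatOllama` from `@langchain/community`; the model name is a placeholder):

```
// Sketch: point LangChain's Ollama chat model at the IP literal
// (assumptions: @langchain/community package; "llama2" is a placeholder model)
import { ChatOllama } from "@langchain/community/chat_models/ollama";

const model = new ChatOllama({
  // 127.0.0.1 avoids "localhost" resolving to IPv6 (::1), where Ollama may not listen
  baseUrl: "http://127.0.0.1:11434",
  model: "llama2",
});

const res = await model.invoke("Hello!");
console.log(res.content);
```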

I also tried with this in the api route: `export const runtime = 'edge';` Now I get a different error in Vercel:

```
Error: Ollama call failed with status code...
```

@BruceMacD I am using Vercel's `useChat()` from their `ai/react` package on the client side to call the API route, in case that helps.
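The client side is essentially the stock hook usage (a sketch, assuming `useChat` from the `ai` package's `ai/react` entry point, which posts to `/api/chat` by default):

```
"use client";
// Minimal client component (sketch; assumes the Vercel AI SDK's useChat hook)
import { useChat } from "ai/react";

export default function Chat() {
  // useChat manages the message list and POSTs to /api/chat by default
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>{m.role}: {m.content}</p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Say something..." />
    </form>
  );
}
```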

I also tried using the Ollama OpenAI-compatible completions instead and followed the guide found [here](https://ollama.com/blog/openai-compatibility) (the Vercel AI SDK part). So my API route (/api/chat/route.ts) looks like this:

```
import...
```
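The pattern from that guide is to point the OpenAI client at Ollama's `/v1` endpoint and stream the result back (a sketch along those lines, assuming the `openai` package and the `ai` package's streaming helpers; the model name is a placeholder):

```
// Sketch of an OpenAI-compatible chat route (assumptions: `openai` and `ai`
// packages, Ollama's /v1 endpoint, "llama2" as a placeholder model)
import OpenAI from "openai";
import { OpenAIStream, StreamingTextResponse } from "ai";

// Point the OpenAI client at the local Ollama server
const openai = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // required by the client, ignored by Ollama
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Ask Ollama for a streaming chat completion
  const response = await openai.chat.completions.create({
    model: "llama2",
    stream: true,
    messages,
  });

  // Stream tokens back to the useChat() client
  return new StreamingTextResponse(OpenAIStream(response));
}
```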

@joeylin you can also check out my repository: https://github.com/jakobhoeg/shadcn-chat

Thanks for the feedback. I am aware that the response is not currently being streamed in production; I am working on this. The other issue of the...

> @jakobhoeg Any idea when this will be worked on? This would probably bring the most benefit to users from the current state.

If you host ollama on a slow...