obsidian-smart-connections
Unexpected end of JSON input while trying to connect to a local Ollama server running llama3.1
I did a fresh install of this plugin, created embeddings with the default model, and have been trying to connect to my local Ollama server, but I keep getting these errors:
- With streaming on
- With streaming disabled
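(Side note: to rule out the server itself, a minimal way to test the local Ollama API directly from PowerShell, independent of the plugin, would be something like the sketch below. The port 11434 and the `/api/generate` endpoint are just the Ollama defaults, and the prompt is arbitrary; adjust if your setup differs.)

```powershell
# Minimal sanity check against the local Ollama API (assumes the default port 11434).
$body = @{
    model  = "llama3.1"
    prompt = "Say hello"
    stream = $false
} | ConvertTo-Json

Invoke-RestMethod -Uri "http://localhost:11434/api/generate" `
                  -Method Post `
                  -ContentType "application/json" `
                  -Body $body
```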
My settings:
(I picked the max input tokens randomly)
I am using a PowerShell terminal and ran the following:
```
set OLLAMA_ORIGINS=*
ollama serve
```
The Ollama server does seem to be receiving these requests.
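Not sure whether this is relevant, but `set NAME=value` is cmd.exe syntax rather than PowerShell, so I'm not certain the variable is actually being applied to the server process. If I understand correctly, the PowerShell-native equivalent would be something like this (just a sketch of what I think the commands should be):

```powershell
# Set the allowed CORS origins for this PowerShell session, confirm it, then start the server.
$env:OLLAMA_ORIGINS = "*"
$env:OLLAMA_ORIGINS          # should print: *
ollama serve
```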