
[Bug]: Ollama integration with Wave Term not reading terminal

Open will-wrigh opened this issue 2 months ago • 1 comment

Current Behavior

The Wave AI panel does not connect to Ollama. The Wave AI widget does connect, but it cannot read the terminal.

Image

This screenshot is with the internet disconnected, to confirm the model runs locally; through the Wave AI chat it apparently does not. The AI widget does work this way, however.

My config file contains:

```json
{
  "ai@ollama-llama": {
    "ai:*": true,
    "ai:apitoken": "ollama",
    "ai:baseurl": "http://localhost:11434/v1",
    "ai:model": "qwen3-coder:30b",
    "ai:name": "qwen3-coder:30b",
    "display:name": "Ollama - Qwen3",
    "display:order": 3
  },
  "autoupdate:channel": "latest"
}
```
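As a sanity check on the preset above, here is a minimal sketch (my own, not Wave's actual parsing code) that reads the settings fragment and derives the chat endpoint the widget would call against Ollama's OpenAI-compatible server:

```python
import json

# Parse the Wave settings fragment from the issue. The key names come from
# the config above; how Wave itself resolves them internally is an assumption.
settings = json.loads("""
{
  "ai@ollama-llama": {
    "ai:*": true,
    "ai:apitoken": "ollama",
    "ai:baseurl": "http://localhost:11434/v1",
    "ai:model": "qwen3-coder:30b",
    "ai:name": "qwen3-coder:30b",
    "display:name": "Ollama - Qwen3",
    "display:order": 3
  },
  "autoupdate:channel": "latest"
}
""")

preset = settings["ai@ollama-llama"]

# Ollama's OpenAI-compatible chat endpoint lives under {baseurl}/chat/completions.
chat_url = preset["ai:baseurl"].rstrip("/") + "/chat/completions"
print(chat_url)            # http://localhost:11434/v1/chat/completions
print(preset["ai:model"])  # qwen3-coder:30b
```

If `curl`-ing that URL with a valid body returns a response while Wave's chat does not, the problem is in Wave's panel rather than the config.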

Expected Behavior

Expected Ollama to read the terminal. I don't know whether the AI widget was designed only as a chat bot, but I need local LLM (BYOLLM) integration with the terminal. I've seen this working on the site before: https://legacydocs.waveterm.dev/features/waveAI (I know this is a legacy doc).

Image

If this feature no longer exists, I'd like to know whether and how I can download an older version that still works. Please and thank you!

Steps To Reproduce

1. Install Ollama with qwen3-coder:30b.
2. Add the config file listed above.
3. Open the AI widget, switch the model from gpt-5-mini to Ollama, and confirm that this works.

Other steps I tried to point the Wave AI chat UI at Ollama broke the program. I would list what I did, but I admittedly don't know what I did: I used an AI assistant and threw its suggestions at the wall until the app broke, then reinstalled fresh. On request I can research the steps taken, but since those steps have been undone, I believe they are not relevant; they were bad suggestions from the AI and could be a wild goose chase.

Wave Version

Client Version 0.12.1 (202510210632) Update Channel: latest

Platform

macOS

OS Version/Distribution

Sonoma

Architecture

x64

Anything else?

These details are fun highlights but should not be relevant to the problem: I'm running a 2013 Mac Pro with OpenCore, 128 GB of RAM, launching Wave through a Terminal Apple shortcut using OpenGL, because the D700s can't run the latest Metal in Sonoma.

Questionnaire

  • [x] I'm interested in fixing this myself but don't know where to start
  • [x] I would like to fix and I have a solution
  • [ ] I don't have time to fix this right now, but maybe later

will-wrigh avatar Oct 25 '25 23:10 will-wrigh

Ah, that is working as expected. The new "Wave AI" side panel currently does not allow local models or BYOK while it is in beta. I only have an OpenAI "responses" API adapter set up right now, which is not compatible with the "completions" API that we were using before, and I don't have adapters for other APIs yet either. This is all in the works, though, and will be coming out in patch releases over the next couple of weeks.

So right now we're between worlds: the old AI widget can talk to Ollama but can't use tools, and the new Wave AI panel can use tools but can't talk to Ollama.
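For context on the API mismatch described above, here is a sketch of the two request shapes, based on OpenAI's documented endpoints (the exact fields Wave sends are an assumption on my part). Ollama's OpenAI-compatible server implements the older chat-completions shape but not the newer responses shape:

```python
import json

# Older "completions" style: POST {baseurl}/chat/completions with a list of
# messages. This is what Ollama's OpenAI-compatible endpoint accepts and what
# the legacy AI widget speaks.
chat_completions_body = {
    "model": "qwen3-coder:30b",
    "messages": [{"role": "user", "content": "Summarize my last command."}],
}

# Newer "responses" style: POST /v1/responses with an "input" field. The beta
# Wave AI panel targets this API, which Ollama does not serve, so pointing the
# panel at http://localhost:11434/v1 fails even with a valid config.
responses_body = {
    "model": "qwen3-coder:30b",
    "input": "Summarize my last command.",
}

print(json.dumps(chat_completions_body, indent=2))
print(json.dumps(responses_body, indent=2))
```

The adapter work mentioned above would presumably translate between these two shapes so the panel can keep its tool support while talking to completions-only backends like Ollama.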

sawka avatar Oct 27 '25 21:10 sawka