[Feature]: new WaveAI to support other AI engines/local LLMs
Feature description
It seems from the docs that only OpenAI is supported. Will other LLMs, including local models, be supported?
Implementation Suggestion
No response
Anything else?
No response
Yes, that is the intention. I have an Anthropic adapter maybe 80% done. I ended up going with OpenAI for the launch because they had a nice downgrade path from gpt-5 => gpt-5-mini. But I'm hoping to get the Anthropic implementation done soon (especially for coding tasks).
I also hope to add OpenAI-compatible support, which should cover most local models (although that's a different API -- the "completions" API vs the new one I'm using, which is "responses"). What I've noticed, though, is that different models don't always play nice with the same prompt. Anthropic vs OpenAI needed different prompting to use the tools and format output in a consistent way. Some less powerful models also might not be as good at running the tools (or using vision APIs). That's the reason I'm trying to be careful with the rollout and just started with OpenAI.
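For anyone curious about the API split mentioned above, here's a rough sketch of the two request shapes. This is not Wave's implementation, just an illustration based on the public OpenAI API docs: most OpenAI-compatible local servers (Ollama, llama.cpp, vLLM) expose the older Chat Completions endpoint, while the Responses API uses a different route and replaces `messages` with `input`. The base URL and model names are placeholders.

```python
# Sketch of the two OpenAI API flavors (not Wave's actual code).
# Chat Completions is what OpenAI-compatible local servers implement;
# Responses is the newer API. Field names follow the public OpenAI docs.

LOCAL_BASE = "http://localhost:11434/v1"  # placeholder: a local OpenAI-compatible server

# Chat Completions: POST {base}/chat/completions, history goes in "messages"
chat_request = {
    "model": "llama3",  # whatever model the local server is serving
    "messages": [
        {"role": "system", "content": "You are a terminal assistant."},
        {"role": "user", "content": "List the files in the current directory."},
    ],
}

# Responses: POST {base}/responses, history goes in "input" instead
responses_request = {
    "model": "gpt-5-mini",
    "input": [
        {"role": "user", "content": "List the files in the current directory."},
    ],
}

def endpoint(base: str, api: str) -> str:
    """Route selection an adapter layer would need: same interface,
    different path per API flavor."""
    return f"{base}/chat/completions" if api == "chat" else f"{base}/responses"

print(endpoint(LOCAL_BASE, "chat"))
```

An adapter layer like this only handles the transport shape; as noted above, the harder part is that each backend also needs its own prompting to use tools and format output consistently.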