Support more model providers & local models through LiteLLM
First of all, this is a very nice and promising project! Thanks for building and sharing it.
I think it'd be really valuable to add support for more model providers and local LLMs. This could be done fairly easily by using LiteLLM (https://github.com/BerriAI/litellm) as an option for the LLM backend; it exposes 100+ providers behind a single OpenAI-compatible interface, which is why most agent frameworks already support it (rough sketch below).
Whether or not LiteLLM is used, it would be great to have support for Anthropic's models, including Claude 3.7 Sonnet, which is often considered one of the best models for coding and tool use. Support for other frontier models such as Gemini 2.5 Pro would be great too.
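For illustration, here is a minimal sketch of what a LiteLLM-backed call could look like: the same `completion()` call covers every provider, and only the model string changes. The specific model identifiers below are assumptions for illustration; check LiteLLM's docs for the current names.

```python
# pip install litellm
from litellm import completion

messages = [{"role": "user", "content": "Write a function that reverses a string."}]

# Same call shape for every provider; only the model string changes.
# (Model identifiers are illustrative; see LiteLLM's docs for current names.)
resp_openai = completion(model="gpt-4o", messages=messages)
resp_anthropic = completion(model="anthropic/claude-3-7-sonnet-20250219", messages=messages)
resp_gemini = completion(model="gemini/gemini-2.5-pro", messages=messages)

# Responses come back in the OpenAI format regardless of provider.
print(resp_anthropic.choices[0].message.content)
```

Because responses follow the OpenAI format, existing OpenAI-based code paths would likely keep working with little change.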
yep definitely +1 on this!
I use CrewAI with many different LLMs in its agents, so this would be a very useful feature, especially when we can't use hosted foundation LLMs at all owing to policy restrictions etc. ;-)
+1 for LiteLLM support
Support for LiteLLM and OpenRouter is coming next week! Stay tuned for our announcement on Discord :)
Assuming LiteLLM would allow for easy use of Ollama?
You can check the linked documentation, but the answer is yes.
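For anyone curious, here's roughly what an Ollama call through LiteLLM looks like. The model name and endpoint below are illustrative defaults, so adjust them to your local setup.

```python
# pip install litellm  (and run `ollama serve` locally with a pulled model)
from litellm import completion

# LiteLLM routes to a local Ollama server via the "ollama/" model prefix.
# Model name and api_base are illustrative defaults; adjust to your setup.
response = completion(
    model="ollama/llama3",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
    api_base="http://localhost:11434",  # Ollama's default local endpoint
)
print(response.choices[0].message.content)
```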
This was completed here: https://github.com/rowboatlabs/rowboat/pull/92
See our announcement on Discord for more details.