bytebot
(Feature Request) Add OpenRouter integration
Please add OpenRouter integration to allow using multiple AI models through a single API. This will make it easier to switch models and manage costs within Bytebot.
Have you tried using the LiteLLM proxy version? I have tried adding an OpenAI custom endpoint as an alternative, but no luck. Some guides in their docs would be very helpful.
No one else uses this? Why 7 rockets...
I have it working with the LiteLLM built-in proxy. You'll have to compile the list of vision models from OpenRouter and add them to your LiteLLM config.
bytebot_lite_llm_open_router_option_a_proxy_compose_runbook.md
Your litellm config should look something like this:
- model_name: grok-2-vision-1212
  litellm_params:
    model: openrouter/x-ai/grok-2-vision-1212
    api_key: os.environ/OPENROUTER_API_KEY
- model_name: llama-3.2-90b-vision-instruct
  litellm_params:
    model: openrouter/meta-llama/llama-3.2-90b-vision-instruct
    api_key: os.environ/OPENROUTER_API_KEY
- model_name: llama-vision-11b-free
  litellm_params:
    model: openrouter/meta-llama/llama-3.2-11b-vision-instruct:free
    api_key: os.environ/OPENROUTER_API_KEY
- model_name: qwen-72b-vision-free
  litellm_params:
    model: openrouter/qwen/qwen2.5-vl-72b-instruct:free
    api_key: os.environ/OPENROUTER_API_KEY
- model_name: gemini-2.5-flash-image-preview
  litellm_params:
    model: openrouter/google/gemini-2.5-flash-image-preview
    api_key: os.environ/OPENROUTER_API_KEY
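Since every entry above has the same shape (alias, `openrouter/`-prefixed slug, env-var key reference), you can generate them instead of hand-writing each one. A minimal Python sketch, assuming the alias/slug pairs from the config above; the helper names are my own, and the output still needs to be dumped to YAML (e.g. with PyYAML) under `model_list`:

```python
# Generate litellm model_list entries for OpenRouter vision models.
# The slugs mirror the config above; the api_key value uses litellm's
# "os.environ/" reference syntax, so litellm resolves OPENROUTER_API_KEY
# from the environment at runtime (no secret lands in the file).

OPENROUTER_SLUGS = {
    "grok-2-vision-1212": "x-ai/grok-2-vision-1212",
    "llama-3.2-90b-vision-instruct": "meta-llama/llama-3.2-90b-vision-instruct",
    "llama-vision-11b-free": "meta-llama/llama-3.2-11b-vision-instruct:free",
    "qwen-72b-vision-free": "qwen/qwen2.5-vl-72b-instruct:free",
    "gemini-2.5-flash-image-preview": "google/gemini-2.5-flash-image-preview",
}

def openrouter_entry(alias: str, slug: str) -> dict:
    """Build one model_list entry in the shape litellm expects."""
    return {
        "model_name": alias,
        "litellm_params": {
            "model": f"openrouter/{slug}",
            "api_key": "os.environ/OPENROUTER_API_KEY",
        },
    }

def build_model_list() -> list[dict]:
    """All entries, in the order the slugs were declared."""
    return [openrouter_entry(alias, slug) for alias, slug in OPENROUTER_SLUGS.items()]
```

Swapping models later is then just editing the dict and regenerating, rather than copy-pasting four-line blocks.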
Hey @zhound420, do you know how to add a custom OpenAI-compatible endpoint such as venice.ai or whatever?
how to add a custom openai compatible endpoint such as venice.ai or w/e
You'll need to read up on litellm:
# packages/bytebot-llm-proxy/litellm-config.yaml
model_list:
  # --- Venice.ai example (OpenAI-compatible) ---
  - model_name: venice-coder-32b        # <alias you'll see in Bytebot>
    litellm_params:
      model: openai/qwen2.5-coder-32b   # <remote vendor model id, prefixed with openai/>
      api_base: https://api.venice.ai/api/v1
      api_key: os.environ/VENICE_API_KEY  # pull from env
      # optional tunables:
      # request_timeout: 600
      # supports_vision: true
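It's easy to typo the provider prefix or paste a literal secret into `api_key` and only find out when the proxy restarts. A small sanity check can catch both; this is my own helper, not part of litellm, and it only checks the two conventions used in the configs above (a `provider/` prefix on `model` and an `os.environ/` reference for `api_key`):

```python
# Sanity-check a litellm model_list entry before deploying it.
# Returns a list of human-readable problems; empty list means OK.

def check_entry(entry: dict) -> list[str]:
    problems = []
    name = entry.get("model_name", "<unnamed>")
    params = entry.get("litellm_params", {})

    # litellm routes by provider prefix, e.g. "openai/..." or "openrouter/...".
    model = params.get("model", "")
    if "/" not in model:
        problems.append(f"{name}: model '{model}' lacks a provider prefix")

    # Secrets should be env-var references, not literals committed to the file.
    api_key = params.get("api_key", "")
    if not api_key.startswith("os.environ/"):
        problems.append(f"{name}: api_key should be an os.environ/ reference, not a literal secret")

    return problems
```

Run it over every entry you add; the Venice example above passes, while an entry like `model: qwen2.5-coder-32b` with `api_key: sk-...` fails both checks.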
You'd think this would already be included with most projects.
I shared my configuration for OpenRouter here:
https://github.com/bytebot-ai/bytebot/issues/144#issuecomment-3350817420