Please add OpenRouter API
Feature Request
The OpenRouter API is very easy for non-professional users to work with. Please seriously consider adding it to Langflow; it would be very helpful. Thank you!
Motivation
As an amateur in computer science who nonetheless needs AI technologies, I find that the OpenRouter API gives me a very simple, easy-to-use experience in many cases. I am learning Langflow now, and I feel that if the OpenRouter API could be included, it would be very helpful to people like me. Thank you.
Your Contribution
No response
I think it will likely be added in the near future.
Have you tried the AI/ML component?
It doesn't do exactly what OpenRouter does, but it lets you select any model from a single component and API key, which makes it very easy to test different models.
I'm not sure exactly what you mean by OpenRouter offering an "easy-to-handle experience"; if you could, please give more examples of why that is the case.
Thank you so much for your reply. Maybe my wording was not accurate; please forgive my broken English. I have tried the AI/ML API, but I'm afraid OpenRouter is much cheaper for me: it is a pre-paid service that gives access to all models, including the GPT-4o family. The AI/ML API subscription at $4.99 per week is somewhat beyond my budget, since I don't use it every day of the month. By "easy-to-handle experience" I mean that with the OpenRouter API I don't have to worry about how frequently or how much data I use: I deposit $20 and it may last two or three months, or even longer. Surely $4.99 per week (or $20 per month) for AI/ML is only a cup of coffee to most people, but not for me. Anyway, I deeply appreciate your reply and consideration. I can use the open-source models instead. Thank you again.
It seems that the site has done a good job with categorization. However, the problem is that costs and logging may not be properly tracked by tracers like LangSmith. I will consider adding this if it gets integrated into LangChain in the future.
Hey @merlinxdyang
According to their documentation, all you have to do is set openai_api_base to their URL and pass your OpenRouter API key.
So, you should be able to use the OpenAI component for this.
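To make that suggestion concrete, here is a minimal sketch (not from this thread) of the request an OpenAI-compatible client ends up sending once its base URL is pointed at OpenRouter. The API key and model id below are placeholders.

```python
# Sketch of the request an OpenAI-compatible client sends once its base URL
# points at OpenRouter instead of api.openai.com. The key and model id are
# placeholders, not values from this thread.
OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def openrouter_chat_request(api_key: str, model: str, prompt: str):
    """Return (url, headers, payload) for an OpenAI-style chat completion."""
    url = f"{OPENROUTER_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # OpenRouter key, not an OpenAI key
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,  # an OpenRouter model id, e.g. "vendor/model-name"
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload

url, headers, payload = openrouter_chat_request("sk-or-PLACEHOLDER", "openai/gpt-4o-mini", "Hello")
```

This is exactly the override the OpenAI component applies when openai_api_base is changed, so no separate client is needed.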
Thank you YamonBot and ogabrielluiz, I will try pointing openai_api_base at the OpenRouter URL. Sorry, I'm a complete beginner with code and am still finding my feet.
Where do I find that? Thank you.
It now works with any OpenRouter model; I changed the name here:
Open the advanced parameters and you'll find the openai_api_base
Thank you so much for sharing these!
@merlinxdyang
Do you need any assistance with this case? If not, please let us know if this issue can be closed.
Hi, I also tried to configure the OpenRouter API using the OpenAI component.
I set the OpenRouter API key in the OpenAI API Key field and hardcoded the model name and base URL in the code section:
Unfortunately I get this error:
It seems to me that it is either still trying to access the OpenAI API or ignoring the hardcoded model name.
Direct access to the OpenRouter API works like this:
Do you have any tips? I'm running Langflow via Docker using the latest image.
Thank you!
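For anyone hitting the same error: the fix usually amounts to overriding all three settings together where the component constructs ChatOpenAI, so none of them silently fall back to OpenAI defaults. A minimal sketch of the overrides; the model id and key below are placeholders, not values from this thread.

```python
# The three settings that must all point at OpenRouter together; the key and
# model id are placeholders, not values from this thread.
overrides = {
    "base_url": "https://openrouter.ai/api/v1",  # instead of api.openai.com
    "model": "anthropic/claude-3.5-sonnet",      # an OpenRouter model id
    "api_key": "sk-or-PLACEHOLDER",              # OpenRouter key in the OpenAI key field
}
# Inside the component's build_model these would be passed straight through,
# e.g.: output = ChatOpenAI(**overrides, temperature=0.1)
```

If any one of the three is left at its default, requests either go to the OpenAI API or fail on an unknown model name, which matches the symptoms above.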
I figured it out; it now works for me like this:
Please add official support
OpenRouter is an excellent service that allows you to use many LLMs through a single API. Its pay-per-token billing model is particularly attractive. While there seems to be another service called AIML, it appears to only offer weekly subscription plans.
Based on the previous discussion, I was able to successfully test making the built-in OpenAI LLM node function as an OpenRouter LLM node. To further develop this, I'm attaching below the actual working code that was rewritten for OpenRouter by Claude 3.5 Sonnet. Since I had Claude 3.5 Sonnet write it in a way that updates the model list based on OpenRouter's GET models response, I believe it can utilize all LLM models available on OpenRouter.
To use this code, add a custom component node from "New Custom Component" in the bottom left of Langflow, open the code editor with <> code, and paste the code.
As I'm not an engineer, I cannot speak to the code quality. Since OpenRouter is convenient, I hope someone will properly rewrite it as a built-in node and submit a pull request for implementation.

```python
import operator
from functools import reduce
import requests
from typing import List, Optional
import logging

from langchain_openai import ChatOpenAI
from pydantic.v1 import SecretStr

from langflow.base.models.model import LCModelComponent
from langflow.field_typing import LanguageModel
from langflow.field_typing.range_spec import RangeSpec
from langflow.inputs import BoolInput, DictInput, DropdownInput, FloatInput, IntInput, SecretStrInput, StrInput
from langflow.inputs.inputs import HandleInput

logger = logging.getLogger(__name__)


def fetch_openrouter_models(api_key: Optional[str] = None) -> List[str]:
    """Get a list of models from the OpenRouter API."""
    try:
        headers = {"accept": "application/json"}
        if api_key:
            headers["Authorization"] = f"Bearer {api_key}"
        response = requests.get(
            "https://openrouter.ai/api/v1/models",
            headers=headers,
            timeout=10,
        )
        response.raise_for_status()
        models_data = response.json().get("data", [])
        model_ids = [model["id"] for model in models_data]
        return sorted(model_ids)
    except Exception as e:
        logger.warning(f"Failed to fetch OpenRouter models: {e}")
        # Fall back to a static list when the request fails.
        return [
            "anthropic/claude-3.5-haiku",
            "anthropic/claude-3.5-sonnet",
            "qwen/qwen-2.5-coder-32b-instruct",
            "qwen/qwq-32b-preview",
            "qwen/qwen-2.5-72b-instruct",
            "google/gemini-flash-1.5",
            "google/gemini-pro-1.5",
            "openai/o1",
            "openai/o1-preview",
            "openai/o1-mini",
            "openai/gpt-4o",
            "openai/gpt-4o-mini",
        ]


class OpenrouterModelComponent(LCModelComponent):
    display_name = "OpenRouter"
    description = "Generates text using OpenRouter's various LLM models."
    icon = ""
    name = "OpenrouterModel"

    # Use the default model list as the initial value
    default_models = fetch_openrouter_models()

    inputs = [
        *LCModelComponent._base_inputs,
        IntInput(
            name="max_tokens",
            display_name="Max Tokens",
            advanced=True,
            info="The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
            range_spec=RangeSpec(min=0, max=128000),
        ),
        DictInput(
            name="model_kwargs",
            display_name="Model Kwargs",
            advanced=True,
            info="Additional keyword arguments to pass to the model.",
        ),
        BoolInput(
            name="json_mode",
            display_name="JSON Mode",
            advanced=True,
            info="If True, it will output JSON regardless of passing a schema.",
        ),
        DictInput(
            name="output_schema",
            is_list=True,
            display_name="Schema",
            advanced=True,
            info="The schema for the output of the model. "
            "You must pass the word JSON in the prompt. "
            "If left blank, JSON mode will be disabled.",
        ),
        DropdownInput(
            name="model_name",
            display_name="Model Name",
            advanced=False,
            options=default_models,
            value=default_models[0] if default_models else None,
        ),
        SecretStrInput(
            name="api_key",
            display_name="OpenRouter API Key",
            info="The OpenRouter API Key",
            advanced=False,
            value="OPENROUTER_API_KEY",
        ),
        StrInput(
            name="http_referer",
            display_name="HTTP Referer",
            info="Your site URL, for OpenRouter rankings",
            advanced=True,
        ),
        StrInput(
            name="x_title",
            display_name="X-Title",
            info="Your app name, for OpenRouter rankings",
            advanced=True,
        ),
        FloatInput(
            name="temperature",
            display_name="Temperature",
            value=0.1,
        ),
        IntInput(
            name="seed",
            display_name="Seed",
            info="The seed controls the reproducibility of the job.",
            advanced=True,
            value=1,
        ),
        HandleInput(
            name="output_parser",
            display_name="Output Parser",
            info="The parser to use to parse the output of the model",
            advanced=True,
            input_types=["OutputParser"],
        ),
    ]

    def build_model(self) -> LanguageModel:
        output_schema_dict: dict[str, str] = reduce(operator.ior, self.output_schema or {}, {})
        api_key = SecretStr(self.api_key).get_secret_value() if self.api_key else None

        # If an API key is set, refresh the model list from the API.
        if api_key:
            try:
                updated_models = fetch_openrouter_models(api_key)
                # Update the dropdown choices
                for input_field in self.inputs:
                    if input_field.name == "model_name":
                        input_field.options = updated_models
                        if not self.model_name or self.model_name not in updated_models:
                            self.model_name = updated_models[0]
                        break
            except Exception as e:
                logger.warning(f"Failed to update model list: {e}")

        # Optional attribution headers recognized by OpenRouter
        extra_headers = {}
        if self.http_referer:
            extra_headers["HTTP-Referer"] = self.http_referer
        if self.x_title:
            extra_headers["X-Title"] = self.x_title

        # Model setup: ChatOpenAI pointed at OpenRouter's OpenAI-compatible endpoint
        output = ChatOpenAI(
            max_tokens=self.max_tokens or None,
            model_kwargs=self.model_kwargs or {},
            model=self.model_name,
            base_url="https://openrouter.ai/api/v1",
            api_key=api_key,
            temperature=self.temperature if self.temperature is not None else 0.1,
            seed=self.seed,
            default_headers=extra_headers,
        )

        # JSON mode configuration
        if output_schema_dict or self.json_mode:
            if output_schema_dict:
                output = output.with_structured_output(schema=output_schema_dict, method="json_mode")
            else:
                output = output.bind(response_format={"type": "json_object"})

        return output

    def _get_exception_message(self, e: Exception):
        try:
            from openai import BadRequestError
        except ImportError:
            return None
        if isinstance(e, BadRequestError):
            message = e.body.get("message")
            if message:
                return message
        return None
```