
Please add OpenRouter API

merlinxdyang opened this issue 1 year ago • 5 comments

Feature Request

The OpenRouter API is very easy for non-professional users to work with. Please seriously consider adding it to Langflow; it would be very helpful. Thank you!

Motivation

As an amateur in computer science who nevertheless needs AI technologies, the OpenRouter API gives me a very simple, easy-to-handle experience in many cases. I am learning Langflow now, and I feel that if the OpenRouter API could be included, it would be very helpful to people like me. Thank you.

Your Contribution

No response

merlinxdyang avatar Aug 05 '24 13:08 merlinxdyang

I think it will likely be added in the near future.

Have you tried the AI/ML component?

It doesn't do exactly what OpenRouter does, but it lets you select any model from a single component and API key, which makes it very easy to test different models.

[screenshot: AI/ML component]

I'm not sure exactly what you mean by OpenRouter offering "easy-to-handle experiences"; if you could, please provide more examples of why that is the case.

vasconceloscezar avatar Aug 06 '24 21:08 vasconceloscezar

Thank you so much for your reply. Maybe my wording was not accurate; forgive my broken English. I have tried the AI/ML API, but I'm afraid OpenRouter is much cheaper: it is a pre-paid service that covers all models, including the GPT-4o family. The AI/ML API subscription at $4.99 per week is somewhat out of my budget, since I don't use it every day of the month. By "easy-to-handle experiences" I mean that with the OpenRouter API I don't worry about how often or how much data I use: I deposit $20 and it may last two or three months, or even longer. Surely $4.99 per week (or $20 per month) for AI/ML is only a cup of coffee to most people, but not to me. Anyway, I deeply appreciate your reply and consideration. I can use the open-source models instead. Thank you again.

merlinxdyang avatar Aug 06 '24 22:08 merlinxdyang

It seems that the site has done a good job with categorization. However, the problem is that costs and logging may not be properly tracked by tracers like LangSmith. I will consider adding this if it gets integrated into LangChain in the future.

YamonBot avatar Aug 08 '24 05:08 YamonBot

Hey @merlinxdyang

According to their documentation, all you have to do is set openai_api_base to their URL and pass your OpenRouter API key.

So, you should be able to use the OpenAI component for this.
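In code terms, the override amounts to just two changed settings. Here is a minimal sketch of the configuration the OpenAI component ends up using (the helper name is mine, for illustration; the base URL is from OpenRouter's documentation, and the model name is an example in their provider/model format):

```python
def openrouter_client_config(api_key: str, model: str) -> dict:
    """Settings that point an OpenAI-compatible client at OpenRouter.

    Only two things change versus plain OpenAI: the base URL and the key.
    The model name uses OpenRouter's "provider/model" format.
    """
    return {
        "openai_api_base": "https://openrouter.ai/api/v1",
        "openai_api_key": api_key,  # an OpenRouter key, not an OpenAI one
        "model_name": model,        # e.g. "anthropic/claude-3.5-sonnet"
    }


# Example: the values you would enter into the OpenAI component's fields.
config = openrouter_client_config("sk-or-...", "openai/gpt-4o-mini")
```

Everything else (temperature, max tokens, and so on) can stay as it is, since OpenRouter mirrors the OpenAI chat API.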

ogabrielluiz avatar Aug 08 '24 20:08 ogabrielluiz

Thank you YamonBot and ogabrielluiz, I will try changing openai_api_base to the OpenRouter URL. Sorry, I am a total beginner with code and am still finding my feet.

merlinxdyang avatar Aug 08 '24 23:08 merlinxdyang

> Thank you YamonBot and ogabrielluiz, I will try changing openai_api_base to the OpenRouter URL. Sorry, I am a total beginner with code and am still finding my feet.

Where? Thank you. [screenshot]

bambanx avatar Aug 16 '24 21:08 bambanx

It now works with any of the OpenRouter models. I changed the name here: [screenshot]

bambanx avatar Aug 16 '24 22:08 bambanx

Open the advanced parameters and you'll find the openai_api_base

ogabrielluiz avatar Aug 16 '24 22:08 ogabrielluiz

Thank you so much for sharing these!

merlinxdyang avatar Aug 17 '24 05:08 merlinxdyang

@merlinxdyang

Do you need any assistance with this case? If not, please let us know if this issue can be closed.

carlosrcoelho avatar Aug 21 '24 23:08 carlosrcoelho

Hi, I also tried to configure the OpenRouter API using the OpenAI component.

I set the OpenRouter API key in the OpenAI API Key field and hardcoded the model name and base URL in the code section: [screenshot]. Unfortunately, I get this error: [screenshot]. It seems to me like it is either still trying to access the OpenAI API or ignoring the hardcoded model name.

Direct access to the OpenRouter API works like this: [screenshot]
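For reference, a direct call is just a POST against OpenRouter's OpenAI-compatible endpoint. A stdlib-only sketch (the function names are mine; the endpoint and payload shape follow OpenRouter's documented chat completions API):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"


def chat_once(api_key: str, model: str, prompt: str) -> str:
    """Send a single chat completion request directly to OpenRouter."""
    body = json.dumps({
        "model": model,  # e.g. "openai/gpt-4o-mini"
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        OPENROUTER_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return extract_reply(json.loads(resp.read()))


def extract_reply(response: dict) -> str:
    """Pull the assistant text out of an OpenAI-style completion response."""
    return response["choices"][0]["message"]["content"]
```

If this works but the component does not, the component is most likely still sending requests to the default OpenAI base URL.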

Do you have any tips? I'm running Langflow via Docker using the latest image.

Thank you!

loyzious avatar Oct 11 '24 18:10 loyzious

I figured it out; it now works for me like this: [screenshot]

alois-nejo avatar Oct 29 '24 14:10 alois-nejo

Please add official support.

jtoy avatar Nov 01 '24 13:11 jtoy

OpenRouter is an excellent service that allows you to use many LLMs through a single API. Its pay-per-token billing model is particularly attractive. While there seems to be another service called the AI/ML API, it appears to only offer weekly subscription plans.

Based on the previous discussion, I was able to successfully test making the built-in OpenAI LLM node function as an OpenRouter LLM node. To further develop this, I'm attaching below the actual working code that was rewritten for OpenRouter by Claude 3.5 Sonnet. Since I had Claude 3.5 Sonnet write it in a way that updates the model list based on OpenRouter's GET models response, I believe it can utilize all LLM models available on OpenRouter.

To use this code, add a custom component node from "New Custom Component" in the bottom left of Langflow, open the code editor with <> code, and paste the code.

As I'm not an engineer, I cannot speak to the code quality. Since OpenRouter is convenient, I hope someone will properly rewrite it as a built-in node and submit a pull request for implementation.

import operator
from functools import reduce
import requests
from typing import List, Optional
import logging

from langchain_openai import ChatOpenAI
from pydantic.v1 import SecretStr

from langflow.base.models.model import LCModelComponent
from langflow.field_typing import LanguageModel
from langflow.field_typing.range_spec import RangeSpec
from langflow.inputs import BoolInput, DictInput, DropdownInput, FloatInput, IntInput, SecretStrInput, StrInput
from langflow.inputs.inputs import HandleInput

logger = logging.getLogger(__name__)

def fetch_openrouter_models(api_key: Optional[str] = None) -> List[str]:
    """Get a list of models from the Openrouter API"""
    try:
        headers = {
            'accept': 'application/json'
        }
        if api_key:
            headers['Authorization'] = f'Bearer {api_key}'

        response = requests.get(
            'https://openrouter.ai/api/v1/models',
            headers=headers,
            timeout=10
        )
        response.raise_for_status()
        
        models_data = response.json().get('data', [])
        model_ids = [model['id'] for model in models_data]
        
        return sorted(model_ids)
    except Exception as e:
        logger.warning(f"Failed to fetch Openrouter models: {str(e)}")
        return [
            "anthropic/claude-3.5-haiku",
            "anthropic/claude-3.5-sonnet",
            "qwen/qwen-2.5-coder-32b-instruct",
            "qwen/qwq-32b-preview",
            "qwen/qwen-2.5-72b-instruct",
            "google/gemini-flash-1.5",
            "google/gemini-pro-1.5",
            "openai/o1",
            "openai/o1-preview",
            "openai/o1-mini",
            "openai/gpt-4o",
            "openai/gpt-4o-mini"
        ]

class OpenrouterModelComponent(LCModelComponent):
    display_name = "OpenRouter"
    description = "Generates text using OpenRouter's various LLM models."
    icon = ""
    name = "OpenrouterModel"

    # Use the default model list as the initial value
    default_models = fetch_openrouter_models()

    inputs = [
        *LCModelComponent._base_inputs,
        IntInput(
            name="max_tokens",
            display_name="Max Tokens",
            advanced=True,
            info="The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
            range_spec=RangeSpec(min=0, max=128000),
        ),
        DictInput(
            name="model_kwargs",
            display_name="Model Kwargs",
            advanced=True,
            info="Additional keyword arguments to pass to the model.",
        ),
        BoolInput(
            name="json_mode",
            display_name="JSON Mode",
            advanced=True,
            info="If True, it will output JSON regardless of passing a schema.",
        ),
        DictInput(
            name="output_schema",
            is_list=True,
            display_name="Schema",
            advanced=True,
            info="The schema for the Output of the model. "
            "You must pass the word JSON in the prompt. "
            "If left blank, JSON mode will be disabled.",
        ),
        DropdownInput(
            name="model_name",
            display_name="Model Name",
            advanced=False,
            options=default_models,
            value=default_models[0] if default_models else None,
        ),
        SecretStrInput(
            name="api_key",
            display_name="Openrouter API Key",
            info="The Openrouter API Key",
            advanced=False,
            value="OPENROUTER_API_KEY",
        ),
        StrInput(
            name="http_referer",
            display_name="HTTP Referer",
            info="Your site URL for Openrouter rankings",
            advanced=True,
        ),
        StrInput(
            name="x_title",
            display_name="X-Title",
            info="Your app name for Openrouter rankings",
            advanced=True,
        ),
        FloatInput(
            name="temperature",
            display_name="Temperature",
            value=0.1
        ),
        IntInput(
            name="seed",
            display_name="Seed",
            info="The seed controls the reproducibility of the job.",
            advanced=True,
            value=1,
        ),
        HandleInput(
            name="output_parser",
            display_name="Output Parser",
            info="The parser to use to parse the output of the model",
            advanced=True,
            input_types=["OutputParser"],
        ),
    ]

    def build_model(self) -> LanguageModel:
        output_schema_dict: dict[str, str] = reduce(operator.ior, self.output_schema or {}, {})
        api_key = SecretStr(self.api_key).get_secret_value() if self.api_key else None

        # If an API key is set, update the model list.
        if api_key:
            try:
                updated_models = fetch_openrouter_models(api_key)
                # Update the dropdown choices
                for input_field in self.inputs:
                    if input_field.name == "model_name":
                        input_field.options = updated_models
                        if not self.model_name or self.model_name not in updated_models:
                            self.model_name = updated_models[0]
                        break
            except Exception as e:
                logger.warning(f"Failed to update model list: {str(e)}")

        # Setting additional headers
        extra_headers = {}
        if self.http_referer:
            extra_headers["HTTP-Referer"] = self.http_referer
        if self.x_title:
            extra_headers["X-Title"] = self.x_title

        # Model Setup
        output = ChatOpenAI(
            max_tokens=self.max_tokens or None,
            model_kwargs=self.model_kwargs or {},
            model=self.model_name,
            base_url="https://openrouter.ai/api/v1",
            api_key=api_key,
            temperature=self.temperature if self.temperature is not None else 0.1,
            seed=self.seed,
            default_headers=extra_headers,
        )

        # JSON Mode Configuration
        if output_schema_dict or self.json_mode:
            if output_schema_dict:
                output = output.with_structured_output(schema=output_schema_dict, method="json_mode")
            else:
                output = output.bind(response_format={"type": "json_object"})

        return output

    def _get_exception_message(self, e: Exception):
        try:
            from openai import BadRequestError
        except ImportError:
            return None
        if isinstance(e, BadRequestError):
            # e.body may be None or missing "message"; guard before .get
            message = (e.body or {}).get("message")
            if message:
                return message
        return None

WarriorMama777 avatar Dec 28 '24 08:12 WarriorMama777