
Add Compatibility for Open-Source Models like DeepSeek, Llama, and Others

Open Rajagopal143 opened this issue 10 months ago • 5 comments

Feature Request

Provide a compatible API layer that allows requests to be routed to OpenAI models or to user-hosted models (e.g., DeepSeek via Ollama, Llama via vLLM):

  • Enable format compatibility so that requests to OpenAI (/v1/chat/completions) can be structured the same way when calling external models.
  • Offer custom model endpoints, allowing developers to define third-party endpoints in the OpenAI SDK without complex rewrites.

Motivation

  • Introduce a custom_models config in the OpenAI SDK, where developers can define their own LLM endpoints.
  • Add an API passthrough mode that reformats responses from external models to match OpenAI’s response schema.
  • Support WebSocket or streaming responses from external models so that real-time interactions remain smooth.
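To make the proposal concrete, here is a minimal sketch of what such a custom_models config and routing layer might look like. Everything here is hypothetical: no `custom_models` key or `resolve_endpoint` helper exists in the real OpenAI SDK, and the URLs and model names are illustrative placeholders.

```python
# Hypothetical sketch of the proposed `custom_models` config.
# None of these keys exist in the real OpenAI SDK; they only
# illustrate the routing idea described in this feature request.
custom_models = {
    "deepseek-r1": {
        "base_url": "http://localhost:11434/v1",  # e.g. DeepSeek via Ollama
        "passthrough": True,   # reformat responses to OpenAI's schema
    },
    "llama-3-70b": {
        "base_url": "http://localhost:8000/v1",   # e.g. Llama via vLLM
        "stream": True,        # keep streaming for real-time interactions
    },
}

def resolve_endpoint(model: str) -> str:
    """Route a request to the configured endpoint for `model`,
    falling back to OpenAI's default base URL."""
    entry = custom_models.get(model)
    return entry["base_url"] if entry else "https://api.openai.com/v1"
```

Requests for a registered model would be dispatched to its endpoint, while unregistered models fall through to OpenAI unchanged.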

Your Contribution

No response

Rajagopal143 avatar Feb 14 '25 11:02 Rajagopal143

+1

sliontc avatar Feb 17 '25 10:02 sliontc

+1

keyangzhen avatar Feb 26 '25 15:02 keyangzhen

+1

ZYMCao avatar Mar 18 '25 06:03 ZYMCao

+1

wicon2021 avatar Apr 17 '25 09:04 wicon2021

+1

tilakvijay avatar May 27 '25 12:05 tilakvijay

Hi @Rajagopal143,

Thank you for this comprehensive feature request! Good news: Langflow already supports many of the capabilities you've requested, and I can walk you through the options currently available.

Current OpenAI-Compatible Model Support

Langflow currently provides robust support for open-source models through several pathways:

1. OpenAI Component with Custom Base URLs ✅

The OpenAI component already supports custom endpoints through the OpenAI API Base field:

  • Location: Advanced settings in OpenAI component
  • Field: OpenAI API Base; its help text reads: "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1. You can change this to use other APIs like JinaChat, LocalAI and Prem."
  • Support: Works with any OpenAI-compatible API including vLLM, LocalAI, JinaChat, and Prem
  • Custom Models: The Model Name field is a combobox rather than a fixed dropdown, so you can type custom model names
  • JSON Mode: Built-in JSON mode support for structured outputs
  • Streaming: Full WebSocket/streaming support maintained
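The reason a single custom base URL field is enough: the request body for an OpenAI-compatible server is identical regardless of host, so only the base URL changes. A minimal sketch (the URLs and model names below are placeholders, not specific Langflow defaults):

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build an OpenAI-style /chat/completions request.

    The JSON body is the same whether base_url points at
    api.openai.com, a vLLM server, LocalAI, or any other
    OpenAI-compatible endpoint; only the URL prefix differs.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    return url, body

# Same payload shape, different hosts (placeholder values):
openai_url, _ = build_chat_request("https://api.openai.com/v1", "gpt-4o-mini", "hi")
local_url, _ = build_chat_request("http://localhost:8000/v1", "my-llama", "hi")
```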

2. Dedicated Open-Source Components ✅

Langflow includes purpose-built components for popular open-source models:

DeepSeek Component:

  • Native DeepSeek API integration with configurable base URL
  • JSON mode enabled
  • Dynamic model discovery from /v1/models endpoint

Ollama Component:

  • Full Ollama integration with local model support
  • Automatic model detection and comprehensive parameter control
  • Custom base URL configuration
  • Advanced parameters (mirostat, context window, GPU settings, etc.)

LM Studio Component:

  • Native LM Studio support with automatic model discovery
  • Real-time model list updates from local endpoints
  • OpenAI-compatible API integration

3. Extensive Provider Ecosystem ✅

Langflow includes 15+ model provider components:

  • Major Providers: Groq, Mistral, Anthropic, HuggingFace
  • Cloud Services: Google Gemini, Azure OpenAI, VertexAI
  • Specialized: Cohere, XAI (Grok), Nvidia, Perplexity
  • Enterprise: SambaNova, Novita
  • All support custom base URLs where applicable

How to Use Custom Models Right Now

For vLLM/Custom OpenAI-Compatible Endpoints:

  1. Use the OpenAI component
  2. Set OpenAI API Base to your vLLM endpoint (e.g., http://localhost:8000/v1)
  3. Type your custom model name in the Model Name field (it is a combobox and accepts any value)
  4. Provide your API key or leave empty if not required
  5. Enable JSON Mode if needed for structured outputs
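For reference, the steps above map onto a plain HTTP request like the following stdlib-only sketch. The base URL, model name, and environment variable are placeholder assumptions (a local vLLM server often needs no API key):

```python
import json
import os
import urllib.request

def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Prepare a chat completion request for a vLLM (or any
    OpenAI-compatible) server. base_url and model mirror the
    component fields described in the steps above."""
    headers = {"Content-Type": "application/json"}
    api_key = os.environ.get("OPENAI_API_KEY")  # optional for local servers
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body, headers=headers, method="POST",
    )

req = chat_request("http://localhost:8000/v1", "my-model", "Hello!")
# To actually send it (requires a running server):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```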

For Ollama/Local Models:

  1. Use the Ollama component
  2. Set Base URL to your Ollama endpoint
  3. Component automatically discovers available models
  4. Full streaming and tool calling support
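The automatic model discovery in step 3 works because Ollama exposes a local listing endpoint, /api/tags, that returns the models you have pulled. A small stdlib sketch (the default port 11434 is Ollama's standard, and the example model names are illustrative):

```python
import json
import urllib.request

def ollama_tags_url(base_url: str) -> str:
    """URL of Ollama's model listing endpoint (/api/tags),
    which model discovery queries."""
    return base_url.rstrip("/") + "/api/tags"

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Fetch the names of locally pulled models.
    Requires a running Ollama instance."""
    with urllib.request.urlopen(ollama_tags_url(base_url)) as resp:
        return [m["name"] for m in json.load(resp)["models"]]

# Example (needs Ollama running locally):
#   print(list_ollama_models())  # e.g. ["llama3:latest", "deepseek-r1:7b"]
```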

For Other Providers:

Most provider components include custom base URL configuration and combobox model selection.

Community Interest

Your request has significant community support with 5 +1 votes from:

  • @sliontc, @keyangzhen, @ZYMCao, @wicon2021, @tilakvijay

This suggests many users need better documentation about existing capabilities.

Related Work

  • Issue #6096 (CLOSED): "Support Custom OpenAI-compatible Models" (implemented)
  • Issue #4664 (OPEN): "OpenAI-Like API support", which covers exposing Langflow itself as an OpenAI-compatible endpoint

Please let me know if you'd like help setting up any specific model configuration!

Best regards,

Vigtu avatar Aug 26 '25 19:08 Vigtu

If you are using the OpenAI Completions API endpoint, try the code in this comment: https://github.com/langflow-ai/langflow/issues/3452#issuecomment-3284491439

mizuikk avatar Sep 12 '25 09:09 mizuikk