langserve
⚠️ Recommending LangGraph Platform for new projects
We have recently announced LangGraph Platform, a significantly enhanced solution for deploying agentic applications at scale.
We recommend using LangGraph Platform rather than LangServe for new projects.
LangGraph Platform incorporates key design patterns and capabilities essential for production-level deployment of large language model (LLM) applications.
In contrast to LangServe, LangGraph Platform provides comprehensive, out-of-the-box support for persistence, memory, double-texting handling, human-in-the-loop workflows, cron job scheduling, webhooks, high-load management, advanced streaming, support for long-running tasks, background task processing, and much more.
The LangGraph Platform ecosystem includes the following components:
- LangGraph Server: Provides an Assistants API for LLM applications (graphs) built with LangGraph. Available in both Python and JavaScript/TypeScript.
- LangGraph Studio: A specialized IDE for real-time visualization, debugging, and interaction via a graphical interface. Available as a web application or macOS desktop app, it's a substantial improvement over LangServe's playground.
- SDK: Enables programmatic interaction with the server, available in Python and JavaScript/TypeScript.
- RemoteGraph: Allows interaction with a remote graph as if it were running locally, serving as LangGraph's equivalent to LangServe's RemoteRunnable. Available in both Python and JavaScript/TypeScript.
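For a rough sense of the client-side difference, here is a hedged sketch of both; the URLs, the `/joke` path, the `"agent"` graph name, and the exact `RemoteGraph` constructor arguments are assumptions rather than definitive usage:

```python
# Sketch only: calling a deployed chain/graph from Python.
# The URLs, the "/joke" path, and the "agent" graph name are placeholders;
# the RemoteGraph constructor arguments may differ by langgraph version.

# LangServe: call a runnable served with add_routes().
from langserve import RemoteRunnable

chain = RemoteRunnable("http://localhost:8000/joke/")
print(chain.invoke({"topic": "cats"}))

# LangGraph: call a graph served by LangGraph Server.
from langgraph.pregel.remote import RemoteGraph

remote = RemoteGraph("agent", url="http://localhost:2024")
print(remote.invoke({"messages": [{"role": "user", "content": "hi"}]}))
```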
If you're interested in migrating your LangServe code to LangGraph Platform, please see the LangGraph Platform Migration Guide for more information.
We will continue to accept bug fixes for LangServe from the community; however, we will not be accepting new feature contributions.
My understanding is that LangServe and LangChain are open-source, while LangGraph Server is a commercial product with closed-source dependencies and limitations on free usage (according to the deployment options page). Is that correct?
Yes, it's a commercial offering. The free tier allows up to 1 million node executions, and there is a self-hosted deployment option (https://github.com/langchain-ai/langgraph/blob/main/docs/docs/concepts/deployment_options.md#self-hosted-lite).
@prise6, @madoe001, @timvw, @aaronvenezia, @laithalsaadoon, @heidar, @teeppp, @cycleuser0x1 would appreciate any feedback on the :-1: reactions.
Is there something that you'd like LangGraph to do that it doesn't but LangServe does?
This news made me sad. My chain doesn't need LangGraph's memory or checkpoints. LangGraph requires PostgreSQL and Redis, which makes the infrastructure cumbersome. I want to host my chain with simple infrastructure and add it to my existing FastAPI routes (langserve.add_routes). I run LangServe with: AWS Lambda + Web Adapter, and AWS App Runner.
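For reference, the pattern being described looks roughly like this (a minimal sketch; the `/my-chain` path and the stand-in `RunnableLambda` are placeholders for a real LCEL chain):

```python
# Minimal sketch: mounting a LangServe chain on an existing FastAPI app.
from fastapi import FastAPI
from langchain_core.runnables import RunnableLambda
from langserve import add_routes

app = FastAPI()

# Stand-in chain; replace with your actual LCEL chain.
chain = RunnableLambda(lambda text: text.upper())

# Exposes /my-chain/invoke, /my-chain/stream, /my-chain/batch, etc.
add_routes(app, chain, path="/my-chain")
```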
Updated: In my use case, the Model Context Protocol (MCP) might be the destination: https://github.com/tadata-org/fastapi_mcp https://github.com/modelcontextprotocol/python-sdk
TBH the news is disappointing because all the new serving features are behind a LangSmith key 🤷
Feels like a rug pull to me. You built a tool for deployment, then abandoned it for a commercial one, leaving developers who invested time building on your tool effectively captive. I'm fine with commercial products, but (just my opinion) the new offering shouldn't have limitations on self-hosted deployment. That would make the transition a lot easier to swallow. I understand that's what happens sometimes; it's just a little disappointing.
That said, you guys build a lot of great tools and I very much appreciate all that you provide in the AI framework space. Thank you for all you do.
I'm confused by the deployment documentation. It seems to focus entirely on options through the LangGraph Platform, which is a commercial product with free tier limitations.
I am building an AI agent application with LangGraph and don't know the history from before LangGraph Platform.
How can I deploy without LangGraph Platform?
@crawlregister I am also facing a similar issue. Did you find a solution?
Have a look at https://github.com/JoshuaC215/agent-service-toolkit/tree/2e6c6228f3e51311434922953dd2202fcabc5f0c
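If it helps, here is a minimal sketch (not taken from the toolkit above) of self-hosting a compiled LangGraph graph behind a plain FastAPI endpoint, using an in-memory checkpointer instead of Postgres/Redis; the `/chat` route, `State` schema, and echo-style node are placeholders for illustration only:

```python
# Sketch: self-hosting a compiled LangGraph graph with plain FastAPI.
# The /chat route, State schema, and answer_node are placeholders.
from typing import TypedDict

from fastapi import FastAPI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END


class State(TypedDict):
    question: str
    answer: str


def answer_node(state: State) -> dict:
    # Replace with a real LLM call.
    return {"answer": f"You asked: {state['question']}"}


builder = StateGraph(State)
builder.add_node("answer", answer_node)
builder.add_edge(START, "answer")
builder.add_edge("answer", END)

# MemorySaver keeps checkpoints in process memory; swap in a Postgres
# checkpointer if you need durable state.
graph = builder.compile(checkpointer=MemorySaver())

app = FastAPI()


@app.post("/chat")
async def chat(payload: dict) -> dict:
    # thread_id keys the checkpoint so follow-up calls share state.
    config = {"configurable": {"thread_id": payload.get("thread_id", "default")}}
    return await graph.ainvoke({"question": payload["question"]}, config=config)
```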
I've been working on an open-source alternative to LangGraph Platform that addresses these exact concerns.
Agent Protocol Server: https://github.com/ibbybuilds/agent-protocol-server
What it provides:
- ✅ Self-hosted deployment (no commercial dependencies)
- ✅ PostgreSQL persistence
- ✅ Agent Protocol compliance
- ✅ FastAPI-based (fits into existing infrastructure)
- ✅ Zero vendor lock-in
- ✅ Custom authentication support
Why I built it:
- LangGraph Platform's self-hosted tier is limited (no custom auth)
- SaaS pricing is expensive for production use
- Community needs a truly open-source deployment solution
Current status: MVP ready, working on production hardening. Looking for contributors to help shape the roadmap.
This gives developers an alternative that doesn't require LangGraph Platform while maintaining compatibility with LangGraph workflows.