LangServe 🦜️🏓
Based on https://python.langchain.com/docs/modules/agents/how_to/max_time_limit we can set a timeout for agents, and a message is returned to the user when it triggers; this works in LangChain. However, when using this feature with LangServe...
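As background on what that timeout behavior looks like, here is a minimal stdlib-only sketch of the pattern (the `slow_agent` coroutine and the fallback message are illustrative stand-ins, not LangChain or LangServe APIs):

```python
import asyncio

async def slow_agent() -> str:
    # Stand-in for a long-running agent call.
    await asyncio.sleep(10)
    return "full answer"

async def run_with_timeout(timeout: float) -> str:
    # Instead of raising, return a fallback message when the agent
    # exceeds its time budget, mirroring the early-stopping idea.
    try:
        return await asyncio.wait_for(slow_agent(), timeout=timeout)
    except asyncio.TimeoutError:
        return "Agent stopped due to iteration limit or time limit."

print(asyncio.run(run_with_timeout(0.1)))
# prints the fallback message, since slow_agent takes ~10 s
```

The issue above is that this fallback path works when the agent runs locally but behaves differently once the chain is served.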
Users reasonably expect that when an agent is configurable, the configuration is properly propagated through the AgentExecutor. However, this is not the case at the moment. Bug...
- Right now I have to add my auth on LangServe through middleware.
- This works in prod, so that's great!
- In dev, it makes it so I have...
Add support to playground for auth
ConversationBufferMemory is useful in conversational agents, as in the code below:

```python
from langchain.agents import AgentType, initialize_agent
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
chain = initialize_agent(
    tools,
    llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    memory=memory,
)
```

However, when integrated with langserve, it may...
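A common concern when serving a chain with a single in-process memory object is that all users would share one buffer. One workaround is to key conversation history by a session id; here is a minimal stdlib-only sketch of that idea (the store and helper names are hypothetical, not langserve APIs):

```python
from collections import defaultdict

# Hypothetical per-session history store: keyed by session_id so that
# concurrent users of a served chain do not share one global buffer.
_histories: dict[str, list[tuple[str, str]]] = defaultdict(list)

def get_history(session_id: str) -> list[tuple[str, str]]:
    # Each session sees only its own (user, ai) turns.
    return _histories[session_id]

def record_turn(session_id: str, user_msg: str, ai_msg: str) -> None:
    _histories[session_id].append((user_msg, ai_msg))

record_turn("alice", "hi", "hello!")
record_turn("bob", "ping", "pong")
print(get_history("alice"))  # [('hi', 'hello!')]
```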
Testing [rag-chroma-private](https://github.com/langchain-ai/langchain/tree/master/templates/rag-chroma-private). For:

```python
from langserve.client import RemoteRunnable

rag_app = RemoteRunnable("http://0.0.0.0:8001/rag_chroma_private/")
```

this returns a generator:

```python
rag_app.stream("How does agent memory work?")
```

and this returns the answer...
LangServe can stream output using the `/stream` endpoint. However, I want to send an event and then wait for human feedback, for example wait for a parameter value or a confirmation. It...
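The general pause-for-feedback pattern can be sketched with an async generator that awaits a future which a second request later resolves. This is a stdlib-only illustration of the idea, not a LangServe feature (`stream_with_pause` and the `feedback` registry are hypothetical names):

```python
import asyncio

# Hypothetical registry mapping a run id to a pending human decision.
feedback: dict[str, asyncio.Future] = {}

async def stream_with_pause(run_id: str):
    # Emit events, then pause until a human submits feedback for run_id.
    yield "partial result"
    fut = asyncio.get_running_loop().create_future()
    feedback[run_id] = fut
    yield "awaiting human confirmation"
    answer = await fut  # suspends only this stream, not the server
    yield f"resumed with: {answer}"

async def main():
    out = []
    gen = stream_with_pause("run-1")
    out.append(await gen.__anext__())
    out.append(await gen.__anext__())
    # In a real server this would arrive via a second HTTP endpoint.
    feedback["run-1"].set_result("approved")
    out.append(await gen.__anext__())
    return out

print(asyncio.run(main()))
# ['partial result', 'awaiting human confirmation', 'resumed with: approved']
```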
Hi, team-langchain, I have an agent that uses memory, user authentication, and function calling. I'd like to migrate it to langserve in production but couldn't find anything as complex...
Hello, I'm unable to get an SSE streaming endpoint working correctly with an Azure-hosted OpenAI model (via the `AzureChatOpenAI` class). I'm using a simple LCEL chain: `chain = promptTemplate...