LangServe 🦜️🏓
Chat LangChain returns a ChatPromptValue rather than a ChatPromptValueConcrete. Determine how to make sure we use ChatPromptValueConcrete everywhere; otherwise we'll end up with serialization issues.
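For context, a hedged sketch of the difference (assuming the `langchain_core.prompt_values` classes; the explicit conversion below is an illustration, not an existing LangServe helper):

```python
from langchain_core.prompt_values import ChatPromptValue, ChatPromptValueConcrete
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([("human", "{question}")])

# ChatPromptTemplate.invoke() returns a ChatPromptValue, whose `messages`
# field is typed loosely as a sequence of BaseMessage.
value = prompt.invoke({"question": "hello"})
assert isinstance(value, ChatPromptValue)

# ChatPromptValueConcrete types `messages` as a union of the concrete
# message classes, which is what pydantic-based (de)serialization needs.
concrete = ChatPromptValueConcrete(messages=value.messages)
```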
Can the stream interface exposed by LangServe provide a way to gracefully abort requests?
As the title says: our client calls the LangServe server-side code that I wrote. They need to frequently interrupt a given stream request and, after interruption, they hope...
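A hedged client-side sketch of one workaround (the `/chain` URL and input are placeholders): wrap the `astream` consumer in an asyncio task and cancel it, which ends the iteration and closes the underlying HTTP stream.

```python
import asyncio
from langserve import RemoteRunnable

remote = RemoteRunnable("http://localhost:8000/chain")  # placeholder endpoint

async def consume() -> None:
    # Iterating astream() keeps the HTTP connection open; leaving the loop
    # (or cancelling the task) closes it.
    async for chunk in remote.astream({"question": "tell me a long story"}):
        print(chunk, end="", flush=True)

async def main() -> None:
    task = asyncio.create_task(consume())
    await asyncio.sleep(2)   # let it stream for a bit...
    task.cancel()            # ...then abort the request from the client side
    try:
        await task
    except asyncio.CancelledError:
        print("\nstream aborted")

asyncio.run(main())
```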
Agents don't stream correctly in the playground: https://github.com/langchain-ai/langserve/blob/main/examples/agent/server.py. Raised here: https://github.com/langchain-ai/langserve/issues/314
Add a more informative error if `APIHandler` receives a non-runnable object.
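A hedged sketch of what such a check could look like (the helper name is illustrative, not the actual implementation):

```python
from langchain_core.runnables import Runnable

def _validate_runnable(runnable: object) -> Runnable:
    """Raise a descriptive error up front instead of failing later with an obscure one."""
    if not isinstance(runnable, Runnable):
        raise TypeError(
            f"Expected a Runnable, got {type(runnable).__name__}. "
            "Wrap plain functions in RunnableLambda, or pass a chain/LCEL object."
        )
    return runnable
```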
Feedback should follow upsert logic for a given run rather than creating new feedback every time.
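A hedged sketch of upsert-style feedback against the LangSmith client (assuming `langsmith.Client.list_feedback` / `create_feedback` / `update_feedback`; exact signatures may differ):

```python
from langsmith import Client

client = Client()

def upsert_feedback(run_id: str, key: str, score: float) -> None:
    # Look for existing feedback on this run with the same key and update it,
    # instead of appending a new feedback record on every submission.
    existing = [f for f in client.list_feedback(run_ids=[run_id]) if f.key == key]
    if existing:
        client.update_feedback(existing[0].id, score=score)
    else:
        client.create_feedback(run_id, key, score=score)
```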
When using `add_routes`, users sometimes pass non-runnable objects.
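A hedged sketch of the failure mode and the usual fix, i.e. wrapping a plain function in `RunnableLambda` before handing it to `add_routes` (the function and path are placeholders):

```python
from fastapi import FastAPI
from langchain_core.runnables import RunnableLambda
from langserve import add_routes

app = FastAPI()

def shout(text: str) -> str:
    return text.upper()

# add_routes(app, shout, path="/shout")               # plain function, not a Runnable -> fails
add_routes(app, RunnableLambda(shout), path="/shout")  # wrap it first
```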
I experimented with a use case in which I initialize an AgentExecutor with an `agent` chain that is a RemoteRunnable. i.e., the client side looks like this: ```python from langchain.agents...
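The snippet above is cut off; a hedged reconstruction of the kind of client-side setup being described (the URL, tool, and input are placeholders, and how AgentAction/AgentFinish serialize over the wire is exactly what this use case probes):

```python
from langchain.agents import AgentExecutor
from langchain_core.tools import tool
from langserve import RemoteRunnable

@tool
def word_length(word: str) -> int:
    """Return the number of characters in a word."""
    return len(word)

# The agent "brain" runs on a LangServe server; only the executor loop
# (tool calls, intermediate steps) runs locally on the client.
remote_agent = RemoteRunnable("http://localhost:8000/agent")  # placeholder URL
executor = AgentExecutor(agent=remote_agent, tools=[word_length])

print(executor.invoke({"input": "how many letters in 'langserve'?"}))
```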
Currently `langserve[client]` fails, complaining that fastapi is not installed. We need to add tests to catch this on CI; otherwise it's difficult to make sure that the dependencies actually work...
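A hedged sketch of a minimal CI smoke test for the client extra (a hypothetical test file, assuming it runs in an environment created with only `pip install "langserve[client]"`, i.e. without fastapi):

```python
# test_client_extra.py -- hypothetical smoke test for the client-only install.

def test_remote_runnable_importable_without_fastapi() -> None:
    import importlib.util

    assert importlib.util.find_spec("fastapi") is None, (
        "this test is only meaningful in an environment without fastapi installed"
    )
    # Importing the client surface must not pull in the server dependencies.
    from langserve import RemoteRunnable  # noqa: F401
```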