langserve
Can't get chat playground_type to work
I like the look of the newer chat playground_type, but I can't seem to get it to work. When I try to send a chat message, the UI pops up the following error and the request (exported as curl below) looks malformed.
The error:
Expected content-type to be text/event-stream, Actual: application/json Check your backend logs for errors.
The malformed-looking request:
curl 'http://0.0.0.0:8010/llm/Claude3Sonnet/stream_log' \
-H 'Accept-Language: en-GB,en-US;q=0.9,en;q=0.8' \
-H 'Connection: keep-alive' \
-H 'Content-Type: application/json' \
-H 'Origin: http://0.0.0.0:8080' \
-H 'Referer: http://0.0.0.0:8080/llm/chat/playground' \
-H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36' \
-H 'accept: text/event-stream' \
--data-raw '{"input":{"undefined":[{"type":"human","content":"hello, who are you?"}]},"config":{}}' \
--insecure
Minimal app to reproduce the issue:
from fastapi import FastAPI
from langserve.server import add_routes
from langchain_openai import ChatOpenAI
chain = ChatOpenAI(model="gpt-3.5-turbo")
app = FastAPI()
add_routes(app, chain, path="/chat", playground_type="chat")
Full project for this app: langserve-test.zip
I'm using Python 3.10 on macOS, with the latest stable release of each dependency.
You can use this as a workaround for now: https://github.com/langchain-ai/langserve/blob/main/examples/chat_playground/server.py
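For reference, that example gives the chain an explicit input schema so the chat playground knows where the message list lives. A minimal sketch along those lines, assuming an OpenAI chat model; the InputChat class and extract_messages helper here are illustrative rather than copied from that file:
from typing import List, Union
from fastapi import FastAPI
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_core.runnables import RunnableLambda
from langchain_openai import ChatOpenAI
from langserve.server import add_routes

class InputChat(BaseModel):
    """Input schema exposing a named messages field to the chat playground."""
    messages: List[Union[HumanMessage, AIMessage, SystemMessage]] = Field(
        ..., description="The chat messages representing the conversation."
    )

def extract_messages(request):
    # The playground posts {"messages": [...]}; accept either the raw dict
    # or a validated InputChat object.
    return request["messages"] if isinstance(request, dict) else request.messages

model = ChatOpenAI(model="gpt-3.5-turbo")
chain = RunnableLambda(extract_messages) | model | StrOutputParser()

app = FastAPI()
add_routes(
    app,
    chain.with_types(input_type=InputChat),
    path="/chat",
    playground_type="chat",
)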
We'll need to investigate, but we're likely doing some gymnastics with the input and output schema and in the process forgot to support the simplest case of just wrapping a chat model.
I'm having the same issue with the popup:
Expected content-type to be text/event-stream, Actual: application/json Check your backend logs for errors.
I don't quite understand what workaround was supposed to be used in the script example given. My chain is a LangGraph workflow that has a string as input and output.
I'm having the same issue as well; I'm trying it with a local Ollama model.
After looking at the FastAPI Swagger docs and playing with some curl requests, I realized what my specific issue was. The input type expected by the stream_log endpoint is different when you switch from the normal playground mode to the chat mode: the client sends a dict when the request is triggered. After changing the input type to a dict (instead of a string) on my chain, it works normally.
add_routes(
    app,
    custom_chain.with_types(input_type=dict, output_type=str),
    # custom_chain.with_types(input_type=str, output_type=str),  # input type for the default playground
    path="/local_assistant",
    playground_type="chat",
)
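If the underlying chain still expects a plain string (as with the string-in/string-out LangGraph workflow mentioned above), one option is to adapt the dict payload before invoking it. A hypothetical sketch, assuming the payload shape shown in the curl request earlier; extract_last_user_message is not a langserve helper:
from langchain_core.runnables import RunnableLambda

def extract_last_user_message(request: dict) -> str:
    # The chat playground posts a dict whose single value is the message list;
    # take the content of the most recent message.
    messages = next(iter(request.values()))
    last = messages[-1]
    return last.content if hasattr(last, "content") else last["content"]

adapted_chain = RunnableLambda(extract_last_user_message) | custom_chain

add_routes(
    app,
    adapted_chain.with_types(input_type=dict, output_type=str),
    path="/local_assistant",
    playground_type="chat",
)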