deep-chat
Response type should support multiple messages
I am using a custom chat endpoint where the request is handled by a group of agents, and therefore the response comes as a list of messages. It looks like DeepChat is only prepared to handle a single message in the Response: https://deepchat.dev/docs/connect/#Response
It would be great if this assumption could be relaxed.
I would also like this, but in case you haven't seen it, there's a workaround using the websocket handler.
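For reference, the websocket-handler workaround boils down to calling the handler's response callback once per message. A minimal sketch (the handler/signals shape follows deep-chat's custom handler docs, but treat the exact names as assumptions; emitMessages is a hypothetical helper):

```javascript
// Sketch of the websocket-handler workaround. The handler/signals API
// shape is taken from deep-chat's custom handler docs, but treat the
// exact names as assumptions; emitMessages is a hypothetical helper.
function emitMessages(messages, signals) {
  // The response callback can be invoked once per message,
  // which sidesteps the single-Response limitation.
  messages.forEach((message) => signals.onResponse(message));
}

// Browser usage (not executed here):
// chatElementRef.request = {
//   handler: async (body, signals) => {
//     const reply = await fetch('/chat', {
//       method: 'POST',
//       headers: { 'Content-Type': 'application/json' },
//       body: JSON.stringify(body),
//     }).then((r) => r.json());
//     emitMessages(reply, signals); // one onResponse call per agent message
//   },
// };
```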
@mchill Interesting, but this is going to be tricky for me as my endpoint is served by ReactPy in FastAPI, so I actually have no easy way to create JavaScript objects. This is my UI endpoint in pure Python:
@component
def Agent():
    return DeepChat({
        "request": {"url": "http://localhost:8000/chat"}
    })
Hey folks, apologies for the late replies as I am currently very occupied with my job.
I wanted to offer an alternative: you could intercept the incoming responses using the responseInterceptor, parse the content, and call a method named _addMessage multiple times instead. This method is not part of our official documentation and will be added to the main API in our next release; however, it is already available in the current release. It accepts a Response object as an argument and adds a message to the chat. E.g.
deepChatRef._addMessage({text: 'hello'});
Let me know if this helps. Thanks!
@OvidijusParsiunas There is a gotcha with responseInterceptor: it is still expected to return a valid Response object, which ends up repeating one of the messages. Returning null from responseInterceptor is treated as an error.
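To make the gotcha concrete, here is a stripped-down sketch (mock chat element, plain Node, no deep-chat involved) of what happens when the interceptor adds every message and then returns one of them:

```javascript
// Mock of the chat element: collect everything that would reach the UI.
const rendered = [];
const mockChat = { _addMessage: (m) => rendered.push(m) };

// Naive interceptor: add every message, then return one of them, because
// returning null is treated as an error by deep-chat.
function naiveInterceptor(response) {
  response.forEach((m) => mockChat._addMessage(m));
  return response[0];
}

const reply = [
  { text: 'Cathy, tell me a joke.', role: 'joe' },
  { text: 'Because they make up everything!', role: 'cathy' },
];
// deep-chat renders the interceptor's return value, so the first message
// ends up in the chat twice.
mockChat._addMessage(naiveInterceptor(reply));
console.log(rendered.length); // 3 entries for a 2-message reply
```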
<html>
<head>
  <script src="https://unpkg.com/[email protected]/dist/deepChat.bundle.js" type="module" crossorigin></script>
</head>
<body>
  <deep-chat id="chat" demo="true" request='{"url": "/chat"}' stream='{"simulation": 6}'></deep-chat>
  <script>
    window.onload = function() {
      let chatElementRef = document.getElementById('chat');
      chatElementRef.responseInterceptor = (response) => {
        console.log(response); // printed above
        response.forEach(m => chatElementRef._addMessage(m));
        return response[0];
      };
    };
  </script>
</body>
</html>
I am now calling _addMessage for all messages except the last one and returning the last one from the responseInterceptor.
This works:
[
  {
    "text": "Cathy, tell me a joke.",
    "role": "joe"
  },
  {
    "text": "Sure, here's one for you:\n\nWhy don't scientists trust atoms?\n\nBecause they make up everything!",
    "role": "cathy"
  }
]
window.onload = function() {
  let chatElementRef = document.getElementById('chat');
  chatElementRef.responseInterceptor = (response) => {
    console.log(JSON.stringify(response, null, 2));
    response.slice(0, response.length - 1).forEach(m => chatElementRef._addMessage(m));
    return response[response.length - 1];
  };
};
Now there is just a minor glitch: the role from the last message is not respected.
Hi @nileshtrivedi.
It is strange that the returned role is not respected.
Could you perhaps make the last message returned by the interceptor define an explicit role?
E.g. return {role: 'your-role-name', text: response[response.length - 1].text}
@OvidijusParsiunas That does not fix it.
Here is the whole app in a single file. You can run it as OPENAI_API_KEY=api_key fastapi dev app.py after pip install autogen fastapi fastapi-cli:
import os
import json
import tempfile
from typing import Annotated, Literal
from pathlib import Path

from autogen import ConversableAgent, register_function
from autogen.coding import LocalCommandLineCodeExecutor
from fastapi import FastAPI, Body, Request, WebSocket
from fastapi.responses import HTMLResponse
from fastapi.staticfiles import StaticFiles
# from pydantic import BaseModel, List
# from prefect import flow

openai_api_key = os.environ.get("OPENAI_API_KEY")

cathy = ConversableAgent(
    "cathy",
    system_message="Your name is Cathy and you are a part of a duo of comedians.",
    llm_config={"config_list": [{"model": "gpt-4", "temperature": 0.9, "api_key": openai_api_key}]},
    human_input_mode="NEVER",  # Never ask for human input.
)

joe = ConversableAgent(
    "joe",
    system_message="Your name is Joe and you are a part of a duo of comedians.",
    llm_config={"config_list": [{"model": "gpt-4", "temperature": 0.7, "api_key": openai_api_key}]},
    human_input_mode="NEVER",  # Never ask for human input.
)

app = FastAPI()

@app.get("/", response_class=HTMLResponse)
async def root():
    homepage_html = """
    <html>
    <head>
      <script src="https://unpkg.com/[email protected]/dist/deepChat.bundle.js" type="module" crossorigin></script>
    </head>
    <body>
      <deep-chat id="chat" demo="true" request='{"url": "/chat"}' stream='{"simulation": 6}' names="true"></deep-chat>
      <script>
        window.onload = function() {
          let chatElementRef = document.getElementById('chat');
          chatElementRef.responseInterceptor = (response) => {
            console.log(JSON.stringify(response, null, 2));
            response.slice(0, response.length - 1).forEach(m => chatElementRef._addMessage(m));
            console.log(response[response.length - 1]);
            return {role: "cathy", text: response[response.length - 1].text};
          };
        };
      </script>
    </body>
    </html>
    """
    return homepage_html

@app.post("/chat")
async def chat(request: Request):
    data = await request.json()
    user_msg = data["messages"][0]["text"]
    result = joe.initiate_chat(cathy, message="Cathy, tell me a joke.", max_turns=1)
    return [{"text": m["content"], "role": ("joe" if m["role"] == "assistant" else "cathy")} for m in result.chat_history]
You can try this; it will correctly identify the speaker. This is the documentation for that parameter: https://deepchat.dev/docs/messages/#names. In addition, you need to delete stream='{"simulation": 6}': it is only debugging code used to simulate an AI streaming response, and it does not consider this use case, so the role parameter is ignored and fixed to the default 'ai'.
<deep-chat id="chat" demo="true" names='{"user":{"text":"cathy"}}'></deep-chat>
<script>
  window.onload = function() {
    let chatElementRef = document.getElementById('chat');
    chatElementRef.responseInterceptor = (response) => {
      response = [
        {
          "text": "Cathy, tell me a joke.",
          "role": "joe"
        },
        {
          "text": "Sure, here's one for you:\n\nWhy don't scientists trust atoms?\n\nBecause they make up everything!",
          "role": "user"
        }
      ];
      console.log(JSON.stringify(response, null, 2));
      response.slice(0, response.length - 1).forEach(m => chatElementRef._addMessage(m));
      console.log(response[response.length - 1]);
      return response[response.length - 1];
    };
  };
</script>
@buzhou9 Yes, deleting stream='{"simulation": 6}' indeed fixes the names. Thanks again to both you and @OvidijusParsiunas! 👍