Backend for Chatbot
Background

A new service shall be implemented providing API endpoints for the chatbot.
Approach

The chatbot server should provide the following API:
- `POST /chat` - creates a new chat entry and returns its ID
- `GET /chat/${chat_id}` - returns the chat message history with message contents
- `POST /chat/${chat_id}` - adds a new message to the chat; the message is passed as the request body
- `GET /chat/${chat_id}/message/${message_id}` - returns the message specified by ID
- `DELETE /chat/${chat_id}/message/${message_id}` - deletes the message specified by ID
- WebSocket endpoint `assistant` - accepts a request object with `chat_id` and `list<message_id> history` as input and streams the generated response. When the LLM answer has been fully returned to the client, the message handler should save the response to the chat message history and only then send the final `done` event (see the sketch after this list).
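A minimal sketch of how this surface could look, assuming FastAPI with an in-memory store; the route paths follow the list above, while the storage layout, ID scheme, and the `fake_llm_stream` helper are illustrative placeholders rather than part of the spec. Only a subset of the REST routes is shown, plus the `assistant` WebSocket handler demonstrating the stream, then save, then `done` ordering.

```python
# Sketch only: FastAPI app with an in-memory store; not the actual implementation.
import uuid
from fastapi import FastAPI, HTTPException, WebSocket

app = FastAPI()
chats: dict[str, dict] = {}  # chat_id -> {"title": str, "messages": {message_id: dict}}


@app.post("/chat")
def create_chat():
    # Creates a new chat entry and returns its ID.
    chat_id = str(uuid.uuid4())
    chats[chat_id] = {"title": "", "messages": {}}
    return {"id": chat_id}


@app.get("/chat/{chat_id}")
def get_history(chat_id: str):
    # Returns the chat message history with message contents.
    chat = chats.get(chat_id)
    if chat is None:
        raise HTTPException(status_code=404, detail="chat not found")
    return {"id": chat_id, "title": chat["title"], "messages": list(chat["messages"].values())}


async def fake_llm_stream(history: list[dict]):
    """Placeholder for the real LLM call; yields response chunks."""
    for chunk in ("Hello", ", ", "world"):
        yield chunk


@app.websocket("/assistant")
async def assistant(websocket: WebSocket):
    await websocket.accept()
    # Expected request shape: {"chat_id": ..., "history": [message_id, ...]}
    request = await websocket.receive_json()
    chat = chats[request["chat_id"]]
    history = [chat["messages"][mid] for mid in request["history"]]

    # Stream the generated answer chunk by chunk.
    parts = []
    async for chunk in fake_llm_stream(history):
        parts.append(chunk)
        await websocket.send_json({"event": "chunk", "content": chunk})

    # Only after the full answer has been sent: persist it, then emit the final "done" event.
    message_id = str(uuid.uuid4())
    chat["messages"][message_id] = {
        "id": message_id,
        "chat_id": request["chat_id"],
        "role": "assistant",
        "content": "".join(parts),
    }
    await websocket.send_json({"event": "done", "message_id": message_id})
```

Saving the assistant response before emitting `done` means that by the time the client receives `done`, the message is already retrievable through the REST history endpoints.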
Model (a dataclass sketch follows this list):

- chat - `id`, `title` (empty for now; shall be generated later)
- message - `id`, `chat_id`, `created_date`, `role`, `content`, `attributes` (dictionary)
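The two entities could be captured as plain dataclasses until the storage backend is settled; this is a sketch assuming the field list above, with illustrative defaults for `created_date` and `attributes`.

```python
# Sketch of the chat and message entities; defaults are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Chat:
    id: str
    title: str = ""  # empty for now; shall be generated later


@dataclass
class Message:
    id: str
    chat_id: str
    role: str        # e.g. "user" or "assistant"
    content: str
    created_date: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    attributes: dict = field(default_factory=dict)
```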
Further development steps:
- Replace SQLite with MongoDB to support requests from multiple processes/threads/users
- Offload LLM communication tasks to separate threads using Celery (a task sketch follows this list)
- Introduce a requests API to reduce the size of messages sent to the WebSocket endpoint
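For the Celery step, a hedged sketch of what the offloaded task might look like, assuming a Redis broker and result backend; `call_llm`, the broker URLs, and the task name are illustrative assumptions, not part of this issue.

```python
# Sketch only: offloading the LLM call to a Celery worker; broker/backend URLs and
# call_llm are placeholders for the real configuration and LLM client.
from celery import Celery

celery_app = Celery(
    "chatbot",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)


def call_llm(history: list[dict]) -> str:
    """Stand-in for the actual LLM integration."""
    return "generated answer"


@celery_app.task
def generate_reply(chat_id: str, history: list[dict]) -> dict:
    """Runs the LLM call outside the API process and returns the finished message."""
    content = call_llm(history)
    return {"chat_id": chat_id, "role": "assistant", "content": content}
```

The API handler would then enqueue work with `generate_reply.delay(chat_id, history)` and collect the result asynchronously instead of calling the LLM inside the request thread.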