I want to send chat messages to a FastAPI API.
I created a FastAPI backend that includes the LLM and the logic. How do I configure this in the ai-chatbot source? Thanks.
One way is to make your FastAPI endpoint an OpenAI-compatible endpoint and then use it as a provider:

const provider = createOpenAICompatible({
  name: 'some-model-name',
  apiKey: 'hello',
  baseURL: 'http://localhost:8000/api/v1/',
});

const result = streamText({ model: provider('provider-name'), ... });

streamText internally calls your OpenAI-compatible endpoint at http://localhost:8000/api/v1/ and the results are streamed back.
To make your API OpenAI-compatible, see this implementation: https://github.com/adithya04dev/csql-agent/tree/main/backend, and for the chatbot implementation: https://github.com/adithya04dev/csql-agent/tree/main/chatbot
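For orientation, here is a minimal standalone sketch (not copied from the linked repo) of what an OpenAI-compatible streaming endpoint can look like in FastAPI. The ChatCompletionRequest model and the generate_reply generator are placeholders for your own LLM / LangGraph logic:

```python
import json
import time
import uuid

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()


class ChatCompletionRequest(BaseModel):
    model: str
    messages: list[dict]
    stream: bool = True


def generate_reply(messages: list[dict]):
    # Placeholder: yield tokens from your own LLM / LangGraph agent here.
    for token in ["Hello", " from", " FastAPI"]:
        yield token


# Matches baseURL 'http://localhost:8000/api/v1/' + '/chat/completions'.
@app.post("/api/v1/chat/completions")
async def chat_completions(request: ChatCompletionRequest):
    completion_id = f"chatcmpl-{uuid.uuid4().hex}"

    def sse():
        # Stream chunks in the OpenAI chat.completion.chunk format.
        for token in generate_reply(request.messages):
            chunk = {
                "id": completion_id,
                "object": "chat.completion.chunk",
                "created": int(time.time()),
                "model": request.model,
                "choices": [
                    {"index": 0, "delta": {"content": token}, "finish_reason": None}
                ],
            }
            yield f"data: {json.dumps(chunk)}\n\n"
        # Final chunk with a finish_reason, then the [DONE] sentinel.
        done_chunk = {
            "id": completion_id,
            "object": "chat.completion.chunk",
            "created": int(time.time()),
            "model": request.model,
            "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
        }
        yield f"data: {json.dumps(done_chunk)}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(sse(), media_type="text/event-stream")
```

With baseURL set to http://localhost:8000/api/v1/ as above, the provider POSTs to /api/v1/chat/completions and streamText consumes this SSE stream.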
I want to add more fields to coreMessages in order to send them to FastAPI. The FastAPI side (backend + LLM + LangGraph) needs to receive JSON like:

{ "input": "What is Apple?", "session_id": "test_session", "user_id": "test_user", "query": [], "action_type": "chat", "form": "" }
One approach: just send the extra fields as part of the user message, wrapped in XML tags, and parse them on the server side (the backend API) to construct the JSON and then call the LangGraph agent.
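For illustration, a sketch of the server-side parsing, under the assumption that the frontend appends a <metadata> block (a hypothetical tag name, not from this thread) containing the extra fields as JSON to the user message content:

```python
import json
import re

# Assumed wire format: user text followed by <metadata>{...}</metadata>
# carrying the extra fields as a JSON object.
METADATA_RE = re.compile(r"<metadata>(.*?)</metadata>", re.DOTALL)


def build_agent_input(user_content: str) -> dict:
    """Split the tagged user message into plain text plus the extra fields."""
    match = METADATA_RE.search(user_content)
    extra = json.loads(match.group(1)) if match else {}
    text = METADATA_RE.sub("", user_content).strip()
    return {
        "input": text,
        "session_id": extra.get("session_id", ""),
        "user_id": extra.get("user_id", ""),
        "query": extra.get("query", []),
        "action_type": extra.get("action_type", "chat"),
        "form": extra.get("form", ""),
    }


# Example of what the backend would receive and produce.
content = 'What is Apple?<metadata>{"session_id": "test_session", "user_id": "test_user"}</metadata>'
print(build_agent_input(content))
```

The frontend side then only has to concatenate the same <metadata>...</metadata> string onto the user message before it is converted to core messages.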
Do you have a sample? Or can I add it to content, as below?

{
  "model": "provider-name",
  "messages": [
    {
      "role": "user",
      "content": "{\"input\": \"I want to book a flight\", \"session_id\": \"test_session\", \"user_id\": \"test_user\", \"query\": [], \"action_type\": \"chat\", \"form\": \"\"}",
      "name": "name1"
    }
  ],
  "temperature": 5.67,
  "stream": true,
  "max_tokens": 1
}
Sorry for the long delay! You can do it by changing the ChatRequest (Pydantic model) in https://github.com/adithya04dev/csql-agent/blob/main/backend/app/api/chat.py to add the necessary fields, so that the backend can accept them. You can then implement the backend functionality in the function below:

@router.post("/v1/chat/completions")
async def chat_endpoint(request: ChatRequest):
    # Parse the extra fields from the request and call the LangGraph agent here.
    ...
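As a sketch of what that could look like (field names taken from the JSON earlier in the thread; types and defaults are assumptions), the extended Pydantic model might be:

```python
from typing import Optional

from pydantic import BaseModel, Field


class Message(BaseModel):
    role: str
    content: str
    name: Optional[str] = None


class ChatRequest(BaseModel):
    # Standard OpenAI-compatible fields, as in the sample request above.
    model: str
    messages: list[Message]
    temperature: Optional[float] = None
    stream: bool = True
    max_tokens: Optional[int] = None
    # Extra fields for the LangGraph backend, from the JSON earlier in the thread.
    session_id: str = ""
    user_id: str = ""
    query: list = Field(default_factory=list)
    action_type: str = "chat"
    form: str = ""
```

Fields the frontend cannot send directly can still be carried inside the message content via the XML-tag approach above.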
Thanks, but the frontend (ai-chatbot) cannot add more fields, right? I checked /api/chat/route.ts and it converts the messages to core messages, so the extra fields are lost. How do I solve this?