aws-genai-llm-chatbot
Feature request: Expose consumable APIs
Exposing consumable APIs would allow the chatbot functionality to be accessed programmatically by other products, making integration with existing applications much easier.
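For illustration, here is a minimal sketch of the kind of contract another product might consume. The endpoint path and field names are purely hypothetical and not something the project exposes today:

```typescript
// Hypothetical request/response contract for a consumable chat API.
// Endpoint path and field names are assumptions for illustration only.
interface ChatRequest {
  prompt: string;      // the user's question
  sessionId?: string;  // optional: continue an existing conversation
  modelId?: string;    // optional: which model/workspace to target
}

interface ChatResponse {
  sessionId: string;   // conversation identifier returned by the backend
  answer: string;      // the model's reply
}

// Example call from another product (web, mobile, or desktop backend).
async function askChatbot(baseUrl: string, token: string, prompt: string): Promise<ChatResponse> {
  const payload: ChatRequest = { prompt };
  const res = await fetch(`${baseUrl}/api/v1/chat`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`, // only authenticated users may call the API
    },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Chat API error: ${res.status}`);
  return (await res.json()) as ChatResponse;
}
```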
@massi-ang @bigadsoleiman are there any plans for such a feature in the future?
Could you please share:
- Required functions to expose initially (e.g., just query and response)?
- Primary use case (e.g., integrating the chatbot into your own UI) -- anything specific you could share?
- Performance or scalability requirements?
@ystoneman The initial requirement is to ask questions and get responses. Authentication would also be useful, so that only authenticated users can call the APIs. The primary use case is integrating the chatbot into multiple products (e.g. a web app with its own UI, a mobile app, a desktop app). The APIs should handle moderate load (e.g. 100 users calling them simultaneously), and response times should be comparable to what the web app delivers today.
Streaming responses would also be desirable.
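To make these requirements concrete, here is a rough sketch of how a client in another product could call such an API with a bearer token and render a streamed answer as it arrives. The endpoint path, token source, and chunk format are assumptions for illustration, not the project's actual API:

```typescript
// Hypothetical streaming consumption of the chat API.
// Endpoint path, token source, and chunk format are assumptions.
async function streamChatbotAnswer(
  baseUrl: string,
  token: string,
  prompt: string,
  onChunk: (text: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/api/v1/chat/stream`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`, // reject unauthenticated callers
    },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok || !res.body) throw new Error(`Chat API error: ${res.status}`);

  // Read the response body incrementally so the UI can show tokens as they arrive.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}

// Usage: stream the answer into any UI (web, mobile wrapper, desktop shell), e.g.
// streamChatbotAnswer("https://example.execute-api.amazonaws.com", idToken,
//   "What is our refund policy?", (chunk) => process.stdout.write(chunk));
```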
This issue is stale because it has been open for 60 days with no activity.
This issue was closed because it has been inactive for 30 days since being marked as stale.