jeremylatorre
I would also be interested in this, as I already have a Kendra index used as RAG for another chatbot app.
We'll probably need to work on this feature within the next few weeks. Do you have any leads on this feature? Parsing in the frontend/backend? Adding the content to the pre-prompt inside tags?
Maybe the integration of Bedrock Agents will be the answer to this question?
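For reference, here is a minimal sketch of the pre-prompt idea, assuming a plain Kendra Retrieve call on the backend and an ad-hoc `<documents>` tag convention; the index id, region, tag names, and prompt wording are all assumptions, not the project's actual implementation:

```python
import boto3

# Region and index id are placeholders for illustration.
kendra = boto3.client("kendra", region_name="eu-west-1")

def build_pre_prompt(query: str, index_id: str, top_k: int = 5) -> str:
    """Fetch passages from an existing Kendra index and wrap them in tags
    so they can be prepended to the chatbot prompt (hypothetical format)."""
    response = kendra.retrieve(IndexId=index_id, QueryText=query, PageSize=top_k)
    passages = [item["Content"] for item in response.get("ResultItems", [])]

    # Wrap the retrieved content in explicit tags so the model can tell
    # the injected context apart from the user question.
    documents = "\n".join(f"<document>{p}</document>" for p in passages)
    return (
        "Use only the following documents to answer.\n"
        f"<documents>\n{documents}\n</documents>"
    )
```

Whether this call lives in the frontend or the backend is still the open question above; the sketch assumes the backend.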
Similar to what I experienced: https://github.com/aws-samples/bedrock-claude-chat/issues/305
> > Similar to what I experienced #305
>
> Have you resolved this issue? Any technique or thought? Thank you.

No, not yet. But my thought was about the...
Lambda was the wrong lead; it is most likely related to the vector database search in /conversation/related-documents.
When I use my bot, it sometimes takes 17s before it starts rendering the content. The first step [Retrieve Knowledge] looks OK, but after that, the cursor stays at the beginning...
Could it be related to the instantiation of a Lambda function? The behavior only appears on the first inference of a new conversation.
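One way to check the cold-start hypothesis is to look for `Init Duration` in the function's CloudWatch `REPORT` lines, since Lambda only emits it on cold starts. A rough sketch (the log group name is an assumption, not the project's actual one):

```python
import boto3

logs = boto3.client("logs")

# Placeholder: point this at the actual API handler's log group.
LOG_GROUP = "/aws/lambda/bedrock-claude-chat-api"

# REPORT lines containing "Init Duration" indicate a cold start, so their
# presence around the first inference would confirm the theory.
events = logs.filter_log_events(
    logGroupName=LOG_GROUP,
    filterPattern='"Init Duration"',
    limit=20,
)
for event in events["events"]:
    print(event["message"].strip())
```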
OK, I have dug a little into the logs and found that the POST call to related-documents is very long by itself: Request POST /conversation/related-documents (15:15:32 - 15:15:44)...
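To confirm the latency sits in the endpoint itself rather than the client, the call can also be timed directly. A quick sketch, where the base URL, auth header, and payload shape are placeholders and not the real API contract:

```python
import time
import requests

# All of these values are hypothetical, for illustration only.
BASE_URL = "https://example.execute-api.eu-west-1.amazonaws.com"
HEADERS = {"Authorization": "Bearer <id-token>"}
payload = {"conversationId": "<conversation-id>", "message": "test question"}

start = time.perf_counter()
response = requests.post(
    f"{BASE_URL}/conversation/related-documents",
    json=payload,
    headers=HEADERS,
    timeout=60,
)
elapsed = time.perf_counter() - start
print(f"{response.status_code} in {elapsed:.1f}s")  # ~12s would match the log window above
```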
I agree for DynamoDB: message encryption should be a must-have. I'm a little concerned about performance issues from adding encryption in the vectorStore, though.
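For the DynamoDB side, here is a minimal sketch of what client-side encryption of the message body could look like with KMS; the key alias, table name, and item shape are assumptions, not the project's actual schema:

```python
import boto3

kms = boto3.client("kms")
table = boto3.resource("dynamodb").Table("ConversationTable")  # placeholder table name

KEY_ID = "alias/chat-messages"  # hypothetical KMS key alias

def put_encrypted_message(conversation_id: str, message_id: str, body: str) -> None:
    """Encrypt the message body with KMS before writing it to DynamoDB."""
    ciphertext = kms.encrypt(KeyId=KEY_ID, Plaintext=body.encode())["CiphertextBlob"]
    table.put_item(Item={
        "PK": conversation_id,
        "SK": message_id,
        "body": ciphertext,  # stored as binary, no plaintext at rest
    })

def get_decrypted_message(conversation_id: str, message_id: str) -> str:
    """Read the item back and decrypt the body."""
    item = table.get_item(Key={"PK": conversation_id, "SK": message_id})["Item"]
    plaintext = kms.decrypt(CiphertextBlob=item["body"].value)["Plaintext"]
    return plaintext.decode()
```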