Devis Lucato
hi @GryBsh, the solution supports Azure Form Recognizer, now known as Azure AI Document Intelligence.
@GryBsh sorry about the misunderstanding. What I meant to say is that KM has integrated Azure Form Recognizer as an optional OCR solution; however, the integration is used only for...
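For reference, here's a minimal sketch of how the OCR option can be plugged in when building the memory instance. The builder method and config names follow the repo's naming pattern at the time of writing and may differ in your version; endpoint and key values are placeholders:

```csharp
using Microsoft.KernelMemory;

// Sketch only: enabling Azure AI Document Intelligence (formerly Form Recognizer)
// as the optional OCR engine used during ingestion.
var memory = new KernelMemoryBuilder()
    .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
    .WithAzureAIDocIntel(new AzureAIDocIntelConfig
    {
        Auth = AzureAIDocIntelConfig.AuthTypes.APIKey,
        Endpoint = "https://<your-resource>.cognitiveservices.azure.com/",
        APIKey = Environment.GetEnvironmentVariable("AZURE_DOC_INTEL_KEY")
    })
    .Build<MemoryServerless>();
```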
hi @roldengarm, did you try the `SearchAsync` method? It works similarly to `AskAsync`, returning only records, without using LLMs.
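For context, a rough sketch of the difference (serverless setup assumed; the query strings, OpenAI key and `limit` value are placeholders):

```csharp
using Microsoft.KernelMemory;

var memory = new KernelMemoryBuilder()
    .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
    .Build<MemoryServerless>();

// AskAsync: retrieves relevant memories AND calls the LLM to compose an answer
var answer = await memory.AskAsync("What did we decide about the Q3 roadmap?");
Console.WriteLine(answer.Result);

// SearchAsync: returns only the matching records, no LLM call involved
SearchResult results = await memory.SearchAsync("Q3 roadmap", limit: 5);
foreach (Citation citation in results.Results)
{
    Console.WriteLine($"{citation.SourceName}: {citation.Partitions.Count} matching partitions");
}
```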
The MS SQL Server connector is now available in the repo, thanks to @kbeaugrand
here's another take on long term and short term memory: https://github.com/microsoft/kernel-memory/tree/main/examples/302-dotnet-sk-km-chat

* short term: the chat history acts as short term memory, so it's maintained only while the chat is...
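To make the distinction concrete, here's a minimal sketch (not the exact code from example 302; `IKernelMemory` and the Semantic Kernel chat service are assumed to be already configured): the `ChatHistory` instance is the short term memory, while KM search provides the long term facts.

```csharp
using System.Threading.Tasks;
using Microsoft.KernelMemory;
using Microsoft.SemanticKernel.ChatCompletion;

public static class ChatWithMemory
{
    public static async Task<string> AnswerAsync(
        IKernelMemory memory,        // long term memory: persisted, survives the chat session
        IChatCompletionService chat,
        ChatHistory history,         // short term memory: lives only for this session
        string userMessage)
    {
        // Ground the conversation with facts retrieved from long term memory
        SearchResult facts = await memory.SearchAsync(userMessage, limit: 3);
        foreach (Citation citation in facts.Results)
        {
            foreach (Citation.Partition partition in citation.Partitions)
            {
                history.AddSystemMessage($"Fact from {citation.SourceName}: {partition.Text}");
            }
        }

        // Short term memory: append the new turn and ask the model
        history.AddUserMessage(userMessage);
        var reply = await chat.GetChatMessageContentAsync(history);
        history.AddAssistantMessage(reply.Content ?? string.Empty);
        return reply.Content ?? string.Empty;
    }
}
```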
Please feel free to use the poll at https://github.com/microsoft/kernel-memory/discussions/532 to vote for this feature
Thanks for including the logs. As you can see, the error message includes the reason:

> Azure OpenAI API version 2023-12-01-preview have exceeded call rate limit of your current OpenAI...
When using the service with the underlying queues, the ingestion should continue and retry. I haven't tested this specific scenario though. If the ingestion stops, then it's a bug. On the...
Update: @alkampfergit is working on embedding batch generation, which will reduce the number of requests, see for example PRs https://github.com/microsoft/kernel-memory/pull/526 and https://github.com/microsoft/kernel-memory/pull/531
> errors about token limits might not be relevant, but consider also that there's a max size for each chunk of text. Each model can handle a different max amount....
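If the chunks are too large for the model, the partitioning settings can be tightened when building the memory. A rough sketch below; the option names reflect the KM builder at the time of writing and may vary by version, and the token numbers are placeholders to adjust for your model:

```csharp
using Microsoft.KernelMemory;
using Microsoft.KernelMemory.Configuration;

var memory = new KernelMemoryBuilder()
    .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
    .WithCustomTextPartitioningOptions(new TextPartitioningOptions
    {
        // Keep each chunk well below the embedding model's max input tokens
        MaxTokensPerParagraph = 1000,
        MaxTokensPerLine = 300,
        OverlappingTokens = 100
    })
    .Build<MemoryServerless>();
```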