azure-search-openai-demo
Is this design more expensive than directly using an Azure OpenAI console deployment with an AI Search index?
Please provide us with the following information:
In this code project, I found that it retrieves the file content via AI Search and then passes the question plus the content of the three AI Search result documents to Azure OpenAI (roughly the pattern sketched below).
When using the console, I see the deployment can already be integrated with AI Search. Can I call that API directly and have Azure OpenAI handle the integration with AI Search itself? Then I wouldn't need to pass document content to Azure OpenAI, which would reduce token spend.
Is this possible, or is my understanding correct?
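For illustration, a minimal sketch of that retrieve-then-prompt flow, assuming the `azure-search-documents` and `openai` Python packages; the endpoints, keys, index name, `content` field, and deployment name are placeholders rather than this repo's actual values:

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

question = "What does the benefits plan cover?"

# 1. Retrieve the top 3 matching documents from Azure AI Search.
search_client = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="<index-name>",                      # placeholder
    credential=AzureKeyCredential("<search-key>"),
)
hits = search_client.search(search_text=question, top=3)
sources = "\n\n".join(hit["content"] for hit in hits)  # 'content' field is hypothetical

# 2. Pass the question plus the retrieved passages to Azure OpenAI.
openai_client = AzureOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com",
    api_key="<aoai-key>",
    api_version="2024-02-01",
)
response = openai_client.chat.completions.create(
    model="<chat-deployment>",                      # deployment name, not model family
    messages=[
        {"role": "system", "content": "Answer only from the provided sources."},
        {"role": "user", "content": f"Sources:\n{sources}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```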
This issue is for a: (mark with an x)
- [ ] bug report -> please search issues before submitting
- [ ] feature request
- [x] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)
Minimal steps to reproduce
Any log messages given by the failure
Expected/desired behavior
OS and Version?
Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)
azd version?
Run `azd version` and copy paste here.
Versions
Mention any other details that might be useful
Thanks! We'll be in touch soon.
The Azure OpenAI models support a dataSources parameter for connecting directly to an Azure AI Search index, and that's what's used by Azure OpenAI On Your Data in the studio. If you'd like code that does that, try this repo: https://github.com/microsoft/sample-app-aoai-chatGPT
We don't yet have that API integrated here, as I believe it might be incompatible with GPT-4 Vision; we're investigating.
You would still pay for the tokens consumed, however, as far as I understand.
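For anyone looking for a starting point, here is a rough sketch of what calling the chat completions API with that data source attached can look like from the `openai` Python SDK. The exact schema varies by API version (older preview versions used `dataSources` / `AzureCognitiveSearch` and a separate extensions endpoint), and the endpoints, keys, index name, and deployment name below are placeholders:

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com",
    api_key="<aoai-key>",
    api_version="2024-02-01",  # a version that supports "On Your Data"; adjust as needed
)

response = client.chat.completions.create(
    model="<chat-deployment>",
    messages=[{"role": "user", "content": "What does the benefits plan cover?"}],
    # extra_body forwards fields the SDK doesn't model natively; Azure OpenAI
    # reads data_sources and performs the Azure AI Search retrieval itself.
    extra_body={
        "data_sources": [
            {
                "type": "azure_search",
                "parameters": {
                    "endpoint": "https://<search-service>.search.windows.net",
                    "index_name": "<index-name>",
                    "authentication": {"type": "api_key", "key": "<search-key>"},
                },
            }
        ]
    },
)
print(response.choices[0].message.content)
```

Note that the retrieved passages are still injected into the prompt on the service side, so, as mentioned above, you still pay for those tokens.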
Thanks.