[Feature Request]: Add Azure CLI credentials as an option
Do you need to file an issue?
- [x] I have searched the existing issues and this feature is not already filed.
- [ ] My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
- [x] I believe this is a legitimate feature request, not just a question. If this is a question, please use the Discussions area.
Is your feature request related to a problem? Please describe.
My team and I use this library in a production ETL environment, where we authenticate with managed identity. For local development, however, we would like to be able to use Azure CLI credentials rather than an API key, since we do not want to enable API keys on our Azure resources. Is supporting this form of authentication on your roadmap?
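For illustration, here is a minimal sketch of what Azure CLI credential auth against Azure OpenAI looks like with the azure-identity and openai SDKs, outside of graphrag. This is not graphrag's implementation; the endpoint, API version, and deployment name are placeholders:

```python
# Minimal sketch (not graphrag's code): authenticate to Azure OpenAI with the
# credentials cached by `az login`, instead of an API key.
from azure.identity import AzureCliCredential, get_bearer_token_provider
from openai import AzureOpenAI

# AzureCliCredential reuses the token from `az login`; no key required.
token_provider = get_bearer_token_provider(
    AzureCliCredential(),
    "https://cognitiveservices.azure.com/.default",  # Azure OpenAI token scope
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_version="2024-02-15-preview",  # placeholder
    azure_ad_token_provider=token_provider,
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # placeholder deployment name
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```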
Describe the solution you'd like
No response
Additional context
No response
If you remove the `api_key` from your settings.yaml and set `auth_type: azure_managed_identity`, graphrag will use managed identity. For local development, one option with managed identity is to run `az login` first (either with your own identity or a specified identity; ensure it has the proper RBAC role assignment to use your AOAI resource), and then you can call the graphrag library locally.
This will work if you're using the graphrag CLI directly from the terminal or if you have additional code that calls into the graphrag API layer.
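For reference, this works locally because the managed-identity auth path typically resolves to `DefaultAzureCredential`, which tries a chain of credential sources and, on a dev machine with no managed identity endpoint, falls back to the Azure CLI token from `az login`. A minimal pre-flight check, assuming the azure-identity package and the standard Azure OpenAI token scope:

```python
# Sketch: verify the local credential chain before running graphrag.
# On a dev box without a managed identity endpoint, DefaultAzureCredential
# falls through to AzureCliCredential, reusing the `az login` session.
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default")
print(token.expires_on)  # getting a token proves local auth works
```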
I did `az login` before running `graphrag index --root ./ragtest`, but I still get:
❌ LLM configuration error detected. Exiting...
Error code: 403
Any idea what the problem would be? Thanks
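(A guess, not a confirmed diagnosis: a 403 with Entra ID auth usually means a token was issued but the signed-in identity lacks an RBAC role such as Cognitive Services OpenAI User on the Azure OpenAI resource, as the earlier comment notes. You can inspect assignments with `az role assignment list --assignee <your-user-or-sp> --scope <aoai-resource-id>`; the assignee and scope values here are placeholders.)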