[ISSUE] Env Vars are not ignored when passing credentials directly
Description
I had to set DATABRICKS_HOST and DATABRICKS_TOKEN so that I could use managed Databricks MLflow from an external environment. The problem is that my Databricks WorkspaceClient has to use a different service principal, so I passed its credentials directly as azure_client_secret etc.
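A minimal sketch of the setup (the azure_* keyword arguments and all placeholder values are assumptions about my environment, not verified against a particular SDK version):

import os
from databricks.sdk import WorkspaceClient

# MLflow reaches the managed tracking server through these env vars.
os.environ["DATABRICKS_HOST"] = "https://adb-*********.azuredatabricks.net/"
os.environ["DATABRICKS_TOKEN"] = "<mlflow-pat>"

# The WorkspaceClient should use a different service principal, passed directly,
# but the env vars above are still picked up by its config.
workspace_client = WorkspaceClient(
    azure_client_id="<sp-client-id>",
    azure_client_secret="<sp-client-secret>",
    azure_tenant_id="<tenant-id>",
)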
Reproduction
Not required.
Expected behavior
Ignore DATABRICKS_HOST and DATABRICKS_TOKEN if all required credentials are already passed directly.
Is it a regression?
Afaik, it never worked.
My workaround for a similar situation (in this case a PAT with the DATABRICKS_CLIENT_ID and DATABRICKS_CLIENT_SECRET env vars set):
from databricks.sdk import WorkspaceClient

# Force PAT auth; the client credentials from the env vars still end up in the config.
workspace_client = WorkspaceClient(token=token, auth_type="pat")
config = workspace_client.config
# Clear the values picked up from DATABRICKS_CLIENT_ID / DATABRICKS_CLIENT_SECRET.
config.client_id = None
config.client_secret = None
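The same pattern should in principle apply to the service-principal case from the description, i.e. clearing the token that leaked in from DATABRICKS_TOKEN (untested sketch; the azure_* arguments, the "azure-client-secret" auth_type value and the config.token attribute name are assumptions):

workspace_client = WorkspaceClient(
    azure_client_id="<sp-client-id>",
    azure_client_secret="<sp-client-secret>",
    azure_tenant_id="<tenant-id>",
    auth_type="azure-client-secret",
)
workspace_client.config.token = None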
The error I had in the PAT case is below. Adding auth_type silenced it, but the client still attempted to use the client_id and client_secret, so I had to alter the default config created from the env vars.
ValueError: validate: more than one authorization method configured: oauth and pat. Config: host=https://adb-*********.azuredatabricks.net/, token=***, client_id=******, client_secret=***, warehouse_id=*****. Env: DATABRICKS_HOST, DATABRICKS_CLIENT_ID, DATABRICKS_CLIENT_SECRET, DATABRICKS_WAREHOUSE_ID
It feels like the config object should be auth_type-aware and not be populated from env vars that do not match the requested auth type, or the code that actually makes the request should check and react to the auth_type rather than to the presence of unrelated config parameters.
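As a rough illustration (a purely hypothetical helper, not actual SDK code), something along these lines could drop env-derived credentials that do not belong to the requested auth type:

# Hypothetical mapping from auth type to the credential fields it actually uses.
AUTH_FIELDS = {
    "pat": {"token"},
    "oauth-m2m": {"client_id", "client_secret"},
    "azure-client-secret": {"azure_client_id", "azure_client_secret", "azure_tenant_id"},
}

def drop_unrelated_credentials(config, auth_type):
    # Clear every credential field that belongs to a different auth type, so values
    # picked up from env vars cannot trigger the "more than one authorization
    # method configured" validation error.
    keep = AUTH_FIELDS.get(auth_type, set())
    for fields in AUTH_FIELDS.values():
        for field in fields - keep:
            setattr(config, field, None)
    return config

With something like that built in, the manual workaround above would reduce to drop_unrelated_credentials(workspace_client.config, "pat").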