databricks-cli
[FEATURE] Add option to `databricks configure` command to define where to save the `.databrickscfg` file
Hi Team!
When I run the command:
databricks configure --host <my-host> --profile <my-profile> --token
This saves the `.databrickscfg` file to `%USERPROFILE%\.databrickscfg` on Windows, as documented here.
However, I want to change the location for where this file is saved. I know that the docs state here that you can change it using the command:
setx DATABRICKS_CONFIG_FILE "<path-to-file>" /M
However, this does not work for me because:
- I do not have admin access on this PC to run this command, and
- I don't want to run a separate command to change this location; it's cleaner if it can be contained within the `databricks configure` command.
Moreover, controlling where this file is saved matters because it contains the token or password values for our Databricks workspace; for security reasons, we need to control its location to prevent credential leakage.
Therefore, I'd like the ability to specify, in the `databricks configure` command, which directory to save the file to. This could be an absolute path or a path relative to the current working directory. The option can have a default value, so if the user does not specify a location, the `.databrickscfg` file is saved to the `%USERPROFILE%` location as today.
I'd suggest the option look something like this:
-l, --location-for-config-file Location where to save the `.databrickscfg`
file. The default location is `~/.databrickscfg`
on Unix, Linux, or macOS, or
`%USERPROFILE%\.databrickscfg` on Windows.
So when you run this command:
databricks configure --help
You'll get something like this:
Usage: databricks configure [OPTIONS]
Configures host, authentication, and jobs-api version for the CLI.
Options:
-o, --oauth [Experimental] Use OAuth for authorization
-s, --scope [accounts|clusters|mlflow|offline_access|sql|unity-catalog]
[Experimental] Specify which OAuth scopes to
request access for
-t, --token
-f, --token-file TEXT Instead of reading the token from stdin,
read the token from a file provided by a
secret store.
--host TEXT Host to connect to.
--aad-token
--insecure DO NOT verify SSL Certificates
--jobs-api-version [2.0|2.1] API version to use for jobs. [default: 2.0]
--debug Debug Mode. Shows full stack trace on error.
--profile TEXT CLI connection profile to use. The default
profile is "DEFAULT".
-l, --location-for-config-file Location where to save the `.databrickscfg`
file. The default location is `~/.databrickscfg`
on Unix, Linux, or macOS, or
`%USERPROFILE%\.databrickscfg` on Windows.
-h, --help Show this message and exit.
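Since the help output above is Click-style, here is a minimal sketch of how such an option might be wired up. The flag name, the `config_path` variable, and the `configure` stub are illustrative assumptions, not the actual databricks-cli source:

```python
import os

import click

# Assumed default: ~/.databrickscfg, which expands to
# %USERPROFILE%\.databrickscfg on Windows.
DEFAULT_CONFIG_PATH = os.path.join(os.path.expanduser("~"), ".databrickscfg")


@click.command()
@click.option(
    "-l",
    "--location-for-config-file",
    "config_path",
    default=DEFAULT_CONFIG_PATH,
    help="Location where to save the `.databrickscfg` file.",
)
def configure(config_path):
    # Accept both absolute paths and paths relative to the
    # current working directory, as the request describes.
    path = os.path.abspath(os.path.expanduser(config_path))
    click.echo(path)
```

In this sketch, a relative path like `-l ./secrets/.databrickscfg` is resolved against the directory the command is run from, while the default keeps today's behavior.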
Thank you.
Thanks for posting the issue.
Regarding your concern: AFAIK it is possible to configure per-user environment variables on Windows (under User Accounts). You need admin access only to set system-wide environment variables. That would also solve your second concern of having to run a separate command to configure the environment; if you configure it as a per-user environment variable then new terminal sessions inherit it.
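For illustration, the `DATABRICKS_CONFIG_FILE` override described above typically resolves along these lines; `resolve_config_path` is a hypothetical helper, not the CLI's actual code:

```python
import os


def resolve_config_path():
    # DATABRICKS_CONFIG_FILE, when set, overrides the default location.
    override = os.environ.get("DATABRICKS_CONFIG_FILE")
    if override:
        return os.path.abspath(os.path.expanduser(override))
    # Otherwise fall back to ~/.databrickscfg, which expands to
    # %USERPROFILE%\.databrickscfg on Windows.
    return os.path.join(os.path.expanduser("~"), ".databrickscfg")
```

Because the lookup happens per-process, the variable can also be set just for one shell session (no admin rights needed) rather than persisted system-wide.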
For my curiosity, can you expand on your motivation for storing it in a non-default location?
We tend to keep parameters that don't change within a session in environment variables. This rule of thumb matches the configuration file perfectly, so I'm not inclined to add a command-line flag for it.