Andrew Nester

79 comments by Andrew Nester

Closing as no response, feel free to reopen if the issue persists

As you pointed out, it's indeed required to pass the name and not the ID of the credential. When no arguments are passed to `databricks storage-credentials get`, it prompts for available...

For now, renaming the clusters or providing the value directly to `--configure-cluster` is the only option

Duplicate of https://github.com/databricks/cli/issues/1456

There were multiple improvements to CLI commands, including better error messages and a warning on incorrect JSON payloads. Now this command works more intuitively and provides better guidance if something...

@BenPhegan Hi Ben! Sorry for the late response, and thanks for the provided examples. If I see it correctly, binding to all interfaces on :8020 instead of localhost:8020 seems to help the...

By default, DABs excludes files and folders from syncing based on the .gitignore file if you're using Git. If you're not using Git, or don't want to include certain files in...
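
In case it helps, a minimal sketch of an explicit `sync` section in databricks.yml, assuming hypothetical paths; patterns use gitignore-style syntax:

```
# Sketch only; the paths below are placeholders.
sync:
  exclude:
    - local-notes/**     # keep local-only files out of the upload
  include:
    - dist/**/*.whl      # re-include build output that .gitignore would otherwise exclude
```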

@mike-smith-bb DABs already supports building and automatically uploading JARs, so the configuration can look somewhat like this:
```
artifacts:
  my_java_project:
    path: ./path/to/project
    build: "sbt package"
    type: jar
    files:
      - ...
```

@mike-smith-bb yes, indeed. Then just using the `sync` include section should work, does it for you?
```
sync:
  include:
    - target/scala-2.12/**/*.jar
```
Paths can be defined in gitignore-like syntax so...

Since version 0.224.0, DABs supports uploading JARs to UC Volumes; you can find an example here: https://github.com/databricks/bundle-examples/blob/main/knowledge_base/spark_jar_task/databricks.yml You can omit the whole `artifacts` section if you don't want the JAR to be...
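
As a rough sketch of the relevant part of such a config (the volume path below is a placeholder; the linked example is the authoritative reference):

```
# Sketch only: catalog/schema/volume names are hypothetical.
bundle:
  name: spark_jar_task

workspace:
  # Since CLI 0.224.0 the artifact path can point to a UC Volume,
  # so built JARs are uploaded there instead of the workspace file system.
  artifact_path: /Volumes/main/default/libraries
```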