caf-terraform-landingzones
[feature] Databricks Addon - upgrade to databricks/databricks provider and implement new features.
Upgrade the Databricks Addon: https://github.com/Azure/caf-terraform-landingzones/tree/main/caf_solution/add-ons/databricks_v1
- change the provider, from:

```hcl
source  = "databrickslabs/databricks"
version = "~> 0.3.9"
```

to:

```hcl
source  = "databricks/databricks"
version = "~> 1.3.0"
```
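The upgrade amounts to changing the provider source in the `required_providers` block; a minimal sketch (file name and layout are illustrative, not taken from the add-on):

```hcl
# versions.tf — illustrative sketch of the upgraded provider pin
terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks" # was databrickslabs/databricks
      version = "~> 1.3.0"              # was ~> 0.3.9
    }
  }
}
```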
Implement the following:
- AWS
- Compute
- Databricks SQL
- Log Delivery
- MLflow
- Security
- Storage
- Unity Catalog
- Workspace
Ref: https://registry.terraform.io/providers/databricks/databricks/latest/docs
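As context for what the add-on currently covers, a minimal `databricks_cluster` under the new provider namespace might look like the following sketch (runtime version, VM size, and sizing values are illustrative assumptions):

```hcl
# Minimal single-node-style cluster sketch; all values are illustrative.
resource "databricks_cluster" "example" {
  cluster_name            = "example"
  spark_version           = "11.3.x-scala2.12" # illustrative runtime version
  node_type_id            = "Standard_DS3_v2"  # illustrative Azure VM size
  autotermination_minutes = 20
  num_workers             = 1
}
```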
@arnaudlh @LaurentLesle Do we add a new add-on databricks_v2, or upgrade the existing one, as it only has the cluster implemented?
@nusrath432 Usually our criteria would be to create a databricks_v2 if we are unable to preserve a config created with 0.3.9 when we upgrade to 1.3.0. If you have some bandwidth for testing, that might help in understanding the behavior! Thanks!
Hi @nusrath432 ,
Would like to check whether support for Databricks notebooks is under consideration. I have created issue #420 for the same: https://github.com/Azure/caf-terraform-landingzones/issues/420
Including the databricks_notebook resource would help automate notebook deployment into the workspace.
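For reference, a minimal `databricks_notebook` resource under the new provider could be sketched as follows (the workspace path and notebook body are illustrative assumptions):

```hcl
# Deploy an inline notebook into the workspace; path and content are illustrative.
resource "databricks_notebook" "example" {
  path     = "/Shared/example" # workspace path (assumed)
  language = "PYTHON"
  content_base64 = base64encode(<<-EOT
    # Trivial notebook body
    print("hello from Terraform")
  EOT
  )
}
```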
Thanks, cc: @arnaudlh
@arnaudlh Thanks, I agree. Yes, normally I would have just created databricks_v2, but since databricks_v1 has so little implemented, I felt we could upgrade it in place. For now, I think databricks_v2 is good, and you can make the decision at merge time once it is fully tested.
@RameshIlla: Yes, the intention is to implement this iteratively so that all the resources supported by https://registry.terraform.io/providers/databricks/databricks/latest/docs are added (AWS excepted). It has to be a side project for me, but I'll try to get the structure ready soon.
Meanwhile, could you close #420 so we can track everything in a single issue, please.
Thanks @nusrath432 , I have marked #420 as closed so we can continue tracking here.
@arnaudlh @LaurentLesle How can we do the following from this add-on:
1 - Export the output attributes of the databricks_cluster to a keyvault as secrets. The keyvault could be in a remote landing zone or the current landing zone. I am aware that within the main CAF module, we could use a block like the one below:
```hcl
dynamic_keyvault_secrets = {
  mykeyvault = {
    myattribute = {
      output_key    = "resource-type"
      resource_key  = "myresourcekey"
      secret_name   = "mysecretname"
      attribute_key = "myattribute"
    }
  }
}
```
but this only works within the CAF main module and for the current landing zone.
2 - How can we use the same design pattern within add-ons, for remote keyvaults or for a keyvault in the current landing zone?
Any input on the above use case is highly appreciated.
@LaurentLesle Can you advise on how dynamic_keyvault_secrets should be implemented in an add-on module, please?
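While waiting for guidance, one possible workaround: an add-on does not run the CAF dynamic_keyvault_secrets processor, but a similar effect can be achieved by creating `azurerm_key_vault_secret` resources directly, pointing at a keyvault id resolved from the landing zone state. A heavily hedged sketch, where `local.keyvaults` and `databricks_cluster.example` are hypothetical names standing in for whatever the add-on's remote-state wiring actually exposes:

```hcl
# Illustrative only: how keyvault ids surface in an add-on depends on the
# CAF remote-state objects; "local.keyvaults" is a hypothetical lookup and
# "databricks_cluster.example" a hypothetical cluster resource key.
resource "azurerm_key_vault_secret" "cluster_id" {
  name         = "databricks-cluster-id"          # assumed secret name
  value        = databricks_cluster.example.id    # attribute to export
  key_vault_id = local.keyvaults["mykeyvault"].id # current or remote keyvault
}
```

This sidesteps the dynamic_keyvault_secrets map entirely, so it would not behave identically to the main-module pattern; confirmation from the maintainers on the intended add-on mechanism is still needed.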