[ISSUE] `databricks_model_serving` updates traffic_config routes every time.
Configuration
resource "databricks_model_serving" "this" {
for_each = local.serving_endpoints
provider = databricks.workspace
name = each.key
route_optimized = each.value.route_optimized
config {
served_entities {
name = each.key
entity_name = each.value.entity_name
workload_size = each.value.workload_size
scale_to_zero_enabled = can(each.value.scale_to_zero_enabled) ? each.value.scale_to_zero_enabled : true
}
traffic_config {
routes {
served_model_name = each.key
traffic_percentage = 100
}
}
}
dynamic "tags" {
for_each = merge(local.static_tags, { for obj in local.dynamic_tags[each.key] : obj.key => obj.value })
content {
key = tags.key
value = tags.value
}
}
}
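For context, `local.serving_endpoints`, `local.static_tags`, and `local.dynamic_tags` are maps shaped roughly like the sketch below; the endpoint, entity, and tag names here are illustrative placeholders, not the real values from my workspace.

locals {
  # Hypothetical shape of the map driving for_each above.
  serving_endpoints = {
    "stage-hq-ai-user-lookup" = {
      entity_name           = "main.default.user_lookup_model" # placeholder UC model name
      workload_size         = "Small"
      scale_to_zero_enabled = true
      route_optimized       = false
    }
  }

  # Tags applied to every endpoint.
  static_tags = {
    environment = "stage"
  }

  # Per-endpoint tags, keyed by endpoint name, as a list of key/value objects.
  dynamic_tags = {
    "stage-hq-ai-user-lookup" = [
      { key = "team", value = "ai" },
    ]
  }
}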
My output (with nothing changed) shows the following plan:
  # databricks_model_serving.this["stage-hq-ai-user-lookup"] will be updated in-place
  ~ resource "databricks_model_serving" "this" {
        id   = "{id}"
        name = "{name}"
        # (2 unchanged attributes hidden)

      ~ config {
          ~ traffic_config {
              ~ routes {
                  + served_model_name = "{same_value}"
                    # (1 unchanged attribute hidden)
                }
            }
            # (1 unchanged block hidden)
        }

        # (4 unchanged blocks hidden)
    }
Expected Behavior
No changes - the state matches the configuration, so no update should be planned.
Actual Behavior
terraform plan shows an update to the resource because it "does not match state", even though nothing has changed.
Steps to Reproduce
- Define the resource as shown in the documentation.
- Create the resources with terraform apply.
- Without changing anything, run terraform apply again; the plan shows an update even though the values have not changed.
Terraform and provider versions
Databricks provider version 1.48.3
Is it a regression?
No. I tried 1.47.0 and saw the same behavior.
@drewipsonhq could you check if the fix in 1.49.0 resolves this?
Hi, I am seeing the same issue in 1.70.0.
When defining an endpoint to serve a feature spec, I get an error if I do not define a `served_model_name`, so I define one. When I inspect the state, however, the value shows as null, which prompts a diff in terraform plan even though nothing has actually changed (or shouldn't have, at least).
It appears that for feature spec endpoints the `served_model_name` is not persisted in state, yet the provider still requires it at creation time, which leads to inconsistent behavior and noisy terraform plans.
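For illustration, a feature spec endpoint defined roughly like the sketch below triggers this; the endpoint and feature spec names are placeholders, not my actual configuration.

resource "databricks_model_serving" "feature_spec" {
  name = "feature-spec-endpoint" # placeholder name

  config {
    served_entities {
      # A Unity Catalog feature spec rather than a registered model (placeholder name).
      entity_name           = "main.default.user_features_spec"
      workload_size         = "Small"
      scale_to_zero_enabled = true
    }

    traffic_config {
      routes {
        # Required by the provider at creation time, but stored as null in
        # state for feature spec endpoints, so every plan shows a diff here.
        served_model_name  = "feature-spec-endpoint"
        traffic_percentage = 100
      }
    }
  }
}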
@nkvuong
Does anybody have any guidance on this issue?
This should be fixed in 1.85.0. Please use `served_entity_name` instead of `served_model_name` going forward; that resolves the drift issue for feature spec endpoints.
Example:
...
  traffic_config {
    routes {
      # served_model_name = "feature_spec_endpoint" # Old
      served_entity_name = "feature_spec_endpoint" # New
      traffic_percentage = 100
    }
  }
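Applied to the for_each configuration from the top of this issue, the change would look roughly like this (a sketch only, assuming provider 1.85.0 or later; everything else in the resource stays as before):

resource "databricks_model_serving" "this" {
  for_each = local.serving_endpoints
  # ... provider, name, route_optimized, and tags as in the original configuration ...

  config {
    # ... served_entities block unchanged ...

    traffic_config {
      routes {
        served_entity_name = each.key # replaces served_model_name
        traffic_percentage = 100
      }
    }
  }
}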