terraform-provider-google
google_dataflow_job - Cannot set autoscalingAlgorithm to `THROUGHPUT_BASED`
Community Note
- Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
- Please do not leave +1 or me too comments, they generate extra noise for issue followers and do not help prioritize the request.
- If you are interested in working on this issue or have submitted a pull request, please leave a comment.
- If an issue is assigned to a user, that user is claiming responsibility for the issue.
- Customers working with a Google Technical Account Manager or Customer Engineer can ask them to reach out internally to expedite investigation and resolution of this issue.
Terraform Version
Terraform v1.8.2 on darwin_arm64
- provider registry.terraform.io/hashicorp/google v4.82.0
- provider registry.terraform.io/hashicorp/google-beta v4.82.0
- provider registry.terraform.io/hashicorp/random v3.5.1
Affected Resource(s)
google_dataflow_job
Terraform Configuration
resource "google_dataflow_job" "dataflow_job" {
project = var.project_id
name = "dataflow-job"
template_gcs_path = "gs://dataflow-templates/2024-03-27-00_RC00/Cloud_PubSub_to_Splunk"
temp_gcs_location = "XXX"
service_account_email = var.service_account_email
machine_type = var.machine_type
max_workers = var.max_workers
zone = var.zone
skip_wait_on_job_termination = true
parameters = {
autoscalingAlgorithm = "THROUGHPUT_BASED"
inputSubscription = var.pubsub_subscription_id
outputDeadletterTopic = google_pubsub_topic.gcloud_dataflow_deadletter_pubsub_topic.id
parallelism = var.max_workers * local.vCPUs * 2
url = "URL"
batchCount = 50
includePubsubMessage = "true"
disableCertificateValidation = "sure"
enableBatchLogs = true
enableGzipHttpCompression = true
tokenSource = "SECRET_MANAGER"
tokenSecretId = "XXX"
javascriptTextTransformGcsPath = "gs://bucket/file.js"
javascriptTextTransformFunctionName = "function_name"
javascriptTextTransformReloadIntervalMinutes = 15
}
region = var.region
subnetwork = var.subnetwork
network = var.network
ip_configuration = "WORKER_IP_PRIVATE"
additional_experiments = ["min_num_workers=${var.min_workers}"]
lifecycle {
ignore_changes = [
additional_experiments # Ignore default experiments that may be added by Dataflow templates API
]
replace_triggered_by = [
terraform_data.topic_replacement,
terraform_data.subscription_replacement,
google_pubsub_topic.gcloud_dataflow_deadletter_pubsub_topic,
google_pubsub_subscription.gcloud_dataflow_deadletter_pubsub_sub
]
}
}
Debug Output
No response
Expected Behavior
A new job is created with autoscalingAlgorithm set to THROUGHPUT_BASED.
Actual Behavior
Job creation fails with the following error:
│ Error: googleapi: Error 400: The template parameters are invalid.
│ Details:
│ [
│ {
│ "@type": "type.googleapis.com/google.dataflow.v1beta3.InvalidTemplateParameters",
│ "parameterViolations": [
│ {
│ "description": "Unrecognized parameter",
│ "parameter": "autoscalingAlgorithm"
│ }
│ ]
│ }
│ ]
│ , badRequest
│
│ with module.zafin_anzplus_dataflow_job[0].google_dataflow_job.dataflow_job,
│ on ../../modules/logging-pipeline/pipeline.tf line 24, in resource "google_dataflow_job" "dataflow_job":
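For context, my understanding (not something the provider docs state explicitly) is that google_dataflow_job sends top-level arguments such as machine_type and max_workers in the job's runtime environment, while everything under parameters is passed verbatim to the template. The 2024-03-27-00_RC00 template now validates its parameters and rejects the unknown autoscalingAlgorithm key, and the resource exposes no top-level argument for it. An annotated sketch (all values below are placeholders):

resource "google_dataflow_job" "repro" {
  name              = "repro-job"
  template_gcs_path = "gs://dataflow-templates/2024-03-27-00_RC00/Cloud_PubSub_to_Splunk"
  temp_gcs_location = "gs://example-bucket/tmp" # placeholder
  machine_type      = "n1-standard-2"           # sent in the runtime environment
  max_workers       = 2                         # sent in the runtime environment

  parameters = {
    # Passed verbatim to the template; this key is what the
    # 2024-03-27-00_RC00 template rejects as "Unrecognized parameter".
    autoscalingAlgorithm = "THROUGHPUT_BASED"
    inputSubscription    = "projects/example/subscriptions/example-sub" # placeholder
    url                  = "https://splunk.example.com:8088"            # placeholder
    tokenSource          = "SECRET_MANAGER"
    tokenSecretId        = "projects/example/secrets/hec-token"         # placeholder
  }
}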
Steps to reproduce
- terraform apply
Important Factoids
- We upgraded from template version 2023-11-07-00_RC00 to 2024-03-27-00_RC00.
- Version 2023-11-07-00_RC00 set the autoscalingAlgorithm parameter on its own, without it being provided via a Terraform argument.
- Creating a Dataflow job manually via the UI using the latest version (2024-03-27-00_RC00) successfully sets the autoscalingAlgorithm parameter.
References
- #17570 - a fix was implemented for google_dataflow_flex_template_job, but not for google_dataflow_job
- b/339853870
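A possible interim workaround, if the Pub/Sub to Splunk template is also published as a Flex Template (the container spec path below is an assumption I have not verified), would be to switch to google_dataflow_flex_template_job, which, as I read the fix in #17570, accepts autoscaling_algorithm as a top-level argument rather than a template parameter:

resource "google_dataflow_flex_template_job" "dataflow_job" {
  provider                = google-beta
  project                 = var.project_id
  name                    = "dataflow-job"
  region                  = var.region
  container_spec_gcs_path = "gs://dataflow-templates/2024-03-27-00_RC00/flex/Cloud_PubSub_to_Splunk" # assumed path, not verified

  # Top-level argument instead of a template parameter:
  autoscaling_algorithm = "THROUGHPUT_BASED"

  parameters = {
    inputSubscription     = var.pubsub_subscription_id
    outputDeadletterTopic = google_pubsub_topic.gcloud_dataflow_deadletter_pubsub_topic.id
  }
}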
Confirmed issue! After running terraform apply, it returns error 400: The template parameters are invalid.