spring-cloud-dataflow
Issue setting "maximumConcurrentTasks" when installing SCDF through Helm
Description: We have seen in the SCDF documentation that the property `maximumConcurrentTasks` can be set as below:

`spring.cloud.dataflow.task.platform.<platform-type>.accounts[<account-name>].deployment.maximumConcurrentTasks`

However, when installing SCDF using Helm, this needs to be set in the `values.yaml` file, and that information is missing from the Helm installation documentation. Could anyone help here?
Release versions: we are installing "bitnami/spring-cloud-dataflow:2.11.5-debian-12-r2"
Additional context: Below is our values.yaml file
```yaml
server:
  image:
    registry: docker.io
    repository: bitnami/spring-cloud-dataflow
    tag: 2.11.5-debian-12-r2
    digest: ""
    pullPolicy: IfNotPresent
    pullSecrets: []
    debug: false
  composedTaskRunner:
    image:
      registry: docker.io
      repository: bitnami/spring-cloud-dataflow-composed-task-runner
      tag: 2.11.5-debian-12-r2
      digest: ""
  configuration:
    streamingEnabled: false
    batchEnabled: true
    accountName: default
    trustK8sCerts: false
  containerPorts:
    http: 8080
    jdwp: 5005
  replicaCount: 1
  updateStrategy:
    type: RollingUpdate
  startupProbe:
    enabled: false
    initialDelaySeconds: 120
    timeoutSeconds: 1
    periodSeconds: 20
    failureThreshold: 6
    successThreshold: 1
  livenessProbe:
    enabled: true
    initialDelaySeconds: 120
    timeoutSeconds: 1
    periodSeconds: 20
    failureThreshold: 6
    successThreshold: 1
  readinessProbe:
    enabled: true
    initialDelaySeconds: 120
    timeoutSeconds: 1
    periodSeconds: 20
    failureThreshold: 6
    successThreshold: 1
  networkPolicy:
    enabled: false
    allowExternal: false
    allowExternalEgress: false
  service:
    type: ClusterIP
    ports:
      http: 8080
  ingress:
    enabled: true
    path: /
    pathType: ImplementationSpecific
    hostname: "xyz.com"
  pdb:
    create: false
    minAvailable: ""
    maxUnavailable: ""
pdb:
  create: false
skipper:
  enabled: false
rabbitmq:
  enabled: false
mariadb:
  enabled: false
metrics:
  enabled: false
  pdb:
    create: false
externalDatabase:
  host: "{{RDS-endpoint}}.rds.amazonaws.com"
  driver: com.mysql.cj.jdbc.Driver
  dataflow:
    url: "{Database url}"
    username:
    password:
```
@VikasMGowda05 There is no direct property in the values file. Adding the appropriate environment variable will ensure the value is set for the deployment platform.
```yaml
server:
  configuration:
    extraEnvVars:
      - name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUM_CONCURRENT_TASKS'
        value: '50'
```
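If you prefer not to edit `values.yaml`, the same environment variable can be passed on the Helm command line. A sketch under assumed names (`scdf` as the release name; adjust for your environment):

```shell
# --set-string keeps the value a string, as Kubernetes env vars require.
helm upgrade scdf bitnami/spring-cloud-dataflow \
  --reuse-values \
  --set "server.extraEnvVars[0].name=SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUM_CONCURRENT_TASKS" \
  --set-string "server.extraEnvVars[0].value=50"
```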
You can format a file in the issue text by using Markdown code fences.
@corneil which one of the below is correct?

```yaml
server:
  configuration:
    extraEnvVars:
      - name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUM_CONCURRENT_TASKS'
        value: '50'
```

or

```yaml
server:
  configuration:
    extraEnvVars:
      - name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEPLOYMENT_MAXIMUM_CONCURRENT_TASKS'
        value: '50'
```
It should be the platform name. If you are using `default`, then it will be `default`.
`default` is the account name. The `deployment` option is only for Cloud Foundry.
You can add more accounts if you target other namespaces or clusters.
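For example, multiple accounts can be declared under the `kubernetes` platform in the server's application properties. A sketch for illustration only (the second account name `highmem` and the namespaces are assumptions, not from the original thread):

```yaml
spring:
  cloud:
    dataflow:
      task:
        platform:
          kubernetes:
            accounts:
              default:
                namespace: default
                maximumConcurrentTasks: 50
              highmem:                    # hypothetical second account
                namespace: batch-jobs     # targets a different namespace
                maximumConcurrentTasks: 10
```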
Hi @corneil @cppwfs, I have tried adding the above as you suggested, but I am getting the error below. Could you please help?

```yaml
server:
  configuration:
    extraEnvVars:
      - name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUM_CONCURRENT_TASKS'
        value: '100'
```

Error:

```
org.springframework.cloud.dataflow.rest.client.DataFlowClientException: Cannot launch task 'aggregator-job-psp'. The maximum concurrent task executions is at its limit [20].
```
@VikasMGowda05 Please try:

```yaml
server:
  extraEnvVars:
    - name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUMCONCURRENTTASKS'
      value: '100'
```
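Note the working variable name has no underscores inside `MAXIMUMCONCURRENTTASKS`: with Spring Boot's relaxed binding, dots and bracketed indices become underscores, while a camel-case segment such as `maximumConcurrentTasks` is simply uppercased in one piece. A rough sketch of that mapping (a simplification for illustration, not Spring Boot's actual implementation):

```python
import re

def property_to_env_var(prop: str) -> str:
    """Approximate Spring Boot relaxed-binding conversion from a
    property name to its environment-variable form."""
    # Turn index brackets into dot segments: accounts[default] -> accounts.default
    prop = re.sub(r"\[([^\]]*)\]", r".\1", prop)
    # Drop dashes, turn dots into underscores, uppercase everything;
    # camelCase segments are flattened with no extra underscores.
    return prop.replace("-", "").replace(".", "_").upper()

print(property_to_env_var(
    "spring.cloud.dataflow.task.platform.kubernetes"
    ".accounts[default].maximumConcurrentTasks"))
# SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUMCONCURRENTTASKS
```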