cromwell
Google Cloud Life Sciences v2beta - hello world task takes almost 3 minutes
Hello,
I'm trying to connect Cromwell with Google Cloud Life Sciences (v2beta). Everything works as it should, except that a simple hello world task takes almost 3 minutes to complete. I am using the config file recommended by the documentation.
I would like to ask for advice on how to solve this problem. Thank you very much.
config file:
google {
  application-name = "xxxxx-xxxxx"
  auths = [
    {
      name = "application-default"
      scheme = "application_default"
    },
    {
      name = "xxxxx-xxxxx"
      scheme = "service_account"
      service-account-id = "[email protected]"
      pem-file = "/xxxxx/xxxxx.pem"
    },
    {
      name = "user-service-account"
      scheme = "user_service_account"
    }
  ]
}

backend {
  default = PAPIv2
  providers {
    PAPIv2 {
      actor-factory = "cromwell.backend.google.pipelines.v2beta.PipelinesApiLifecycleActorFactory"
      config {
        # Google project
        project = "xxxxx-cromwell"

        # Base bucket for workflow executions
        root = "gs://xxxxx-cromwell_bucket"

        # Make the name of the backend used for call caching purposes insensitive to the PAPI version.
        name-for-call-caching-purposes: PAPI

        # Emit a warning if jobs last longer than this amount of time. This might indicate that something got stuck in PAPI.
        slow-job-warning-time: 24 hours

        # Set this to the lower of the two values "Queries per 100 seconds" and "Queries per 100 seconds per user" for
        # your project.
        #
        # Used to help determine maximum throughput to the Google Genomics API. Setting this value too low will
        # cause a drop in performance. Setting this value too high will cause QPS based locks from Google.
        # 1000 is the default "Queries per 100 seconds per user", 50000 is the default "Queries per 100 seconds".
        # See https://cloud.google.com/genomics/quotas for more information
        genomics-api-queries-per-100-seconds = 25000

        # Polling for completion backs off gradually for slower-running jobs.
        # This is the maximum polling interval (in seconds):
        maximum-polling-interval = 600

        # Optional Dockerhub credentials. Can be used to access private docker images.
        dockerhub {
          # account = ""
          # token = ""
        }

        # docker-image-cache-manifest-file = "gs://xxxxx-xxxxx/xxxxx.json"

        # Number of workers to assign to PAPI requests
        request-workers = 3

        # Optional configuration to use high security network (Virtual Private Cloud) for running jobs.
        # See https://cromwell.readthedocs.io/en/stable/backends/Google/ for more details.
        # virtual-private-cloud {
        #   network-label-key = "network-key"
        #   auth = "application-default"
        # }

        # Global pipeline timeout
        # Defaults to 7 days; max 30 days
        # pipeline-timeout = 7 days

        genomics {
          # A reference to an auth defined in the `google` stanza at the top. This auth is used to create
          # Pipelines and manipulate auth JSONs.
          auth = "application-default"

          // Alternative service account to use on the launched compute instance.
          // NOTE: If combined with service account authorization, both that service account and this service account
          // must be able to read and write to the 'root' GCS path.
          compute-service-account = "default"

          # Endpoint for APIs, no reason to change this unless directed by Google.
          endpoint-url = "https://lifesciences.googleapis.com/"

          # Currently the Cloud Life Sciences API is available only in `us-central1` and `europe-west2` locations.
          location = "us-central1"

          # Restrict access to VM metadata. Useful in cases when untrusted containers are running under a service
          # account not owned by the submitting user.
          restrict-metadata-access = false

          # Pipelines v2 only: specify the number of times localization and delocalization operations should be attempted.
          # There is no logic to determine if the error was transient or not; everything is retried upon failure.
          # Defaults to 3
          localization-attempts = 3

          # Specifies the minimum file size for `gsutil cp` to use parallel composite uploads during delocalization.
          # Parallel composite uploads can result in a significant improvement in delocalization speed for large files
          # but may introduce complexities in downloading such files from GCS; please see
          # https://cloud.google.com/storage/docs/gsutil/commands/cp#parallel-composite-uploads for more information.
          #
          # If set to 0, parallel composite uploads are turned off. The default Cromwell configuration turns off
          # parallel composite uploads; this sample configuration turns it on for files of 150M or larger.
          parallel-composite-upload-threshold = "150M"
        }

        # Controls how batched requests to PAPI are handled:
        batch-requests {
          timeouts {
            # Timeout when attempting to connect to PAPI to make requests:
            # connect = 10 seconds

            # Timeout waiting for batch responses from PAPI:
            #
            # Note: Try raising this value if you see errors in logs like:
            # WARN - PAPI request worker PAPIQueryWorker-[...] terminated. 99 run creation requests, 0 status poll requests, and 0 abort requests will be reconsidered. If any of those succeeded in the cloud before the batch request failed, they might be run twice.
            # ERROR - Read timed out
            # read = 10 seconds
          }
        }

        filesystems {
          gcs {
            # A reference to a potentially different auth for manipulating files via engine functions.
            auth = "application-default"

            # Google project which will be billed for the requests
            project = "xxxxx-xxxxx-xxxxx"

            caching {
              # When a cache hit is found, the following duplication strategy will be followed to use the cached outputs.
              # Possible values: "copy", "reference". Defaults to "copy".
              # "copy": Copy the output files.
              # "reference": DO NOT copy the output files but point to the original output files instead.
              #              Will still make sure that all the original output files exist and are accessible before
              #              going forward with the cache hit.
              duplication-strategy = "copy"
            }
          }
        }

        default-runtime-attributes {
          cpu: 4
          failOnStderr: false
          continueOnReturnCode: 0
          memory: "2 GB"
          bootDiskSizeGb: 10
          # Allowed to be a String, or a list of Strings
          disks: "local-disk 10 SSD"
          noAddress: false
          preemptible: 0
          zones: ["us-central1-a", "us-central1-b"]
        }

        include "papi_v2_reference_image_manifest.conf"
      }
    }
  }
}
WDL:
task hello {
  String addressee

  command {
    echo "Hello ${addressee}! Welcome to Cromwell . . . on Google Cloud!"
  }

  output {
    String message = read_string(stdout())
  }

  runtime {
    docker: "ubuntu:latest"
  }
}

workflow wf_hello {
  call hello

  output {
    hello.message
  }
}
Inputs:
{
  "wf_hello.hello.addressee": "World"
}
gcloud log (edited):
done: true
metadata:
'@type': type.googleapis.com/google.cloud.lifesciences.v2beta.Metadata
createTime: '2021-08-03T15:21:55.984657Z'
endTime: '2021-08-03T15:24:03.533702405Z'
events:
- description: Worker released
timestamp: '2021-08-03T15:24:03.533702405Z'
workerReleased:
instance: google-pipelines-worker-xxxxxx
zone: us-central1-b
- containerStopped:
actionId: 19
description: Stopped running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:24:02.823519462Z'
- containerStarted:
actionId: 19
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:57.785552960Z'
- containerStopped:
actionId: 18
description: Stopped running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/1xxxxxx.sh && chmod u+x /tmp/1xxxxxx.sh
&& sh /tmp/1xxxxxx.sh"
timestamp: '2021-08-03T15:23:57.673915859Z'
- containerStarted:
actionId: 18
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/1xxxxxx.sh && chmod u+x /tmp/1xxxxxx.sh
&& sh /tmp/1xxxxxx.sh"
timestamp: '2021-08-03T15:23:55.116803722Z'
- containerStopped:
actionId: 17
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Done\\ delocalization."
timestamp: '2021-08-03T15:23:55.018741967Z'
- containerStarted:
actionId: 17
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Done\\ delocalization."
timestamp: '2021-08-03T15:23:54.032260045Z'
- containerStopped:
actionId: 16
description: Stopped running "-c /bin/bash /cromwell_root/gcs_delocalization.sh"
timestamp: '2021-08-03T15:23:53.931721311Z'
- containerStarted:
actionId: 16
description: Started running "-c /bin/bash /cromwell_root/gcs_delocalization.sh"
timestamp: '2021-08-03T15:23:48.768588581Z'
- containerStopped:
actionId: 15
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Starting\\ delocalization."
timestamp: '2021-08-03T15:23:48.668259011Z'
- containerStarted:
actionId: 15
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Starting\\ delocalization."
timestamp: '2021-08-03T15:23:47.678046638Z'
- containerStopped:
actionId: 14
description: Stopped running "/cromwell_root/script"
timestamp: '2021-08-03T15:23:47.582884870Z'
- containerStarted:
actionId: 14
description: Started running "/cromwell_root/script"
timestamp: '2021-08-03T15:23:46.563913382Z'
- containerStopped:
actionId: 13
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Running\\ user\\ action:\\ docker\\ run\\ -v\\ /mnt/local-disk:/cromwell_root\\
--entrypoint\\=/bin/bash\\ ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9\\
/cromwell_root/script"
timestamp: '2021-08-03T15:23:46.476557520Z'
- containerStarted:
actionId: 13
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Running\\ user\\ action:\\ docker\\ run\\ -v\\ /mnt/local-disk:/cromwell_root\\
--entrypoint\\=/bin/bash\\ ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9\\
/cromwell_root/script"
timestamp: '2021-08-03T15:23:44.494600959Z'
- containerStopped:
actionId: 12
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Done\\ localization."
timestamp: '2021-08-03T15:23:44.309713763Z'
- containerStarted:
actionId: 12
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Done\\ localization."
timestamp: '2021-08-03T15:23:42.320055355Z'
- containerStopped:
actionId: 11
description: Stopped running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:42.235400991Z'
- containerStarted:
actionId: 11
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:39.302953173Z'
- containerStopped:
actionId: 10
description: Stopped running "-c /bin/bash /cromwell_root/gcs_localization.sh"
timestamp: '2021-08-03T15:23:39.208412437Z'
- containerStarted:
actionId: 10
description: Started running "-c /bin/bash /cromwell_root/gcs_localization.sh"
timestamp: '2021-08-03T15:23:33.726944614Z'
- containerStopped:
actionId: 9
description: Stopped running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:33.642652013Z'
- containerStarted:
actionId: 9
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:31.001010559Z'
- containerStopped:
actionId: 8
description: Stopped running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:30.903326724Z'
- containerStarted:
actionId: 8
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:25.251063330Z'
- containerStopped:
actionId: 7
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Starting\\ localization."
timestamp: '2021-08-03T15:23:25.112961881Z'
- containerStarted:
actionId: 7
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Starting\\ localization."
timestamp: '2021-08-03T15:23:21.752495384Z'
- containerStarted:
actionId: 6
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx\"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:21.250436651Z'
- containerStarted:
actionId: 5
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:20.861110195Z'
- containerStarted:
actionId: 4
description: Started running "-c python -c 'import base64; print(base64.b64decode(\"xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh"
timestamp: '2021-08-03T15:23:20.552485318Z'
- containerStopped:
actionId: 3
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Done\\ container\\ setup."
timestamp: '2021-08-03T15:23:20.456254269Z'
- containerStarted:
actionId: 3
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Done\\ container\\ setup."
timestamp: '2021-08-03T15:23:19.466880713Z'
- containerStopped:
actionId: 2
description: Stopped running "-c mkdir -p /cromwell_root && chmod -R a+rwx /cromwell_root"
timestamp: '2021-08-03T15:23:19.381266405Z'
- containerStarted:
actionId: 2
description: Started running "-c mkdir -p /cromwell_root && chmod -R a+rwx /cromwell_root"
timestamp: '2021-08-03T15:23:18.421326834Z'
- containerStopped:
actionId: 1
description: Stopped running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Starting\\ container\\ setup."
timestamp: '2021-08-03T15:23:18.315160082Z'
- containerStarted:
actionId: 1
description: Started running "-c printf '%s %s\\n' \"$(date -u '+%Y/%m/%d %H:%M:%S')\"
Starting\\ container\\ setup."
timestamp: '2021-08-03T15:23:16.978004617Z'
- description: Stopped pulling "ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9"
pullStopped:
imageUri: ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9
timestamp: '2021-08-03T15:23:16.274159840Z'
- description: Started pulling "ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9"
pullStarted:
imageUri: ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9
timestamp: '2021-08-03T15:23:12.246258428Z'
- description: Stopped pulling "gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim"
pullStopped:
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
timestamp: '2021-08-03T15:23:12.246251722Z'
- description: Started pulling "gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim"
pullStarted:
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
timestamp: '2021-08-03T15:22:42.922496298Z'
- description: Worker "google-pipelines-worker-xxxxxx"
assigned in "us-central1-b" on a "custom-1-2048" machine
timestamp: '2021-08-03T15:22:07.789742627Z'
workerAssigned:
instance: google-pipelines-worker-xxxxxx
machineType: custom-1-2048
zone: us-central1-b
labels:
cromwell-workflow-id: cromwell-xxxxxx
wdl-task-name: hello
pipeline:
actions:
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Starting\ container\ setup.
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: ContainerSetup
timeout: 300s
- commands:
- -c
- mkdir -p /cromwell_root && chmod -R a+rwx /cromwell_root
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: ContainerSetup
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Done\ container\ setup.
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: ContainerSetup
timeout: 300s
- commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Background
runInBackground: true
- commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
outputName: stderr
tag: Background
mounts:
- disk: local-disk
path: /cromwell_root
runInBackground: true
- commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
outputName: stdout
tag: Background
mounts:
- disk: local-disk
path: /cromwell_root
runInBackground: true
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Starting\ localization.
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: Localization
timeout: 300s
- commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Localization
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Localization
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- /bin/bash /cromwell_root/gcs_localization.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Localization
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Localization
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Done\ localization.
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: Localization
timeout: 300s
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Running\ user\ action:\
docker\ run\ -v\ /mnt/local-disk:/cromwell_root\ --entrypoint\=/bin/bash\
ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9\
/cromwell_root/script
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: UserAction
timeout: 300s
- commands:
- /cromwell_root/script
entrypoint: /bin/bash
imageUri: ubuntu@sha256:1e48201ccc2ab83afc435394b3bf70af0fa0055215c1e26a5da9b50a1ae367c9
labels:
tag: UserAction
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Starting\ delocalization.
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: Delocalization
timeout: 300s
- commands:
- -c
- /bin/bash /cromwell_root/gcs_delocalization.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Delocalization
mounts:
- disk: local-disk
path: /cromwell_root
- commands:
- -c
- printf '%s %s\n' "$(date -u '+%Y/%m/%d %H:%M:%S')" Done\ delocalization.
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
logging: Delocalization
timeout: 300s
- alwaysRun: true
commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/1xxxxxx.sh && chmod u+x /tmp/1xxxxxx.sh
&& sh /tmp/1xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Delocalization
- alwaysRun: true
commands:
- -c
- python -c 'import base64; print(base64.b64decode("xxxxxx"));'
> /tmp/xxxxxx.sh && chmod u+x /tmp/xxxxxx.sh
&& sh /tmp/xxxxxx.sh
entrypoint: /bin/sh
imageUri: gcr.io/google.com/cloudsdktool/cloud-sdk:276.0.0-slim
labels:
tag: Delocalization
environment:
MEM_SIZE: '2.0'
MEM_UNIT: GB
resources:
virtualMachine:
bootDiskSizeGb: 12
bootImage: projects/cos-cloud/global/images/family/cos-stable
disks:
- name: local-disk
sizeGb: 10
type: pd-ssd
labels:
cromwell-workflow-id: xxxxxx
goog-pipelines-worker: 'true'
wdl-task-name: hello
machineType: custom-1-2048
network: {}
nvidiaDriverVersion: 450.51.06
serviceAccount:
email: default
scopes:
- https://www.googleapis.com/auth/compute
- https://www.googleapis.com/auth/devstorage.full_control
- https://www.googleapis.com/auth/cloudkms
- https://www.googleapis.com/auth/userinfo.email
- https://www.googleapis.com/auth/userinfo.profile
- https://www.googleapis.com/auth/monitoring.write
- https://www.googleapis.com/auth/bigquery
- https://www.googleapis.com/auth/cloud-platform
volumes:
- persistentDisk:
sizeGb: 10
type: pd-ssd
volume: local-disk
zones:
- us-central1-a
- us-central1-b
timeout: 604800s
startTime: '2021-08-03T15:22:07.789742627Z'
name: projects/xxxxxx/locations/us-central1/operations/xxxxxx
response:
'@type': type.googleapis.com/cloud.lifesciences.pipelines.RunPipelineResponse
Hello and welcome to the Cromwell repo. Three minutes is about what I would expect, from personal experience, as a minimum time to run any task.
Consider that Life Sciences allocates a dedicated VM, boots it, and pulls Docker images just to print "hello world". The overhead will never look favorable for small tasks whose execution time is short compared to the VM setup time.
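For what it's worth, the event timestamps in the log above show where the time actually goes. A quick sketch (the phase labels are mine, timestamps truncated to whole seconds):

```python
from datetime import datetime

# Key timestamps copied from the gcloud operation log above.
events = [
    ("operation created",   "2021-08-03T15:21:55"),
    ("worker VM assigned",  "2021-08-03T15:22:07"),
    ("image pulls started", "2021-08-03T15:22:42"),
    ("user action started", "2021-08-03T15:23:46"),
    ("worker released",     "2021-08-03T15:24:03"),
]

# Print the duration of each phase between consecutive events.
for (a, t1), (b, t2) in zip(events, events[1:]):
    delta = datetime.fromisoformat(t2) - datetime.fromisoformat(t1)
    print(f"{a} -> {b}: {int(delta.total_seconds())}s")
```

So of the ~128 seconds inside the operation, roughly 100 are VM scheduling, boot, and image pulls, and only the last ~17 seconds cover the user action plus delocalization; the `echo` itself is about one second.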
Okay, thank you so much for the answer.
In that case, I would like to ask: by using
docker-image-cache-manifest-file = "gs://xxxxx-xxxxx/xxxxx.json"
is it possible to achieve a speedup on Google Life Sciences, or can this caching method only be used to accelerate runs on the local backend?
I am also considering the following solution: estimate whether a computation will be heavy, and if so, send it to Google Life Sciences; otherwise, run it on the local backend. Do you think this is in line with Cromwell best practices?
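For what it's worth, that hybrid routing can be expressed directly in WDL: Cromwell supports a `backend` runtime attribute that selects the provider per task. A sketch, assuming your config declares a provider named `Local` alongside `PAPIv2` (the task names and commands here are placeholders):

```
task heavy_step {
  command { ... }
  runtime {
    docker: "ubuntu:latest"
    backend: "PAPIv2"   # run on Life Sciences
  }
}

task light_step {
  command { ... }
  runtime {
    backend: "Local"    # run on the Cromwell host
  }
}
```

Tasks without the attribute fall back to whatever `backend.default` names in the config.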