Allow reading/writing artifacts from within the execution process
Summary
Instead of configuring input/output artifact keys in advance, I want to be able to calculate them based on runtime conditions and then run commands like the following directly from within my script:

```bash
# get an artifact from the default repository using the key
argoexec get $artifact_key $target_folder

# save an artifact to the default repository using the key
argoexec upload $artifact_key ${file/folder}
```
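For instance (all names here are hypothetical), the key might only be computable once the step is actually running, which is exactly what static artifact declarations in the workflow spec cannot express:

```bash
# hypothetical sketch: the key depends on values only known at runtime,
# so it cannot be declared up front in the workflow spec
ARTIFACT_KEY="builds/$(git rev-parse --short HEAD)/dist"
# restore a previous build for this exact commit, if one was uploaded before
argoexec get "$ARTIFACT_KEY" dist
./build.sh   # assumed to be a no-op when dist/ is already populated
# publish the result under the runtime-computed key
argoexec upload "$ARTIFACT_KEY" dist
```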
Use Cases
- Dynamic cache based on digests of `package.json`/`pom.xml`/`go.mod` files, as in the npm example below (a Go variant is sketched after it).
- More generally, this can help skip operations like recompiling binaries for code that has already been built in the past, and so on.
```bash
NPM_CACHE_KEY=$(sha256sum package.json | cut -d ' ' -f 1)
# get the cache if it exists, do nothing otherwise
argoexec get $NPM_CACHE_KEY node_modules
npm install
argoexec upload $NPM_CACHE_KEY node_modules
# execute the javascript script
```
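The same pattern should extend to the other files mentioned above; here is a hypothetical Go variant keyed on `go.mod`, again using the proposed commands rather than anything that exists today:

```bash
# hypothetical Go variant of the cache pattern above, keyed on go.mod
GO_CACHE_KEY=$(sha256sum go.mod | cut -d ' ' -f 1)
# restore the module cache if an artifact with this key exists
argoexec get $GO_CACHE_KEY "$(go env GOMODCACHE)"
go build ./...
# save the (possibly updated) module cache under the same key
argoexec upload $GO_CACHE_KEY "$(go env GOMODCACHE)"
```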
Love this enhancement proposal? Give it a 👍. We prioritise the proposals with the most 👍.
Have you tried this? https://argoproj.github.io/argo-workflows/data-sourcing-and-transformation/
@terrytangyuan IIUC that feature is good for data manipulation/reads. How can I use it for storing the result of an install based on the digest of some file?
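For context, the closest workaround I can see today is to bypass the artifact machinery entirely and talk to the repository backend directly from the script. A rough sketch, assuming an S3-backed repository with credentials available in the pod (the bucket name and paths are made up):

```bash
CACHE_KEY=$(sha256sum package.json | cut -d ' ' -f 1)
# try to restore a previously uploaded node_modules for this digest
aws s3 cp "s3://my-artifact-bucket/npm-cache/$CACHE_KEY.tar.gz" /tmp/cache.tar.gz \
  && tar -xzf /tmp/cache.tar.gz
npm install
# publish the cache under the digest-based key for later runs
tar -czf /tmp/cache.tar.gz node_modules \
  && aws s3 cp /tmp/cache.tar.gz "s3://my-artifact-bucket/npm-cache/$CACHE_KEY.tar.gz"
```

This is exactly the kind of repository-specific boilerplate the proposed `argoexec get`/`argoexec upload` commands would avoid.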