
Access node logs as an artifact

sevberg opened this issue 4 years ago · 0 comments

Summary

Accessing a node's output via steps.pipeline.outputs.result works great when the node's stdout is simple, but it doesn't work as well when the stdout is voluminous or complex (for example, a traceback with lots of odd characters). It also does not include the contents of stderr. It would be nice if it were also possible to access a node's stdout/stderr as an artifact (e.g. under something like steps.pipeline.outputs.log_artifact), which could then be passed to follow-up tasks that process it, post it somewhere, or send out specialized notifications. A hypothetical sketch of what consuming this might look like is shown below.
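
This is a hypothetical sketch only: log_artifact does not exist in Argo Workflows today. It just illustrates how the requested feature might be consumed in a steps template, mirroring the existing {{steps.<name>.outputs.result}} syntax. The template names (run-pipeline, handle-logs) are illustrative.

```yaml
# Hypothetical: "outputs.log_artifact" is the proposed feature, not current Argo syntax.
- name: main
  steps:
    - - name: pipeline
        template: run-pipeline
    - - name: notify-on-error
        template: handle-logs
        arguments:
          artifacts:
            - name: logs
              # Proposed: stdout/stderr captured automatically by the controller,
              # with no explicit round-trip through external storage in the spec.
              from: "{{steps.pipeline.outputs.log_artifact}}"
```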

To note, this is in principle already possible with the built-in artifact system. But that requires first writing the logs to external storage (e.g. S3) and then reading them back in the next step, as sketched below. It would be nice if this were possible without leaving the k8s/argo domain.
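
A minimal sketch of the existing workaround, assuming an artifact repository (e.g. S3) is already configured for the cluster: the first step tees its stdout/stderr to a file and declares it as an output artifact; the second step reads it back as an input artifact. The images, template names, and my-pipeline-command are placeholders.

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: log-artifact-workaround-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: pipeline
            template: run-pipeline
        - - name: process-logs
            template: handle-logs
            arguments:
              artifacts:
                - name: logs
                  # Passed via the configured artifact repository (e.g. S3)
                  from: "{{steps.pipeline.outputs.artifacts.logs}}"
    - name: run-pipeline
      container:
        image: alpine:3.18
        command: [sh, -c]
        # Capture stdout and stderr into a file so it can be exported as an artifact
        args: ["my-pipeline-command 2>&1 | tee /tmp/run.log"]
      outputs:
        artifacts:
          - name: logs
            path: /tmp/run.log
    - name: handle-logs
      inputs:
        artifacts:
          - name: logs
            path: /tmp/run.log
      container:
        image: alpine:3.18
        command: [sh, -c]
        # Example post-processing: scan the captured log for errors
        args: ["grep -i error /tmp/run.log || true"]
```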

Use Cases

When would you use this?

  • Automatic processing of error logs
  • Posting logs into a Slack channel, or similar
  • Automatic notification of key individuals/teams depending on the contents of the logs

Message from the maintainers:

Love this enhancement proposal? Give it a 👍. We prioritise the proposals with the most 👍.

sevberg · Dec 09 '21 18:12