Andrei Zhlobich
Generated DAG files are not prefixed in any way, and during deployment the whole target DAG folder is cleared. This makes it harder to deploy multiple BF projects on a single Cloud Composer/Airflow....
Allow users to pass some extra ad hoc parameters (an untyped dict) to a workflow (via 'JobContext').
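A minimal sketch of how such ad hoc parameters could be consumed inside a job; `extra_params` is a hypothetical field on `JobContext`, and the `run(extra_params=...)` call is the requested extension, not the current API.

```python
import bigflow


class ReportJob:
    id = 'report_job'

    def execute(self, context: bigflow.JobContext):
        # Ad hoc, untyped parameters passed by the caller at run time
        # (hypothetical field, not part of the current JobContext).
        params = getattr(context, 'extra_params', None) or {}
        print('generating report for region:', params.get('region', 'all'))


workflow = bigflow.Workflow(workflow_id='reports', definition=[ReportJob()])
# Requested usage (does not work today):
# workflow.run(extra_params={'region': 'EU'})
```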
Current behaviour: DAG generation of a single workflow is skipped when its Python package cannot be imported. This leads to incomplete deploys (some DAGs are generated, some are not). Expected: fail...
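A hedged sketch of the expected fail-fast behaviour; `build_dags` and the `generate_dag_for` callback are illustrative names, not the real bigflow internals.

```python
import importlib
import sys


def build_dags(workflow_modules, generate_dag_for, strict=True):
    """Generate a DAG per workflow module, failing the whole build on import errors."""
    for module_name in workflow_modules:
        try:
            module = importlib.import_module(module_name)
        except ImportError as e:
            if strict:
                # Expected behaviour: abort the deploy instead of silently skipping.
                raise RuntimeError(f'cannot import workflow module {module_name!r}') from e
            print(f'WARNING: skipping {module_name}: {e}', file=sys.stderr)
            continue
        generate_dag_for(module)
```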
Pytest is much more user-friendly. 1. Better runner: colors, stack traces, code snippets, hides "garbage" stdout/stderr/logger output. 2. Concise tests: just `def test(): assert ...` instead of the JUnit approach. 3....
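An illustration of the difference; the `make_job_id` helper is a toy example, not bigflow code.

```python
import unittest


def make_job_id(name: str) -> str:
    # toy helper used only for the comparison
    return f'{name}_job'


# JUnit-style test:
class JobIdTest(unittest.TestCase):
    def test_job_id(self):
        self.assertEqual(make_job_id('report'), 'report_job')


# pytest-style test: a plain function and a bare assert.
def test_job_id():
    assert make_job_id('report') == 'report_job'
```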
Allow users to reuse bigflow workflows/jobs by using a custom "BigflowWorkflow" operator.
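A possible shape for such an operator; the `run()`/`run_job()` calls are assumptions about how a bigflow `Workflow` could be invoked from Airflow, not a confirmed API.

```python
from airflow.models import BaseOperator


class BigflowWorkflowOperator(BaseOperator):
    """Runs a whole bigflow workflow (or a single job) as one Airflow task."""

    def __init__(self, workflow, job_id=None, **kwargs):
        super().__init__(**kwargs)
        self.workflow = workflow
        self.job_id = job_id

    def execute(self, context):
        if self.job_id:
            self.workflow.run_job(self.job_id)  # assumed per-job entry point
        else:
            self.workflow.run()  # assumed whole-workflow entry point
```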
Currently logging and monitoring are optional extras. It is easy to forget to enable them, and they require some manual configuration. It might be good to softly impose their usage by making...
Metadata storage for bigflow jobs/workflows. There are several use cases for a simple document/key-value storage: 1. Save (append) information about executed workflows/jobs: ID, run time, docker hash, execution time, cost estimate, result, etc....
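A sketch of the kind of record such a store could append per job run, using SQLite purely as a stand-in backend; the function and field names are illustrative, not an existing bigflow API.

```python
import datetime
import json
import sqlite3


def record_run(db_path, workflow_id, job_id, docker_image_hash, result, cost_estimate=None):
    """Append one document describing a finished job run."""
    doc = {
        'workflow_id': workflow_id,
        'job_id': job_id,
        'run_time': datetime.datetime.utcnow().isoformat(),
        'docker_image_hash': docker_image_hash,
        'result': result,
        'cost_estimate': cost_estimate,
    }
    with sqlite3.connect(db_path) as conn:
        conn.execute('CREATE TABLE IF NOT EXISTS job_runs (key TEXT, doc TEXT)')
        conn.execute(
            'INSERT INTO job_runs VALUES (?, ?)',
            (f'{workflow_id}/{job_id}', json.dumps(doc)),
        )
```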
Add a way to create `bigflow.bigquery.DatasetManager` from a generic `bigflow.Config`. This may simplify the creation of mixed workflows (part written with BigQuery, part based on Dataproc, etc.).
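One possible shape for such a helper; `dataset_manager_from_config` is hypothetical, the property names it reads from `Config` are assumptions about a typical project layout, and it relies on `DatasetConfig` to actually build the manager.

```python
import bigflow
from bigflow.bigquery import DatasetConfig


def dataset_manager_from_config(config: bigflow.Config, env: str = None):
    # Assumed: Config.resolve() returns the resolved properties for the given environment.
    props = config.resolve(env)
    dataset_config = DatasetConfig(
        env=env,
        project_id=props['gcp_project_id'],      # assumed property name
        dataset_name=props['bigquery_dataset'],  # assumed property name
    )
    return dataset_config.create_dataset_manager()
```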
The scaffold may be used by wrapper scripts or in some automation environments. Real use case: it may be integrated into a script which regenerates a static project template (such a script needs to...
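A sketch of how a wrapper script could drive the scaffold non-interactively; the `--project-name` style flag is part of what this request asks for, not an existing `bigflow start-project` option.

```python
import subprocess


def regenerate_template(project_name: str, target_dir: str):
    """Re-run the bigflow scaffold from an automation script (hypothetical flag)."""
    subprocess.run(
        ['bigflow', 'start-project', '--project-name', project_name],
        cwd=target_dir,
        check=True,
    )
```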