Currently, a user can only specify custom params as hardcoded values:

```
wf = Workflow(
    "my-workflow",
    default_cluster=cluster,
    common_task_parameters={'my_key': 'my_value'}
)
```

And there is no way to pass params at deployment time...
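One possible workaround, sketched below, is to resolve the parameter values from environment variables set by the deployment pipeline instead of hardcoding them. This is not Brickflow API; the variable name `BRICKFLOW_MY_KEY` is a hypothetical convention:

```python
import os

# Sketch (not Brickflow's API): look up the value from the deployment
# environment, falling back to the previously hardcoded default.
# BRICKFLOW_MY_KEY is a hypothetical environment variable name.
common_task_parameters = {
    "my_key": os.environ.get("BRICKFLOW_MY_KEY", "my_value"),
}

# The dict is then passed to Workflow exactly as before:
# wf = Workflow("my-workflow", default_cluster=cluster,
#               common_task_parameters=common_task_parameters)
```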
Databricks Workflows now natively supports parent workflows that contain child workflows, similar to Airflow's DAGs and sub-DAGs. Refer to https://www.databricks.com/blog/modular-orchestration-databricks-workflows

**Cloud Information**
- [ ] AWS
- [ ]...
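For context, the Jobs 2.1 API feature behind this is the `run_job_task` task type, which lets one job trigger another. A minimal sketch of a parent job's payload (the job ID is a placeholder), expressed as the Python dict you would POST to `/api/2.1/jobs/create`:

```python
# Sketch of a Jobs 2.1 API payload for a parent job whose task triggers
# another job (the "child" workflow). The job_id value is a placeholder.
parent_job = {
    "name": "parent-workflow",
    "tasks": [
        {
            "task_key": "trigger_child",
            "run_job_task": {
                "job_id": 123,  # placeholder: job ID of the child workflow
            },
        }
    ],
}
```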
## Description
User can choose to deploy one or more workflows in a project when using local mode.

## Related Issue
https://github.com/Nike-Inc/brickflow/issues/111

## Motivation and Context

## How Has This Been...
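A hedged illustration of the behavior this PR describes (the helper and names below are hypothetical, not Brickflow's actual implementation): selective deployment amounts to filtering the project's registered workflows by name before deploying, deploying everything when no names are given.

```python
# Hypothetical sketch of the selection logic; illustrative only.
def select_workflows(all_workflows: dict, requested: list[str] | None) -> dict:
    """Return only the requested workflows; deploy everything if none given."""
    if not requested:
        return all_workflows
    missing = set(requested) - set(all_workflows)
    if missing:
        raise ValueError(f"Unknown workflows: {sorted(missing)}")
    return {name: wf for name, wf in all_workflows.items() if name in requested}
```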
## Is your feature request related to a problem? Please describe.
Sometimes a dataframe column contains a reference to an external URL instead of the value itself, e.g. in the below dataframe,...
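A sketch of one way to resolve such a column (assuming PySpark; the column name `payload_url` and the use of a Python UDF with `requests` are illustrative assumptions, not the requested feature itself):

```python
import requests
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.getOrCreate()

@udf(returnType=StringType())
def fetch_url(url: str) -> str:
    """Fetch the value that a URL-reference column points at (sketch only)."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.text

# `payload_url` is a hypothetical column holding the external reference.
df = spark.createDataFrame(
    [("row1", "https://example.com/value1.txt")], ["id", "payload_url"]
)
resolved = df.withColumn("payload", fetch_url("payload_url"))
```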
## Is your feature request related to a problem? Please describe.
...

## Describe the solution you'd like
...

## Describe alternatives you've considered
...

## Additional context
...
## Describe the bug
The Databricks build of Spark 3.5 supports SqlTransform on streaming dataframes, unlike the open source version of Apache Spark.

## Steps to Reproduce

## Expected behavior

## Screenshots

##...
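A minimal reproduction sketch, assuming the report refers to `pyspark.ml.feature.SQLTransformer` applied to a structured-streaming dataframe (on open source Spark, the transform/query below is where the failure would surface):

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import SQLTransformer

spark = SparkSession.builder.getOrCreate()

# A streaming dataframe from the built-in rate source.
stream_df = spark.readStream.format("rate").option("rowsPerSecond", 1).load()

# SQLTransformer rewrites rows via a SQL statement over the __THIS__ placeholder.
transformer = SQLTransformer(
    statement="SELECT timestamp, value, value * 2 AS doubled FROM __THIS__"
)

# Reportedly works on the Databricks Spark 3.5 runtime; on open source Spark
# this is where an unsupported-operation error would appear.
query = (
    transformer.transform(stream_df)
    .writeStream.format("console")
    .start()
)
```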