dbt-databricks
add python models session submission method
Problem
When executing a dbt python model, users must choose between an all-purpose cluster or a job cluster to run Python models (see docs).
This requirement limits the ability to execute dbt models inline within an existing notebook, forcing model execution to be triggered outside of Databricks.
In contrast, SQL models can leverage the session connection method, allowing them to be executed as part of an existing session. This separation of model logic from job cluster definitions enables orchestration systems to define clusters based on different considerations.
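For reference, this is roughly how the existing session connection method is configured for SQL models in dbt-spark (a minimal sketch; profile and schema names are illustrative):

```yaml
# profiles.yml — dbt-spark session method, which reuses the Spark session
# of the process running dbt instead of connecting to a remote cluster
my_profile:
  target: dev
  outputs:
    dev:
      type: spark
      method: session
      schema: my_schema  # illustrative schema name
```

The request below is for an analogous option on the Python model submission side.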
Request:
We propose introducing a similar session option for Python models. This feature would allow users to submit Python models to be executed within a given session, thereby decoupling model definitions from job cluster specifications.
Solution
The PR offers a new submission method: session. When this method is selected, the compiled code of the dbt Python model is executed in the same process dbt is running in, assuming a Spark session is available. This makes it the Python-model equivalent of the session connection method for SQL models.
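As a sketch of how a model might opt in, the example below uses dbt's standard Python model interface with `submission_method` set to `session` — the new value this PR proposes (the exact config key and value are assumptions until the change is merged; `upstream_model` is a hypothetical ref):

```python
# models/my_python_model.py — illustrative dbt Python model
def model(dbt, session):
    dbt.config(
        # Proposed option: execute in the current process's Spark session
        # instead of submitting to an all-purpose or job cluster.
        submission_method="session",
    )
    # `session` is the active SparkSession provided by dbt;
    # `dbt.ref` resolves an upstream model to a DataFrame.
    df = dbt.ref("upstream_model")
    return df
```

Because the model runs in the same process as dbt, this would allow a notebook that already holds a Spark session to execute Python models inline, as described above.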
Notes
Related issue in dbt-spark: link
Synced with @dkruh36