dbt_artifacts
Support for BigQuery adapter
Thank you @charles-astrafy! Would you mind adding a description covering any key design decisions or things to be aware of in this PR? That would be super helpful!
@NiallRees
Main design decisions:
- In each of the "create_" macros, a BigQuery version has been added.
- The biggest change is in the "upload_" macros: the existing code could not accommodate BigQuery syntax seamlessly. The main macro name in each "upload_" file is unchanged, but its body now dispatches to adapter-specific macros that return the SQL used to insert the records (see the sketch after this list).
- The "insert_into_metadata" macro has also been decoupled and moved inside the "upload_results" macro to keep the code DRY.
- A guard has been added to the "insert_into_metadata_table" macro so that it skips the insert when the DML SQL is empty.
All those changes are in commit 0d5e732.
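For context, the dispatch pattern plus the empty-DML guard look roughly like this; the macro names, arguments, and columns below are illustrative, not the package's exact API:

```sql
{# Sketch only: names and signatures are assumptions, not dbt_artifacts' exact macros. #}

{# Each "upload_" macro keeps its public name and dispatches per adapter for the DML body. #}
{% macro get_model_executions_dml_sql(results) %}
    {{ return(adapter.dispatch('get_model_executions_dml_sql', 'dbt_artifacts')(results)) }}
{% endmacro %}

{% macro default__get_model_executions_dml_sql(results) %}
    {% set rows = [] %}
    {% for result in results %}
        {% do rows.append("('" ~ result.node.unique_id ~ "', " ~ result.execution_time ~ ")") %}
    {% endfor %}
    {{ return(rows | join(', ')) }}
{% endmacro %}

{% macro bigquery__get_model_executions_dml_sql(results) %}
    {# BigQuery renders its own values, e.g. with explicit casts #}
    {% set rows = [] %}
    {% for result in results %}
        {% do rows.append("('" ~ result.node.unique_id ~ "', cast(" ~ result.execution_time ~ " as float64))") %}
    {% endfor %}
    {{ return(rows | join(', ')) }}
{% endmacro %}

{# Shared insert helper: skip the query entirely when the DML body is empty. #}
{% macro insert_into_metadata_table(relation, content) %}
    {% if content | trim != '' %}
        {% do run_query("insert into " ~ relation ~ " values " ~ content) %}
    {% endif %}
{% endmacro %}
```

This keeps a single public entry point per "upload_" file while letting each adapter render its own values syntax.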
Other commit details:
- 8f8286b: incremental logic in the staging tables. Useful if you want to materialize your staging tables; the logic is present in the SQL but won't be applied if you keep those models as views (see the sketch below).
- 40cc3ef: add a bytes_processed column to the model_executions models, which is relevant information for BigQuery.
- 628f680: add configurations for the BigQuery integration tests.
- 6b9d101: some README updates.
- bc9ea30: only populate the bytes_processed column when the adapter type is BigQuery.
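As an illustration of 8f8286b and bc9ea30, a staging model could combine the incremental filter with an adapter-conditional bytes_processed column roughly like this (the column and ref names are assumptions, not the package's actual schema; the materialization is left to the project config):

```sql
select
    command_invocation_id,
    node_id,
    run_started_at,
    execution_time,
    {% if target.type == 'bigquery' %}
    bytes_processed
    {% else %}
    cast(null as numeric) as bytes_processed
    {% endif %}
from {{ ref('model_executions') }}
{% if is_incremental() %}
-- is_incremental() is only true when the model is materialized as incremental
-- and already exists, so this filter is simply skipped if the staging models stay as views
where run_started_at > (select max(run_started_at) from {{ this }})
{% endif %}
```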
Let me know if you need any further explanation, and we can jump on a call if you prefer.
Hey @charles-astrafy, I left a couple of comments and got the BQ integration tests running now. Let me know if I can be of any help in getting this over the line, and I appreciate your patience!
Thank you for your comments. I'll block some time this weekend to work on it and will keep you posted as soon as those small changes are done.
Great job on the BQ integration tests.