Doug Beatty
Thanks for giving that a try @kevin-chao ! I tried it myself and was able to see the same thing as you. It seems like the missing piece might be...
As @ChenyuLInx already mentioned, the snippet [here](https://github.com/dbt-labs/dbt-core/blob/8a1b9276f96def2871528572928e04c4414de425/core/dbt/task/retry.py#L68) shows why providing `--state` works. I did a write-up in https://github.com/dbt-labs/dbt-core/issues/9575#issuecomment-1947598817 to give examples of why we should enable `DBT_TARGET_PATH` and `--target-path`. If we...
Dragons, indeed 🐉 -- thanks for raising this @dataders. ~If we emit a warning to the logs, a user could use the [`warn_error_options`](https://docs.getdbt.com/reference/global-configs/warnings) config to exclude that warning. But it...
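For context, excluding a specific warning via `warn_error_options` looks roughly like this in `dbt_project.yml` (a sketch only; `SomeWarningName` is a placeholder, not an actual dbt event name):

```yaml
# dbt_project.yml (sketch -- SomeWarningName is a hypothetical warning name)
flags:
  warn_error_options:
    include: all        # treat all warnings as errors...
    exclude:            # ...except the ones listed here
      - SomeWarningName
```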
Thanks for reaching out @kokorin ! The crux is that it's tricky, within the dbt implementation details, to determine the difference between an _explicit_ null / none and an...
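The distinction is easy to see with a small sketch (plain Python over a YAML-like dict, not dbt's actual parsing code -- all names here are illustrative): a key explicitly set to null and a key that is simply absent both read back as `None` via `.get()`, so a naive lookup can't tell "disable freshness" apart from "inherit freshness".

```python
# Hypothetical sketch: explicit null vs. missing key in a parsed YAML config.
# These dicts and the helper below are illustrative, not dbt internals.

table_with_explicit_null = {"name": "my_table", "freshness": None}
table_with_key_absent = {"name": "my_table"}

# A naive .get() lookup makes the two cases indistinguishable:
print(table_with_explicit_null.get("freshness"))  # None
print(table_with_key_absent.get("freshness"))     # None

def freshness_setting(table_config, source_freshness):
    """Explicit null disables freshness; an absent key inherits the source-level value."""
    if "freshness" in table_config:
        # Key is present -- honor it, even if it is an explicit None (disable).
        return table_config["freshness"]
    # Key is absent -- fall back to the source-level default.
    return source_freshness

print(freshness_setting(table_with_explicit_null, {"warn_after": "12h"}))  # None
print(freshness_setting(table_with_key_absent, {"warn_after": "12h"}))     # {'warn_after': '12h'}
```

The key-membership check (`"freshness" in table_config`) is what distinguishes the two cases; any merge logic built on `.get()` alone collapses them.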
> ... behavior is different for freshness and loaded_at_field properties.
> Freshness set to explicit null at table level overwrites `freshness` at source level, while `loaded_at_field` does not.

Thanks for...
> I can work on pull request if you consider it possible. That would be awesome if you work on a pull request for this @kokorin 🏆 @graciegoheen and I...
@ChenyuLInx yeah, it makes good sense to do both bigquery and snowflake. And also consider the number of objects within the schema. 👍
Thanks for raising this @jelstongreen ! As you noted, the current blocker to upgrading the Dockerfile in dbt-core to use python 3.11 is that dbt-spark is not currently compatible with...
Since this is resolved, restoring python:3.11 by reverting https://github.com/dbt-labs/dbt-core/pull/8445 should be unblocked: - https://github.com/dbt-labs/dbt-spark/issues/864
I keep a collection of terminology [here](https://github.com/dbeatty10/Phippy-Goes-Fact-Finding?tab=readme-ov-file#cross-walk-of-temporal-terminology). I should add those bi-temporal definitions for "event time" and "processing time" from _Spark: The Definitive Guide_! I have a hard time...