matt-winkler

Results: 20 comments of matt-winkler

Chiming in that I'm very aligned with the consumer project importing in a separate space (a new thing) from `dbt_project.yml` and `packages.yml` - maybe a `datasets.yml`, or maybe that's where the `contracts.yml` fits. It...
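
Purely as an illustration (none of these file names or keys exist in dbt today), I'm picturing the consumer-side file looking something like:

```
# datasets.yml / contracts.yml -- hypothetical consumer-side import file, separate from
# dbt_project.yml and packages.yml; every name and key here is illustrative only
imports:
  - project: jaffle_shop_core       # hypothetical upstream / producer project
    contracts:
      - fct_orders                  # published nodes this project depends on
      - dim_customers
```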

```
# In a schema.yml, dbt_project.yml, or .sql file with node configs; schema.yml only shown for simplicity.
models:
  - name: my_first_dbt_model
    description: "A starter dbt model"
    contracts:
      published: true
```
...

- https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/contracts/graph/manifest.py
- https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/parser/manifest.py

Envisioned process:
* Contract configs are set in the `dbt_project.yml`, `schema.yml`, or `model.sql` files (a project-level sketch is below).
* As part of project parsing in an upstream project, dbt identifies when there are node contracts...
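
To make the first bullet concrete, the project-level variant of the same (hypothetical) contracts config could look like:

```
# dbt_project.yml -- hypothetical project-level equivalent of the schema.yml example above
models:
  my_upstream_project:        # illustrative project / folder names
    marts:
      +contracts:
        published: true
```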

Bit of devil's advocate thought - do we already have a sufficient producer mechanism (or the start of it without reinventing) with exposures? Apologies if I missed this discussion somewhere.

What would be lost by disallowing env-aware logic outside the `dbt_project.yml`?
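
For concreteness, by env-aware logic inside `dbt_project.yml` I mean something like an `env_var()` call in a config value (project and variable names are just examples):

```
# dbt_project.yml -- env-aware logic expressed inside the project file
models:
  my_project:
    +schema: "{{ env_var('DBT_TARGET_SCHEMA', 'dev') }}"
```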

Hey @b-per, to be clear, the request is to enable setting this at the **job level** vs. the connection level. The PR you linked appears to be connection-level. Seems...
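
For reference, the payload itself would be the same shape as today's extended attributes YAML (values are made up), just scoped to a single job:

```
# hypothetical job-level extended attributes payload -- plain profile-field overrides
warehouse: TRANSFORMING_WH_XL
role: CI_RUNNER
```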

Hey @b-per, can you confirm [this](https://github.com/dbt-labs/terraform-provider-dbtcloud/blob/main/examples/resources/dbtcloud_extended_attributes/resource.tf) is the extended attributes setting at the environment level? Looks like it is; I'd just appreciate a quick yes / no.

Would it be reasonable to enable an additional configuration on the `table`, `snapshot`, and `incremental` materializations to `clone_from` another database location / environment? Going the materialization route enables parallelism. It also...

For Snowflake:

```
-- in models/fct_orders.sql
{{ config(
    materialized = 'table',
    tags=['finance'],
    clone_from={'database': 'analytics_mwinkler_dbt_workspace', 'schema': 'dbt_mwinkler'}
) }}
```

```
-- in macros/materializations/snowflake__create_table_as.sql
{% macro snowflake__create_table_as(temporary, relation, compiled_code, language='sql') -%}
...
```
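
The second block gets cut off above; as a rough sketch of the idea (simplified, not the real dbt-snowflake macro, and assuming the hypothetical `clone_from` config from the model file), the branch could look something like:

```
-- simplified sketch only: `clone_from` is a hypothetical config, and the real
-- snowflake__create_table_as also handles transient tables, copy grants, etc.
{% macro snowflake__create_table_as(temporary, relation, compiled_code, language='sql') -%}
  {%- set clone_from = config.get('clone_from') -%}
  {%- if clone_from is not none -%}
    -- zero-copy clone from another database / environment instead of rebuilding
    create or replace table {{ relation }}
      clone {{ clone_from['database'] }}.{{ clone_from['schema'] }}.{{ relation.identifier }}
  {%- else -%}
    -- normal build path
    create or replace {% if temporary %}temporary {% endif %}table {{ relation }} as (
      {{ compiled_code }}
    )
  {%- endif -%}
{%- endmacro %}
```

Since `clone` is a metadata-only operation in Snowflake, the clone branch should be cheap and could run for many models in parallel.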