[Bug] Cannot Update DynamicTable
Is this a new bug in dbt-snowflake?
- [X] I believe this is a new bug in dbt-snowflake
- [X] I have searched the existing issues, and I could not find an existing issue for this bug
Current Behavior
When applying dbt run to an existing model with materialized = 'dynamic_table', it fails with the following error message:
SnowflakeDynamicTableConfig.__init__() missing 6 required positional arguments: 'name', 'schema_name', 'database_name', 'query', 'target_lag', and 'snowflake_warehouse'
> in macro dynamic_table_get_build_sql (macros/materializations/dynamic_table.sql)
> called by macro materialization_dynamic_table_snowflake (macros/materializations/dynamic_table.sql)
> called by model XXXXXX
Expected Behavior
Generate an alter XXXX command to update the configuration.
After investigating, I found that this query returns column names in uppercase, but the code here looks them up in lowercase, so it returns None for the required configuration values.
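The mismatch can be sketched as follows (the values are hypothetical, for illustration only): when the account folds quoted identifiers to uppercase, the result_scan columns come back as TARGET_LAG rather than target_lag, and a lowercase lookup finds nothing.

```python
# Hypothetical rows illustrating the case mismatch. With
# QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE, Snowflake folds even quoted
# identifiers to uppercase, so the selected columns arrive uppercased.
row_flag_off = {"name": "TEST_DYNAMIC_TABLE2", "target_lag": "12 hours"}
row_flag_on = {"NAME": "TEST_DYNAMIC_TABLE2", "TARGET_LAG": "12 hours"}

# The adapter looks values up with lowercase keys:
print(row_flag_off.get("target_lag"))  # 12 hours
print(row_flag_on.get("target_lag"))   # None -> the config constructor gets no arguments
```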
Steps To Reproduce
- Deploy a model with materialized = 'dynamic_table'
- Run again
Relevant log output
No response
Environment
- OS: Ubuntu 22.04
- Python: 3.11
- dbt-core: 1.8.5
- dbt-snowflake: 1.8.3
Additional Context
No response
@kds1010 I'm struggling to recreate this issue. Per your message - does this mean you created a dynamic table successfully on the initial run, and then it fails on subsequent runs? That error message seems specific to a column name.
Would you be able to provide a sample of your dynamic table code? Did you also change from a different materialization (like table) over to dynamic table?
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please comment on the issue or else it will be closed in 7 days.
Closing this issue for now due to lack of response. Can reopen if things change
@amychen1776 Sorry for missing the context and my late reply. I had a bit of time, tested it again, and reproduced the issue (with the latest 1.9.0, too).
does this mean that you create a dynamic table, successfully created it on the initial run and then subsequent runs, it fails?
Yes, it was created successfully on the initial run but failed in subsequent runs.
Did you also change from a different materialization (like table) over to dynamic table?
I tried both changing the materialization and creating a new table, but there was no difference.
When I debugged the code, it seemed that the show dynamic tables like ~~ with result_scan returns each row as a tuple.
But parse_relation_results tries to get the parameters from a dict by key, and it returns all parameters as None since each dynamic_table row is just a tuple without keys.
https://github.com/dbt-labs/dbt-snowflake/blob/b687ac477a4e816ab5ea381a3685da1dc886ff87/dbt/adapters/snowflake/relation_configs/dynamic_table.py#L107
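A minimal sketch of that failure mode (the names are illustrative, not the actual adapter internals): if a row arrives as a plain tuple rather than a keyed mapping, a dict-style lookup by column name silently yields None for every field.

```python
# A row returned as a bare tuple has no column names to look up.
row_as_tuple = ("TEST_DYNAMIC_TABLE2", "SCHEMA", "DB", "select ...", "12 hours")

def get_config_value(row, key):
    # parse_relation_results-style lookup: works for mappings, not tuples
    return row.get(key) if hasattr(row, "get") else None

print(get_config_value(row_as_tuple, "target_lag"))            # None
print(get_config_value({"target_lag": "12 hours"}, "target_lag"))  # 12 hours
```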
Let me try to create sharable DDL.
I could reproduce the issue with the following simple definition, too.
{{ config(
materialized = 'dynamic_table',
snowflake_warehouse = 'medium',
target_lag = '12 hours',
)
}}
select
test_val
from
{{ ref('test_base') }}
The test_base model is simple too:
{{ config(materialized = 'table') }}
select
1 as test_val
Even using your examples, I cannot seem to recreate the error. Could you share the logs of the initial run and then the subsequent one when it errors out, removing any private information?
@amychen1776 Thanks for your quick action. I'm using dbt-core, and the following is the log. If it's not enough, could you recommend a command to capture more detail?
- Initial run (Success)
15:28:22 On model.test_base: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.test_base"} */
create or replace table db.schema.test_base
as
(
select
1 as test_val
);
15:28:23 SQL status: SUCCESS 1 in 0.865 seconds
15:28:23 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '51f8a887-87e1-4b70-8c69-83a9ecd6e12d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x79ee53fec7d0>]}
15:28:23 1 of 2 OK created sql table model schema.test_base ......................... [SUCCESS 1 in 0.91s]
15:28:23 Finished running node model.test_base
15:28:23 Began running node model.test_dynamic_table2
15:28:23 2 of 2 START sql dynamic_table model schema.test_dynamic_table2 ............ [RUN]
15:28:23 Acquiring new snowflake connection 'model.test_dynamic_table2'
15:28:23 Began compiling node model.test_dynamic_table2
15:28:23 Writing injected SQL for node "model.test_dynamic_table2"
15:28:23 Began executing node model.test_dynamic_table2
15:28:23 Applying CREATE to: db.schema.test_dynamic_table2
15:28:23 Writing runtime sql for node "model.test_dynamic_table2"
15:28:23 Using snowflake connection "model.test_dynamic_table2"
15:28:23 On model.test_dynamic_table2: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.ecp.test_dynamic_table2"} */
create dynamic table db.schema.test_dynamic_table2
target_lag = '12 hours'
warehouse = xsmall
refresh_mode = AUTO
initialize = ON_CREATE
as (
select
test_val
from
db.schema.test_base
)
15:28:23 Opening a new connection, currently in state init
15:28:26 SQL status: SUCCESS 1 in 2.228 seconds
- Subsequent run (failed)
15:33:59 1 of 2 START sql table model schema.test_base .............................. [RUN]
15:33:59 Re-using an available connection from the pool (formerly XXXX, now model.test_base)
15:33:59 Began compiling node model.test_base
15:34:00 Writing injected SQL for node "model.test_base"
15:34:00 Began executing node model.test_base
15:34:00 Writing runtime sql for node "model.test_base"
15:34:00 Using snowflake connection "model.test_base"
15:34:00 On model.test_base: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.test_base"} */
create or replace table db.schema.test_base
as
(
select
1 as test_val
);
15:34:02 SQL status: SUCCESS 1 in 2.172 seconds
15:34:02 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '0a822d05-fe68-4b90-9955-cb290915e48a', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x725d7ebb9f90>]}
15:34:02 1 of 2 OK created sql table model schema.test_base ......................... [SUCCESS 1 in 2.23s]
15:34:02 Finished running node model.test_base
15:34:02 Began running node model.test_dynamic_table2
15:34:02 2 of 2 START sql dynamic_table model schema.test_dynamic_table2 ............ [RUN]
15:34:02 Acquiring new snowflake connection 'model.test_dynamic_table2'
15:34:02 Began compiling node model.test_dynamic_table2
15:34:02 Writing injected SQL for node "model.test_dynamic_table2"
15:34:02 Began executing node model.test_dynamic_table2
15:34:02 Using snowflake connection "model.test_dynamic_table2"
15:34:02 On model.test_dynamic_table2: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.test_dynamic_table2"} */
show dynamic tables
like 'TEST_DYNAMIC_TABLE2'
in schema DB.SCHEMA
;
15:34:02 Opening a new connection, currently in state init
15:34:03 SQL status: SUCCESS 1 in 0.922 seconds
15:34:03 Using snowflake connection "model.test_dynamic_table2"
15:34:03 On model.test_dynamic_table2: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.test_dynamic_table2"} */
select
"name",
"schema_name",
"database_name",
"text",
"target_lag",
"warehouse",
"refresh_mode"
from table(result_scan(last_query_id()))
15:34:03 SQL status: SUCCESS 1 in 0.636 seconds
15:34:03 Using snowflake connection "model.test_dynamic_table2"
15:34:03 On model.test_dynamic_table2: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.test_dynamic_table2"} */
show iceberg tables
like 'TEST_DYNAMIC_TABLE2'
in schema DB.SCHEMA
;
15:34:04 SQL status: SUCCESS 0 in 0.192 seconds
15:34:04 Using snowflake connection "model.test_dynamic_table2"
15:34:04 On model.test_dynamic_table2: /* {"app": "dbt", "dbt_version": "1.9.1", "profile_name": "XXX", "target_name": "dev", "node_id": "model.test_dynamic_table2"} */
select
"catalog_name",
"external_volume_name",
"base_location"
from table(result_scan(last_query_id()))
15:34:04 SQL status: SUCCESS 0 in 0.399 seconds
15:34:04 Compilation Error in model test_dynamic_table2 (models/debug/test_dynamic_table2.sql)
SnowflakeDynamicTableConfig.__init__() missing 6 required positional arguments: 'name', 'schema_name', 'database_name', 'query', 'target_lag', and 'snowflake_warehouse'
> in macro dynamic_table_get_build_sql (macros/materializations/dynamic_table.sql)
> called by macro materialization_dynamic_table_snowflake (macros/materializations/dynamic_table.sql)
> called by model test_dynamic_table2 (models/debug/test_dynamic_table2.sql)
15:34:04 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '0a822d05-fe68-4b90-9955-cb290915e48a', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x725d4199ff10>]}
15:34:04 2 of 2 ERROR creating sql dynamic_table model perception.test_dynamic_table2 ... [ERROR in 2.20s]
15:34:04 Finished running node model.test_dynamic_table2
15:34:04 Marking all children of 'model.test_dynamic_table2' to be skipped because of status 'error'. Reason: Compilation Error in model test_dynamic_table2 (models/debug/test_dynamic_table2.sql)
SnowflakeDynamicTableConfig.__init__() missing 6 required positional arguments: 'name', 'schema_name', 'database_name', 'query', 'target_lag', and 'snowflake_warehouse'
> in macro dynamic_table_get_build_sql (macros/materializations/dynamic_table.sql)
> called by macro materialization_dynamic_table_snowflake (macros/materializations/dynamic_table.sql)
> called by model test_dynamic_table2 (models/debug/test_dynamic_table2.sql).
@amychen1776 ,
We have enable_iceberg_materializations: true enabled, and I tried removing it, but the results were the same...
It did correctly remove the iceberg catalog query, though.
Thank you for your help.
Is this an iceberg dynamic table? Could you provide your exact model configuration for both the dynamic table and the table? Removing that behavior flag should not change anything outside of preventing you from creating a dynamic table in iceberg format.
Could you try to run dbt run -s test_dynamic_table2 --full-refresh and then dbt run -s test_dynamic_table2
Is this an iceberg dynamic table?
No, it's just a regular dynamic table.
with --full-refresh
00:30:41 1 of 1 START hook: elementary.on-run-start.0 ................................... [RUN]
00:30:41 1 of 1 OK hook: elementary.on-run-start.0 ...................................... [OK in 0.02s]
00:30:41
00:30:41 1 of 1 START sql dynamic_table model schema.test_dynamic_table2 ............ [RUN]
00:30:44 1 of 1 OK created sql dynamic_table model schema.test_dynamic_table2 ....... [SUCCESS 1 in 2.19s]
00:30:44
00:30:44 1 of 3 START hook: on-run-end.0 ............................................ [RUN]
00:30:44 1 of 3 OK hook: on-run-end.0 ............................................... [OK in 0.00s]
00:30:44 Running dbt Constraints
00:30:44 Finished dbt Constraints
00:30:44 2 of 3 START hook: dbt_constraints.on-run-end.1 ................................ [RUN]
00:30:44 2 of 3 OK hook: dbt_constraints.on-run-end.1 ................................... [OK in 0.14s]
00:30:49 3 of 3 START hook: elementary.on-run-end.2 ..................................... [RUN]
00:30:49 3 of 3 OK hook: elementary.on-run-end.2 ........................................ [OK in 5.61s]
00:30:51
00:30:51 Finished running 1 dynamic table model, 4 project hooks in 0 hours 0 minutes and 15.12 seconds (15.12s).
00:30:51
00:30:51 Completed successfully
00:30:51
00:30:51 Done. PASS=5 WARN=0 ERROR=0 SKIP=0 TOTAL=5
second run
00:32:46 1 of 1 START hook: elementary.on-run-start.0 ................................... [RUN]
00:32:46 1 of 1 OK hook: elementary.on-run-start.0 ...................................... [OK in 0.03s]
00:32:46
00:32:46 1 of 1 START sql dynamic_table model schema.test_dynamic_table2 ............ [RUN]
00:32:49 1 of 1 ERROR creating sql dynamic_table model schema.test_dynamic_table2 ... [ERROR in 2.91s]
00:32:49
00:32:49 1 of 3 START hook: on-run-end.0 ............................................ [RUN]
00:32:49 1 of 3 OK hook: on-run-end.0 ............................................... [OK in 0.01s]
00:32:49 Running dbt Constraints
00:32:49 Finished dbt Constraints
00:32:49 2 of 3 START hook: dbt_constraints.on-run-end.1 ................................ [RUN]
00:32:49 2 of 3 OK hook: dbt_constraints.on-run-end.1 ................................... [OK in 0.14s]
00:32:53 3 of 3 START hook: elementary.on-run-end.2 ..................................... [RUN]
00:32:53 3 of 3 OK hook: elementary.on-run-end.2 ........................................ [OK in 4.49s]
00:32:55
00:32:55 Finished running 1 dynamic table model, 4 project hooks in 0 hours 0 minutes and 12.81 seconds (12.81s).
00:32:55
00:32:55 Completed with 1 error, 0 partial successes, and 0 warnings:
00:32:55
00:32:55 Compilation Error in model test_dynamic_table2 (models/debug/test_dynamic_table2.sql)
SnowflakeDynamicTableConfig.__init__() missing 6 required positional arguments: 'name', 'schema_name', 'database_name', 'query', 'target_lag', and 'snowflake_warehouse'
> in macro dynamic_table_get_build_sql (macros/materializations/dynamic_table.sql)
> called by macro materialization_dynamic_table_snowflake (macros/materializations/dynamic_table.sql)
> called by model test_dynamic_table2 (models/debug/test_dynamic_table2.sql)
00:32:55
00:32:55 Done. PASS=4 WARN=0 ERROR=1 SKIP=0 TOTAL=5
@kds1010 thank you for sharing. Would you be able to give us the logs for the second run without the iceberg behavior flag on (similar to what you shared above)? Also, after the initial run, can you confirm the DT was created in Snowflake before your second run?
Unfortunately, neither our integration tests, myself, nor our engineer have been able to recreate this bug, and I haven't gotten reports from any other users experiencing the same thing. I also ran this on 1.9 with no issue. To be honest, I'm a bit flummoxed and not quite sure what the next steps should be, outside of recommending a clean dbt project to rule out an issue with your dbt install.
@amychen1776 Let me prepare the log. The DT itself was created and updated successfully according to the lag.
Thank you for confirming the situation. I found a post in the dbt Community here, but they didn't get a meaningful reply either.
@amychen1776 ,
Oh, I've found the solution.
Our account was configured with QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE, and when I turned it to FALSE, the issue was resolved.
Could you check if you can reproduce it in your env?
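For anyone hitting the same thing, one defensive approach (a sketch under my own assumptions, not the adapter's actual fix) would be to normalize row keys to lowercase before lookup, so parsing works whether or not QUOTED_IDENTIFIERS_IGNORE_CASE folds quoted identifiers to uppercase:

```python
# Normalize column names to lowercase so downstream lowercase lookups
# succeed regardless of the account's identifier-casing setting.
def normalize_row(row: dict) -> dict:
    return {key.lower(): value for key, value in row.items()}

uppercased = {"NAME": "TEST_DYNAMIC_TABLE2", "TARGET_LAG": "12 hours"}
print(normalize_row(uppercased).get("target_lag"))  # 12 hours
```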
There it is! Snowflake quoting at fault again. I was finally able to recreate the error. Thank you so much for all of your hard work looking into this! I'm going to update our documentation to call this out as a known issue.
What if I cannot/really don't want to set QUOTED_IDENTIFIERS_IGNORE_CASE = false? We can get into the details of why an org would want to or not, but I don't think that matters. I think it would make sense if a dynamic table can be updated without a full refresh regardless of my snowflake account configuration.
@tommy-instance I think that's a very fair response. I personally do think it should be possible to update it --> this issue is at a metadata level. We do have a bigger spike we need to do about quoting that is ultimately meant to tackle these papercuts.