dbt-snowflake
[Bug] Dynamic tables with Snowflake change bundle `2024_03` result in an "object already exists" error
Is this a new bug in dbt-snowflake?
- [X] I believe this is a new bug in dbt-snowflake
- [X] I have searched the existing issues, and I could not find an existing issue for this bug
Current Behavior
Originally reported in Slack community: https://getdbt.slack.com/archives/CJN7XRF1B/p1713145308135909 but filing an official repro.
When using Snowflake change bundle `2024_03` (https://docs.snowflake.com/en/release-notes/bcr-bundles/2024_03_bundle), dbt issues a `drop table ...` statement for dynamic tables, which results in an "object already exists" database error. When the change bundle is disabled, there is no such `drop table ...` statement and thus no database error.
Expected Behavior
The behavior should stay the same even when change bundle `2024_03` is enabled.
Steps To Reproduce
dbt project setup
# dbt_project.yml
name: my_dbt_project
profile: all
config-version: 2
version: "1.0.0"
models:
my_dbt_project:
+materialized: table
Bundle disabled behaviour
-- run on snowflake
USE ROLE ACCOUNTADMIN;
SELECT SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES();
-- [{"name":"2024_02","isDefault":true,"isEnabled":true},{"name":"2024_03","isDefault":false,"isEnabled":false}];
-- SELECT SYSTEM$DISABLE_BEHAVIOR_CHANGE_BUNDLE('2024_03'); -- run this if the 2024_03 bundle is enabled.
Add a net-new dynamic table (or just make sure it doesn't yet exist) and build twice:
-- models/foo_2024_03_disabled.sql
{{
config(
materialized = 'dynamic_table',
on_configuration_change = 'apply',
target_lag = '1 minutes',
snowflake_warehouse = 'analytics'
)
}}
select * from development_jyeo.dbt_jyeo.foo
$ dbt --debug run
08:25:02 Began running node model.my_dbt_project.foo_2024_03_disabled
08:25:02 1 of 1 START sql dynamic_table model dbt_jyeo.foo_2024_03_disabled ............. [RUN]
08:25:02 Re-using an available connection from the pool (formerly list_development_jyeo_dbt_jyeo, now model.my_dbt_project.foo_2024_03_disabled)
08:25:02 Began compiling node model.my_dbt_project.foo_2024_03_disabled
08:25:02 Writing injected SQL for node "model.my_dbt_project.foo_2024_03_disabled"
08:25:02 Timing info for model.my_dbt_project.foo_2024_03_disabled (compile): 16:25:02.600507 => 16:25:02.610869
08:25:02 Began executing node model.my_dbt_project.foo_2024_03_disabled
08:25:02 Applying CREATE to: development_jyeo.dbt_jyeo.foo_2024_03_disabled
08:25:02 Writing runtime sql for node "model.my_dbt_project.foo_2024_03_disabled"
08:25:02 Using snowflake connection "model.my_dbt_project.foo_2024_03_disabled"
08:25:02 On model.my_dbt_project.foo_2024_03_disabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_disabled"} */
create dynamic table development_jyeo.dbt_jyeo.foo_2024_03_disabled
target_lag = '1 minutes'
warehouse = analytics
as (
select * from development_jyeo.dbt_jyeo.foo
)
08:25:02 Opening a new connection, currently in state closed
08:25:06 SQL status: SUCCESS 1 in 4.0 seconds
08:25:06 Timing info for model.my_dbt_project.foo_2024_03_disabled (execute): 16:25:02.612322 => 16:25:06.604083
08:25:06 On model.my_dbt_project.foo_2024_03_disabled: Close
08:25:07 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '23f4c802-d7e6-4dec-ae6a-d297ebc9905b', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x127173550>]}
08:25:07 1 of 1 OK created sql dynamic_table model dbt_jyeo.foo_2024_03_disabled ........ [SUCCESS 1 in 4.86s]
$ dbt --debug run
08:25:53 1 of 1 START sql dynamic_table model dbt_jyeo.foo_2024_03_disabled ............. [RUN]
08:25:53 Re-using an available connection from the pool (formerly list_development_jyeo_dbt_jyeo, now model.my_dbt_project.foo_2024_03_disabled)
08:25:53 Began compiling node model.my_dbt_project.foo_2024_03_disabled
08:25:53 Writing injected SQL for node "model.my_dbt_project.foo_2024_03_disabled"
08:25:53 Timing info for model.my_dbt_project.foo_2024_03_disabled (compile): 16:25:53.799679 => 16:25:53.810452
08:25:53 Began executing node model.my_dbt_project.foo_2024_03_disabled
08:25:53 Using snowflake connection "model.my_dbt_project.foo_2024_03_disabled"
08:25:53 On model.my_dbt_project.foo_2024_03_disabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_disabled"} */
show dynamic tables
like 'FOO_2024_03_DISABLED'
in schema DEVELOPMENT_JYEO.DBT_JYEO
;
08:25:53 Opening a new connection, currently in state closed
08:25:55 SQL status: SUCCESS 1 in 2.0 seconds
08:25:55 Using snowflake connection "model.my_dbt_project.foo_2024_03_disabled"
08:25:55 On model.my_dbt_project.foo_2024_03_disabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_disabled"} */
select
"name",
"schema_name",
"database_name",
"text",
"target_lag",
"warehouse"
from table(result_scan(last_query_id()))
08:25:56 SQL status: SUCCESS 1 in 1.0 seconds
08:25:56 Applying ALTER to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_DISABLED"
08:25:56 Applying UPDATE TARGET_LAG to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_DISABLED"
08:25:56 Applying UPDATE WAREHOUSE to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_DISABLED"
08:25:56 Writing runtime sql for node "model.my_dbt_project.foo_2024_03_disabled"
08:25:56 Using snowflake connection "model.my_dbt_project.foo_2024_03_disabled"
08:25:56 On model.my_dbt_project.foo_2024_03_disabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_disabled"} */
alter dynamic table "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_DISABLED" set
target_lag = '1 minutes'
warehouse = analytics
08:25:57 SQL status: SUCCESS 1 in 0.0 seconds
08:25:57 Timing info for model.my_dbt_project.foo_2024_03_disabled (execute): 16:25:53.811882 => 16:25:57.329397
08:25:57 On model.my_dbt_project.foo_2024_03_disabled: Close
08:25:58 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'b6cc8575-9748-4550-b15c-87ba8a710064', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x11f43d650>]}
08:25:58 1 of 1 OK created sql dynamic_table model dbt_jyeo.foo_2024_03_disabled ........ [SUCCESS 1 in 4.34s]
Bundle enabled behaviour
-- run on snowflake
USE ROLE ACCOUNTADMIN;
SELECT SYSTEM$ENABLE_BEHAVIOR_CHANGE_BUNDLE('2024_03');
SELECT SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES();
-- [{"name":"2024_02","isDefault":true,"isEnabled":true},{"name":"2024_03","isDefault":false,"isEnabled":true}]
-- models/foo_2024_03_enabled.sql
{{
config(
materialized = 'dynamic_table',
on_configuration_change = 'apply',
target_lag = '1 minutes',
snowflake_warehouse = 'analytics'
)
}}
select * from development_jyeo.dbt_jyeo.foo
$ dbt --debug run
08:29:59 1 of 1 START sql dynamic_table model dbt_jyeo.foo_2024_03_enabled .............. [RUN]
08:29:59 Re-using an available connection from the pool (formerly list_development_jyeo_dbt_jyeo, now model.my_dbt_project.foo_2024_03_enabled)
08:29:59 Began compiling node model.my_dbt_project.foo_2024_03_enabled
08:29:59 Writing injected SQL for node "model.my_dbt_project.foo_2024_03_enabled"
08:29:59 Timing info for model.my_dbt_project.foo_2024_03_enabled (compile): 16:29:59.104627 => 16:29:59.115607
08:29:59 Began executing node model.my_dbt_project.foo_2024_03_enabled
08:29:59 Applying CREATE to: development_jyeo.dbt_jyeo.foo_2024_03_enabled
08:29:59 Writing runtime sql for node "model.my_dbt_project.foo_2024_03_enabled"
08:29:59 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:29:59 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
create dynamic table development_jyeo.dbt_jyeo.foo_2024_03_enabled
target_lag = '1 minutes'
warehouse = analytics
as (
select * from development_jyeo.dbt_jyeo.foo
)
08:29:59 Opening a new connection, currently in state closed
08:30:02 SQL status: SUCCESS 1 in 3.0 seconds
08:30:02 Timing info for model.my_dbt_project.foo_2024_03_enabled (execute): 16:29:59.117644 => 16:30:02.575080
08:30:02 On model.my_dbt_project.foo_2024_03_enabled: Close
08:30:03 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '36f08113-c928-4cc6-a9d8-5ce2ef49ed21', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1263e59d0>]}
08:30:03 1 of 1 OK created sql dynamic_table model dbt_jyeo.foo_2024_03_enabled ......... [SUCCESS 1 in 4.20s]
$ dbt --debug run
08:30:39 Began running node model.my_dbt_project.foo_2024_03_enabled
08:30:39 1 of 1 START sql dynamic_table model dbt_jyeo.foo_2024_03_enabled .............. [RUN]
08:30:39 Re-using an available connection from the pool (formerly list_development_jyeo_dbt_jyeo, now model.my_dbt_project.foo_2024_03_enabled)
08:30:39 Began compiling node model.my_dbt_project.foo_2024_03_enabled
08:30:39 Writing injected SQL for node "model.my_dbt_project.foo_2024_03_enabled"
08:30:39 Timing info for model.my_dbt_project.foo_2024_03_enabled (compile): 16:30:39.353783 => 16:30:39.364436
08:30:39 Began executing node model.my_dbt_project.foo_2024_03_enabled
08:30:39 Applying REPLACE to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:30:39 Applying CREATE BACKUP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:30:39 Applying DROP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup"
08:30:39 Applying RENAME to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:30:39 Applying CREATE to: development_jyeo.dbt_jyeo.foo_2024_03_enabled
08:30:39 Applying DROP BACKUP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:30:39 Applying DROP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup"
08:30:39 Writing runtime sql for node "model.my_dbt_project.foo_2024_03_enabled"
08:30:39 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:30:39 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
-- get the standard backup name
-- drop any pre-existing backup
drop table if exists "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup" cascade
;
08:30:39 Opening a new connection, currently in state closed
08:30:41 SQL status: SUCCESS 1 in 2.0 seconds
08:30:41 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:30:41 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
alter table "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED" rename to FOO_2024_03_ENABLED__dbt_backup;
08:30:41 SQL status: SUCCESS 1 in 0.0 seconds
08:30:41 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:30:41 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
create dynamic table development_jyeo.dbt_jyeo.foo_2024_03_enabled
target_lag = '1 minutes'
warehouse = analytics
as (
select * from development_jyeo.dbt_jyeo.foo
)
;
08:30:43 SQL status: SUCCESS 1 in 1.0 seconds
08:30:43 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:30:43 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
-- get the standard backup name
drop table if exists "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup" cascade
08:30:43 SQL status: SUCCESS 1 in 0.0 seconds
08:30:43 Timing info for model.my_dbt_project.foo_2024_03_enabled (execute): 16:30:39.366483 => 16:30:43.812942
08:30:43 On model.my_dbt_project.foo_2024_03_enabled: Close
08:30:44 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '61d9360c-e683-4810-a25b-7e48e0cdb830', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x122b59290>]}
08:30:44 1 of 1 OK created sql dynamic_table model dbt_jyeo.foo_2024_03_enabled ......... [SUCCESS 1 in 5.25s]
$ dbt --debug run
08:31:27 Began running node model.my_dbt_project.foo_2024_03_enabled
08:31:27 1 of 1 START sql dynamic_table model dbt_jyeo.foo_2024_03_enabled .............. [RUN]
08:31:27 Re-using an available connection from the pool (formerly list_development_jyeo_dbt_jyeo, now model.my_dbt_project.foo_2024_03_enabled)
08:31:27 Began compiling node model.my_dbt_project.foo_2024_03_enabled
08:31:27 Writing injected SQL for node "model.my_dbt_project.foo_2024_03_enabled"
08:31:27 Timing info for model.my_dbt_project.foo_2024_03_enabled (compile): 16:31:27.209372 => 16:31:27.222002
08:31:27 Began executing node model.my_dbt_project.foo_2024_03_enabled
08:31:27 Applying REPLACE to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:31:27 Applying CREATE BACKUP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:31:27 Applying DROP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup"
08:31:27 Applying RENAME to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:31:27 Applying CREATE to: development_jyeo.dbt_jyeo.foo_2024_03_enabled
08:31:27 Applying DROP BACKUP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED"
08:31:27 Applying DROP to: "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup"
08:31:27 Writing runtime sql for node "model.my_dbt_project.foo_2024_03_enabled"
08:31:27 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:31:27 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
-- get the standard backup name
-- drop any pre-existing backup
drop table if exists "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED__dbt_backup" cascade
;
08:31:27 Opening a new connection, currently in state closed
08:31:29 SQL status: SUCCESS 1 in 2.0 seconds
08:31:29 Using snowflake connection "model.my_dbt_project.foo_2024_03_enabled"
08:31:29 On model.my_dbt_project.foo_2024_03_enabled: /* {"app": "dbt", "dbt_version": "1.7.14", "profile_name": "all", "target_name": "sf", "node_id": "model.my_dbt_project.foo_2024_03_enabled"} */
alter table "DEVELOPMENT_JYEO"."DBT_JYEO"."FOO_2024_03_ENABLED" rename to FOO_2024_03_ENABLED__dbt_backup;
08:31:29 Snowflake adapter: Snowflake query id: 01b413bf-0804-98fe-000d-3783323f397a
08:31:29 Snowflake adapter: Snowflake error: 002002 (42710): SQL compilation error:
Object 'FOO_2024_03_ENABLED__DBT_BACKUP' already exists.
08:31:29 Timing info for model.my_dbt_project.foo_2024_03_enabled (execute): 16:31:27.223905 => 16:31:29.755031
08:31:29 On model.my_dbt_project.foo_2024_03_enabled: Close
08:31:30 Database Error in model foo_2024_03_enabled (models/foo_2024_03_enabled.sql)
002002 (42710): SQL compilation error:
Object 'FOO_2024_03_ENABLED__DBT_BACKUP' already exists.
compiled Code at target/run/my_dbt_project/models/foo_2024_03_enabled.sql
08:31:30 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '0e963909-1e62-4110-97de-f7deaa8d35c3', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1246e8550>]}
08:31:30 1 of 1 ERROR creating sql dynamic_table model dbt_jyeo.foo_2024_03_enabled ..... [ERROR in 3.32s]
Relevant log output
See above.
Environment
- OS: macOS
- Python: Python 3.11.2
- dbt-core: 1.7.14
- dbt-snowflake: 1.7.3
Additional Context
Workaround: disable the `2024_03` bundle.

As of right now, there appears to be a new `2024_04` bundle. No idea how things are affected yet.
use role accountadmin;
select SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES();
-- [{"name":"2024_03","isDefault":true,"isEnabled":false},{"name":"2024_04","isDefault":false,"isEnabled":false}]
create or replace dynamic table development_jyeo.dbt_jyeo.foo_dt target_lag = '1 minutes' warehouse = analytics as (
select 1 c from development_jyeo.dbt_jyeo.foo
);
select table_catalog, table_schema, table_name, is_dynamic, SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES() as bundle_status
from development_jyeo.information_schema.tables
where table_name = 'FOO_DT';
/*
TABLE_CATALOG,TABLE_SCHEMA,TABLE_NAME,IS_DYNAMIC,BUNDLE_STATUS
DEVELOPMENT_JYEO,DBT_JYEO,FOO_DT,YES,"[{""name"":""2024_03"",""isDefault"":true,""isEnabled"":false},{""name"":""2024_04"",""isDefault"":false,""isEnabled"":false}]"
*/
select SYSTEM$ENABLE_BEHAVIOR_CHANGE_BUNDLE('2024_03');
select table_catalog, table_schema, table_name, is_dynamic, SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES() as bundle_status
from development_jyeo.information_schema.tables
where table_name = 'FOO_DT';
/*
TABLE_CATALOG,TABLE_SCHEMA,TABLE_NAME,IS_DYNAMIC,BUNDLE_STATUS
DEVELOPMENT_JYEO,DBT_JYEO,FOO_DT,YES,"[{""name"":""2024_03"",""isDefault"":true,""isEnabled"":true},{""name"":""2024_04"",""isDefault"":false,""isEnabled"":false}]"
*/
select SYSTEM$DISABLE_BEHAVIOR_CHANGE_BUNDLE('2024_03');
select SYSTEM$ENABLE_BEHAVIOR_CHANGE_BUNDLE('2024_04');
select table_catalog, table_schema, table_name, is_dynamic, SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES() as bundle_status
from development_jyeo.information_schema.tables
where table_name = 'FOO_DT';
/*
TABLE_CATALOG,TABLE_SCHEMA,TABLE_NAME,IS_DYNAMIC,BUNDLE_STATUS
DEVELOPMENT_JYEO,DBT_JYEO,FOO_DT,YES,"[{""name"":""2024_03"",""isDefault"":true,""isEnabled"":false},{""name"":""2024_04"",""isDefault"":false,""isEnabled"":true}]"
*/
select SYSTEM$ENABLE_BEHAVIOR_CHANGE_BUNDLE('2024_03');
select SYSTEM$ENABLE_BEHAVIOR_CHANGE_BUNDLE('2024_04');
select table_catalog, table_schema, table_name, is_dynamic, SYSTEM$SHOW_ACTIVE_BEHAVIOR_CHANGE_BUNDLES() as bundle_status
from development_jyeo.information_schema.tables
where table_name = 'FOO_DT';
/*
TABLE_CATALOG,TABLE_SCHEMA,TABLE_NAME,IS_DYNAMIC,BUNDLE_STATUS
DEVELOPMENT_JYEO,DBT_JYEO,FOO_DT,YES,"[{""name"":""2024_03"",""isDefault"":true,""isEnabled"":true},{""name"":""2024_04"",""isDefault"":false,""isEnabled"":true}]"
*/
I happened to notice that as well yesterday. It seems to still be an early build and not set to default yet, so you would have to opt in, I believe.
Current release notes for that bundle: https://docs.snowflake.com/en/release-notes/bcr-bundles/2024_04_bundle
OK, after a deep dive with @mikealfare and @McKnight-42, we discovered that the root cause is an undocumented change in the `2024_03` bundle to the `SHOW OBJECTS` statement.

Prior to the change, `SHOW TERSE OBJECTS` would return a `kind` column whose value would be `DYNAMIC_TABLE` for dynamic tables. Once the `2024_03` bundle is enabled, `kind` returns `TABLE` instead.
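To illustrate the difference, here is a sketch of what the statements return in a Snowflake worksheet. The schema name and the result columns shown in comments are illustrative, not captured output; the exact columns depend on your account's bundle settings.

```sql
-- With the 2024_03 bundle DISABLED, a dynamic table is reported distinctly:
show terse objects in schema my_db.my_schema;
-- name     kind
-- FOO_DT   DYNAMIC_TABLE

-- With the 2024_03 bundle ENABLED, the same statement reports kind = 'TABLE',
-- so the relation cache can no longer tell dynamic tables apart from plain tables:
show terse objects in schema my_db.my_schema;
-- name     kind
-- FOO_DT   TABLE

-- The non-TERSE form gains an is_dynamic column under the 2024_03 bundle:
show objects in schema my_db.my_schema;
-- name     kind    ...  is_dynamic
-- FOO_DT   TABLE   ...  Y
```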
The ostensible workaround is to use `SHOW OBJECTS`, which now includes an `is_dynamic` column (see "SHOW OBJECTS command: New column and changes to output" in the bundle's release notes). However, doing so will undoubtedly affect performance. I'm not sure how to conditionally exclude `TERSE` when dynamic tables are concerned...
Also got to call out that this same issue, with the wrong info in the `SHOW OBJECTS` `kind` column, was encountered before with Snowflake materialized views in https://github.com/dbt-labs/dbt-labs-experimental-features/issues/14
We're working with the Snowflake team on a long-term resolution path. Worst case, we will need to switch from using `show terse objects` to `show objects` here, with some additional query latency, so that we can access the `is_dynamic` column within `list_relations_without_caching`.
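At the SQL level, that switch could look roughly like the following sketch (illustrative only; the actual adapter change may differ, and the schema name is a placeholder). It reuses the `result_scan(last_query_id())` pattern that the adapter already uses in the debug logs above:

```sql
-- Today: fast, but under the 2024_03 bundle the kind column no longer
-- distinguishes dynamic tables from plain tables:
show terse objects in schema my_db.my_schema limit 10000;

-- Possible replacement: more columns returned (hence the extra latency),
-- but is_dynamic = 'Y' identifies dynamic tables regardless of the bundle:
show objects in schema my_db.my_schema limit 10000;
select "name", "schema_name", "database_name", "kind", "is_dynamic"
from table(result_scan(last_query_id()));
```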
In the meantime, there are two viable workarounds:
- (1) Disable the `2024_03` behavior change bundle in your Snowflake account. This will work until the `2024_03` bundle becomes Generally Enabled in June (at which point it is no longer possible to opt out).
use role accountadmin;
select system$disable_behavior_change_bundle('2024_03');
- (2) Copy-paste the code below and stick it in your root project's `macros/` folder. This reimplements the `dynamic_table` materialization by adding an additional check on the type of the `existing_relation`, by running `show dynamic tables`. This will work in cases where you're refreshing a dynamic table (most important) or replacing a (simple) table with a dynamic table, but it won't work in cases where you're replacing a dynamic table with a simple table. We need a more durable long-term solution, but I believe this is the fastest and simplest way to unblock existing users right now. As soon as we have the long-term solution, users should remember to remove this materialization override from their projects.
{% materialization dynamic_table, adapter='snowflake' %}

    {% set query_tag = set_query_tag() %}

    {% set existing_relation = load_cached_relation(this) %}
    {% set target_relation = this.incorporate(type=this.DynamicTable) %}

    {{ run_hooks(pre_hooks) }}

    -- >>>>>>
    -- Snowflake introduced a change to 'show objects' in the 2024_03 behavior change bundle
    -- such that dynamic tables now simply appear as 'table'.
    -- If we think that existing_relation is a table, we need to double check by running
    -- an additional 'show dynamic tables' query.
    {% if existing_relation.is_table %}
        {% set is_dynamic_table = (snowflake__describe_dynamic_table(existing_relation).dynamic_table.rows | length > 0) %}
        {% if is_dynamic_table %}
            {% set existing_relation = existing_relation.incorporate(type=this.DynamicTable) %}
        {% endif %}
    {% endif %}
    -- <<<<<<

    {% set build_sql = dynamic_table_get_build_sql(existing_relation, target_relation) %}

    {% if build_sql == '' %}
        {{ dynamic_table_execute_no_op(target_relation) }}
    {% else %}
        {{ dynamic_table_execute_build_sql(build_sql, existing_relation, target_relation) }}
    {% endif %}

    {{ run_hooks(post_hooks) }}

    {% do unset_query_tag(query_tag) %}

    {{ return({'relations': [target_relation]}) }}

{% endmaterialization %}
Is it possible to create a new materialization with the above macro? Something like `{% materialization dynamic_table_stopgap, adapter='snowflake' %}` but everything else the same, and then materialize models with `config(materialized='dynamic_table_stopgap', ...)`?
I'd like to add one more thing. I was getting errors in CI jobs like this: `090106 (22000): Cannot perform CREATE TABLE. This session does not have a current schema. Call 'USE SCHEMA', or use a qualified name.` But after running `select system$disable_behavior_change_bundle('2024_03');` this stopped happening, so I figured they are probably closely related.
> Is it possible to create a new materialization with the above macro? Something like `{% materialization dynamic_table_stopgap, adapter='snowflake' %}` but everything else is the same. And then materialize models with `config(materialized='dynamic_table_stopgap', ...)`?
Yep, that should work! DM me if it doesn't.
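For anyone following along, the stopgap variant is just the override posted above with the materialization renamed, so dbt registers it as a separate custom materialization instead of overriding the built-in one (a sketch; the file name `macros/dynamic_table_stopgap.sql` is only a suggestion):

```sql
-- macros/dynamic_table_stopgap.sql (suggested location)
{% materialization dynamic_table_stopgap, adapter='snowflake' %}

    {% set query_tag = set_query_tag() %}

    {% set existing_relation = load_cached_relation(this) %}
    {% set target_relation = this.incorporate(type=this.DynamicTable) %}

    {{ run_hooks(pre_hooks) }}

    -- double-check relations reported as plain tables, since the 2024_03
    -- bundle no longer reports kind = 'DYNAMIC_TABLE' in 'show objects'
    {% if existing_relation.is_table %}
        {% set is_dynamic_table = (snowflake__describe_dynamic_table(existing_relation).dynamic_table.rows | length > 0) %}
        {% if is_dynamic_table %}
            {% set existing_relation = existing_relation.incorporate(type=this.DynamicTable) %}
        {% endif %}
    {% endif %}

    {% set build_sql = dynamic_table_get_build_sql(existing_relation, target_relation) %}

    {% if build_sql == '' %}
        {{ dynamic_table_execute_no_op(target_relation) }}
    {% else %}
        {{ dynamic_table_execute_build_sql(build_sql, existing_relation, target_relation) }}
    {% endif %}

    {{ run_hooks(post_hooks) }}

    {% do unset_query_tag(query_tag) %}

    {{ return({'relations': [target_relation]}) }}

{% endmaterialization %}
```

Models then opt in with `config(materialized='dynamic_table_stopgap', ...)`, leaving the built-in `dynamic_table` materialization untouched.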
Proposed next steps:
- https://github.com/dbt-labs/dbt-snowflake/issues/1038
Re-opening for backports