
Randomly fails when writing parquet table to an external location

Open sugendran opened this issue 6 months ago • 5 comments

Describe the bug

We've started getting these errors in our dbt pipeline with no changes to the pipeline itself. It had been running for a while without any problems. I'm not sure how to debug this.

01:48:16    Runtime Error in model dim_suppliers (models/marts/core/dim_suppliers.sql)
  CREATE-TABLE-AS-SELECT cannot create table with location to a non-empty directory s3://ordermentum-data/publish/production/core/dim_suppliers. To allow overwriting the existing non-empty directory, set 'spark.sql.legacy.allowNonEmptyLocationInCTAS' to true.

The actual table it fails on changes with each run.

Our config is:

+materialized: table
+file_format: parquet
+location_root: "{{ env_var('publishLocation', 's3://ordermentum-data/publish/dev') ~ '/core' }}"
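
A possible workaround (a sketch only, not a confirmed fix) is to set the legacy flag the error message mentions before the model runs, e.g. via a dbt pre-hook. The project name `my_project` and the config path below are placeholders for illustration; whether enabling `spark.sql.legacy.allowNonEmptyLocationInCTAS` is safe depends on why the target directory is non-empty in the first place.

```yaml
# dbt_project.yml (sketch): set the legacy CTAS flag for the affected models.
# Hypothetical layout -- adjust the project/folder keys to match your project.
models:
  my_project:
    marts:
      core:
        +pre-hook: "SET spark.sql.legacy.allowNonEmptyLocationInCTAS = true"
```

Note this only masks the symptom (the CTAS will overwrite into the non-empty directory) rather than explaining why the directory is unexpectedly non-empty between runs.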

System information

The output of dbt --version:

(taken from the query run in the SQL data warehouse)

"app": "dbt", "dbt_version": "1.8.5", "dbt_databricks_version": "1.8.5", "databricks_sql_connector_version": "3.1.2",

sugendran · Aug 16 '24 02:08