dbt-snowflake-monitoring

Incrementalize cost per query

Open · ian-whitestone opened this issue · 2 comments

For a query history with >100M queries, the model takes ~8 minutes to run on an X-Small warehouse.

If someone runs this hourly on an X-Small, it will cost ~$2.3K/year: 8/60 hours per run × 1 credit/hour × $2/credit × 24 × 365 runs ≈ $2,336.
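A minimal sketch of what incrementalizing the model could look like, assuming a `query_id` unique key and an `end_time` watermark column (the model/column names are illustrative, not necessarily the model's actual schema):

```sql
-- Hypothetical sketch: materialize cost per query incrementally so each
-- hourly run scans only new queries, not the full >100M-row history.
-- unique_key and column names are assumptions for illustration.
{{ config(
    materialized='incremental',
    unique_key='query_id'
) }}

select
    query_id
    -- plus the model's existing cost attribution columns
from {{ ref('query_history') }}

{% if is_incremental() %}
-- only process queries that finished after the last processed timestamp
where end_time > (select max(end_time) from {{ this }})
{% endif %}
```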

Also add:

- [x] Add a `ran_on_warehouse` variable instead of doing `warehouse_size is not null` (#62); see the sketch after this list
- [ ] Review the `execution_start_time` logic across all queries. Do we need to add in `blocked_time`? Anything else?
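A rough sketch of what the `ran_on_warehouse` flag could look like, assuming it is derived once in an upstream staging model (the model name here is an assumption):

```sql
-- Hypothetical sketch: compute the flag once upstream instead of
-- repeating "warehouse_size is not null" in every downstream model.
select
    query_id,
    warehouse_size,
    warehouse_size is not null as ran_on_warehouse
from {{ ref('stg_query_history') }}
```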

ian-whitestone · Dec 15 '22

Alternatively, we could try a range join strategy to reduce the runtime. It's likely the cartesian join in the `query_hours` CTE that's killing things.
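A minimal sketch of the overlap-join idea, assuming an hour spine built with Snowflake's `generator` and illustrative column names (`execution_start_time`, `end_time`); the inequality predicates below are what a range-join optimization would target:

```sql
-- Hypothetical sketch: join each query only to the hourly buckets it
-- overlaps, rather than producing a full cartesian product and filtering
-- afterwards. Spine size and model/column names are assumptions.
with hour_spine as (
    select
        dateadd(
            'hour',
            row_number() over (order by seq4()) - 1,
            '2022-01-01'::timestamp
        ) as hour_start
    from table(generator(rowcount => 24 * 365))
),

query_hours as (
    select
        q.query_id,
        h.hour_start
    from {{ ref('query_history') }} as q
    inner join hour_spine as h
        -- overlap predicate: keep only the hours the query actually ran in
        on h.hour_start < q.end_time
        and dateadd('hour', 1, h.hour_start) > q.execution_start_time
)

select * from query_hours
```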

ian-whitestone · Jan 06 '23

> Alternatively, we could try a range join strategy to reduce the runtime. It's likely the cartesian join in the `query_hours` CTE that's killing things.

nvm, cartesian join isn't the problem.

*[Screenshot: CleanShot 2023-01-06 at 17:12]*

ian-whitestone · Jan 07 '23