dbt-snowflake-monitoring
Incrementalize cost per query
For a query history with >100M queries, the model takes ~8 min to run on an XSMALL.
If someone runs this hourly on an XSMALL, it will cost ~$2.3K/year (8/60 hours × 24 runs/day × 365 days × ~$2/credit ≈ $2,336).
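For reference, the estimate can be sanity-checked in Snowflake itself (assumed rates: an XSMALL consumes 1 credit/hour and a credit costs ~$2; actual pricing varies by account):

```sql
-- Assumed rates: XSMALL = 1 credit/hour, ~$2/credit
select
    8 / 60      -- hours per run (~8 min)
    * 24 * 365  -- hourly runs per year
    * 2         -- dollars per credit (assumed)
    as est_annual_cost_usd;  -- roughly 2,336
```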
Also add:
- [x] Add a `ran_on_warehouse` variable instead of doing `warehouse_size is not null` #62
- [ ] Review `execution_start_time` logic across all queries. Do we need to add in `blocked_time`? Anything else?
Alternatively, could try a range join strategy to reduce the runtime. It's likely the cartesian join in the query_hours CTE which is killing things.
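A range-join rewrite of the suspected hot spot might look something like this. This is only a sketch: the column names, the spine construction, and the shape of the real `query_hours` CTE are assumptions, not the model's actual code.

```sql
-- Illustrative only: assumes query_hours currently pairs every query with
-- every hour (cartesian join) and filters afterwards. A bounded range
-- predicate lets Snowflake prune instead of materializing the full product.
with hour_spine as (
    -- hypothetical hour spine; the real model may build this differently
    select dateadd('hour', seq4(), '2023-01-01'::timestamp) as hour_start
    from table(generator(rowcount => 24 * 365))
),
query_hours as (
    select
        q.query_id,
        h.hour_start
    from query_history as q
    inner join hour_spine as h
        -- range predicate in the join condition, not a post-join filter
        on h.hour_start >= date_trunc('hour', q.execution_start_time)
        and h.hour_start < q.end_time
)
select * from query_hours
```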
nvm, cartesian join isn't the problem.