Claus Herther
I think this has more to do with Snowflake than the macros. On Snowflake we basically call:

```sql
date_part('weekiso', date '2012-12-31')
```

which returns `1` on my instance with default settings...
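As a cross-check outside Snowflake, the same ISO-8601 week assignment can be reproduced with Python's standard library (shown here only to illustrate the `weekiso` semantics, not Snowflake's implementation):

```python
from datetime import date

# 2012-12-31 falls on a Monday, so ISO 8601 assigns it to week 1 of 2013 --
# the same answer Snowflake's date_part('weekiso', ...) gives above.
iso_year, iso_week, iso_weekday = date(2012, 12, 31).isocalendar()
print(iso_year, iso_week)  # → 2013 1
```

So the "surprising" result is the ISO week-numbering standard at work, not a bug in the macros.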
> Hey @clausherther > > As discussed on slack - this is a macro I built that I found very helpful for things like customer lifetime analysis. > > Let...
Hi @jpmmcneill, thanks for this! I think we will also need an integration test for this. Let me know if you need help getting that set up. Before we get...
Hi @jpmmcneill, just checking in to see if you had any thoughts on my comments above?
Hi @jpmmcneill! I'm going to close this one since it's a bit stale by now, but feel free to reopen if you want to push this one over the finish...
Hi @sisu-callum! Interesting - I have not yet had a chance to look at the `dbt_metrics` package. Are you saying that model macros like `get_date_dimension` generate column names that would...
> customers with one purchase should be given a recency of 0 I think you meant **frequency** of 0?
The `plot_probability_alive_matrix` function computes P(alive) for all **theoretical** combinations of recency and frequency up to the maxima in your datasets given the parameters computed during the `fit` step. It assumes...
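To make the recency/frequency discussion above concrete, here is a minimal sketch of the BG/NBD P(alive) formula (Fader, Hardie & Lee, 2005) that the fitted parameters feed into. The helper name `bgnbd_p_alive` and the parameter values in the usage lines are purely illustrative, not part of any library's API:

```python
def bgnbd_p_alive(frequency, recency, T, r, alpha, a, b):
    """P(alive | x, t_x, T) under the BG/NBD model.

    frequency (x): number of *repeat* purchases, so a customer with a
                   single purchase has frequency 0
    recency (t_x): time of the last purchase, in the same units as T
    T:             customer "age" (time since first purchase)
    r, alpha, a, b: model parameters, normally estimated during the
                    `fit` step (values below are illustrative only)
    """
    if frequency == 0:
        # With no repeat purchases the model treats the customer as alive.
        return 1.0
    term = (a / (b + frequency - 1.0)) * (
        (alpha + T) / (alpha + recency)
    ) ** (r + frequency)
    return 1.0 / (1.0 + term)

# Illustrative parameters only -- use the values your own fit produced.
r, alpha, a, b = 0.25, 4.0, 0.8, 2.5
print(bgnbd_p_alive(0, 0.0, 30.0, r, alpha, a, b))   # one purchase -> 1.0
print(bgnbd_p_alive(5, 25.0, 30.0, r, alpha, a, b))  # recent repeater: higher P(alive)
print(bgnbd_p_alive(5, 5.0, 30.0, r, alpha, a, b))   # stale repeater: lower P(alive)
```

This also shows why the matrix is "theoretical": the formula can be evaluated for any (recency, frequency) grid cell, whether or not a real customer in your dataset occupies it.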
Hi @alison985! Sorry for the late response! This is indeed a giant pain, as is really anything date/time-related :/ How often do you need to run the same dbt-expectations code against...
@alison985 I totally forgot that in our own integration testing (i.e. where we test all the tests against various adapters), we use this config:

```yaml
column_type_list: [date, "{{ dbt_expectations.type_timestamp() }}"]
```

...