Mike Anderson
I think you want optimal_allocation_per_timeunit to be media_scaler.transform(optimal_budget_allocation)/n_time_periods, but I agree this is unclear and we'll try to improve this workflow. Thanks for flagging this!
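As a minimal sketch of that conversion, with a stand-in scaler class in place of the library's actual `CustomScaler` (all names and values here are illustrative, not the library's API):

```python
import numpy as np

class MockScaler:
    """Stand-in for a fitted media scaler (divide-by-constant transform)."""
    def __init__(self, divide_by):
        self.divide_by = divide_by

    def transform(self, values):
        return values / self.divide_by

# hypothetical fitted scaler and optimizer output
media_scaler = MockScaler(divide_by=np.array([100.0, 50.0]))
optimal_budget_allocation = np.array([2000.0, 1000.0])  # per-channel totals
n_time_periods = 10

# scale the totals back into model units, then spread evenly over time
optimal_allocation_per_timeunit = (
    media_scaler.transform(optimal_budget_allocation) / n_time_periods
)
```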
We're working internally on some larger changes which might help with this, but in the meantime I'd probably try some simple things like switching to weekly granularity rather than...
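If your data is daily, a pandas `resample` is one quick way to get weekly totals (column names here are hypothetical):

```python
import pandas as pd

# hypothetical daily media/KPI DataFrame indexed by date
daily = pd.DataFrame(
    {"tv_spend": range(14), "kpi": range(14)},
    index=pd.date_range("2023-01-02", periods=14, freq="D"),
)

# aggregate to weekly totals, weeks ending on Sunday
weekly = daily.resample("W-SUN").sum()
```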
Hi @virithavanama and @shekharkhandelwal1983! For the plot_model_fit plot, both the target curve and the posterior predictions are produced by [lines 662-666](https://github.com/google/lightweight_mmm/blob/main/lightweight_mmm/plot.py#L662) of plot.py, so you can just replicate those in...
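I haven't re-checked those exact lines against the current file, but the overlay is roughly the following (mock data standing in for the scaled target and the posterior prediction draws; `Agg` backend so it runs headless):

```python
import numpy as np
import matplotlib

matplotlib.use("Agg")  # non-interactive backend for scripted use
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
target = np.sin(np.linspace(0, 6, 100)) + 2.0           # mock scaled KPI series
predictions = target + rng.normal(0.0, 0.1, (500, 100))  # mock posterior draws

fig, ax = plt.subplots()
ax.plot(target, c="grey", alpha=0.9, label="True KPI")
ax.plot(predictions.mean(axis=0), c="green", alpha=0.9, label="Predicted KPI")
ax.legend()
fig.savefig("model_fit.png")
```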
Thanks a lot for flagging this! I just wanted to share a link to our new "[support](https://github.com/google/lightweight_mmm#support)" section in the readme to give you some more information on how we...
Hi @steven-struglia ! Setting a more informative prior sounds like a reasonable thing to do here. I'd also recommend checking the correlation coefficients and VIF values for this channel (see...
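For example, a numpy-only check along those lines (synthetic spend data; for standardized variables the VIFs can be read off the diagonal of the inverted correlation matrix):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical media spend matrix: channel 3 nearly duplicates channel 1
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + 0.1 * rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

corr = np.corrcoef(X, rowvar=False)   # pairwise correlation coefficients
vifs = np.diag(np.linalg.inv(corr))   # VIF_i = [R^-1]_ii for standardized data
```

A VIF well above ~5-10 for a channel is a common rule-of-thumb sign that its coefficient will be hard to identify.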
I think so! I'd have to replicate it with mock data to see if the format is precisely correct, but that looks by eye like a reasonable way to do...
Try:

```python
import numpy as np

# One concentration value per media channel; lag_weight takes a Beta prior,
# so each channel's prior mean is c1 / (c1 + c0).
c1 = np.array([2.0] * len(media_columns))  # float dtype so 1.5 isn't truncated
c1[7] = 1.5                                # weaker carryover prior for channel 8
c0 = np.ones(len(media_columns))
custom_priors = {'lag_weight': {'concentration1': c1, 'concentration0': c0}}
```
This is unfortunately going to depend on your specific datasets, but there's a lot of discussion about these functional shapes and a bunch of examples in the [original Google paper](https://research.google/pubs/pub46001/)...
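For reference, one common parameterization of the Hill saturation shape discussed in that literature can be sketched like this (parameter names are illustrative, not tied to any one implementation):

```python
import numpy as np

def hill(x, half_max_effective_concentration, slope):
    """Hill saturation curve: 0 at x=0+, 0.5 at the half-max point, -> 1 as x grows."""
    return 1.0 / (1.0 + (x / half_max_effective_concentration) ** (-slope))

# example: response at, below, and above the half-max spend level
spend = np.array([1.0, 2.0, 4.0])
response = hill(spend, half_max_effective_concentration=2.0, slope=1.5)
```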
Thanks for flagging this! We just pushed a [new commit](https://github.com/google/lightweight_mmm/commit/1b3654c5759ecb3dc94a45fefe1550070acfe3b5) which should hopefully fix it.