Matheus Facure
Very interesting discussion. I don't know if I can commit to a fix, as the issue is still being actively discussed. I just uploaded an appendix that leverages conformal inference for...
It doesn't! It learns a local linear CATE. I try to explain that in the following section. Have you read it? If it's not clear, let me know.
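To illustrate what I mean by a local linear CATE, here is a minimal sketch (not the exact code from the chapter; `y_res`, `t_res` and `X` are placeholders for the outcome residuals, treatment residuals and features): if you regress the outcome residuals on the treatment residuals interacted with the features, the slope on the treatment becomes a linear function of X, and that function is the CATE the final model learns.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# toy data just to make the sketch runnable
rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 2))                      # features
t_res = rng.normal(size=n)                       # treatment residuals
true_cate = 1.0 + 2.0 * X[:, 0]                  # CATE varies linearly with the first feature
y_res = true_cate * t_res + rng.normal(size=n)   # outcome residuals

# final model: y_res ~ t_res + t_res * X, so the slope on t_res is b0 + X @ b
design = np.column_stack([t_res, X * t_res[:, None]])
final_model = LinearRegression().fit(design, y_res)

# the learned local linear CATE for each unit
cate_hat = final_model.coef_[0] + X @ final_model.coef_[1:]
```

The point is that the coefficient on the treatment residual is not a single number but a linear function of the features, which is why I call it a local linear CATE.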
You are correct. X is what goes to the final model as the features. I can't find that piece of code. Can you point me to it? Here is what...
Oh, I see. That's a bug :) Since there are no features in this case, X should just be a constant, hehe. I'll fix it.
it should be

```python
debias_m = LGBMRegressor(max_depth=3)
denoise_m = LGBMRegressor(max_depth=3)

# orthogonalising step: with no features, the only regressor is a constant,
# so the out-of-fold prediction is essentially the mean of each variable
discount_res = discount.ravel() - cross_val_predict(debias_m, np.ones(discount.shape), discount.ravel(), cv=5)
sales_res = sales.ravel() - cross_val_predict(denoise_m, np.ones(sales.shape), sales.ravel(), cv=5)
# ...
```
Fair point. This issue will take more time to be fully fixed. I'm writing a section on panel data that will address it more thoroughly.
You are correct. This part needs some rewriting because it is very confusing. What I meant there was that doing `p1 (cop1-cop2)` was the problem. This is more common than...
Honestly, when I wrote this, all of it was completely from my head. Causal inference is evolving rapidly, so it's no surprise that recent material has emerged on the topic...
> Comparing changes in both dimensions attributes the differing segmentations to the architecture of the model rather than the type of prediction.

I agree with that, but the point here...
WOW! What is this magic? How does `map` work?