Parameter tuning for RDD / lm_forest
I wanted to check whether the solution for parameter tuning proposed here (https://github.com/grf-labs/grf/issues/1195) would be valid for the regression discontinuity case, paired with lm_forest() as in the example below. As with the method in that link, lm_forest() does not currently have a setting to tune parameters automatically. https://grf-labs.github.io/grf/reference/lm_forest.html
Does the code / intuition in the original post still apply here? That is, with lm_forest() and non-binary "treatments" (i.e., the RD running variable slopes), is this MSE still the object to consider?
Hi @corydeburd,
That MSE is a reasonable object to consider. Another approach is to treat the RDD forest purely as a data-driven algorithm for finding heterogeneous subgroups, i.e.:
- Split the data into a training set and an evaluation set.
- On the training data, fit an RDD forest.
- On the evaluation data, predict treatment effects and form groups, for example based on which quantile of the RDD CATEs each unit belongs to. Then, in each of these groups, fit an RDD coefficient using your favorite RDD method. If the forest fit was successful, you'd ideally see different RDD estimates in the low and high groups, corresponding to low and high effects.
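The steps above might be sketched roughly as follows in R. This is only an illustration under assumptions not stated in the thread: `X`, `Y`, `R` (the running variable), and `cutoff` are placeholders for your data, the treatment parameterization (above-cutoff indicator plus local slopes) is one common way to set up an RDD with lm_forest(), and `rdrobust` is just one hypothetical choice of within-group RDD estimator.

```r
# Sketch only -- variable names and the RDD parameterization are assumptions.
library(grf)

n <- nrow(X)
train <- sample(n, n / 2)
eval <- setdiff(seq_len(n), train)

# RDD "treatments": above-cutoff indicator and running-variable slopes.
D <- as.numeric(R >= cutoff)
W <- cbind(D = D, R = R - cutoff, DR = D * (R - cutoff))

# 1) Fit an RDD forest on the training half.
forest <- lm_forest(X[train, ], Y[train], W[train, ])

# 2) Predict heterogeneous RD coefficients (the coefficient on D)
#    on the held-out evaluation half.
tau.hat <- predict(forest, X[eval, ])$predictions[, "D", 1]

# 3) Form quantile groups of the predicted CATEs, then fit an RDD
#    estimate within each group with your preferred RDD method, e.g.:
groups <- cut(tau.hat, quantile(tau.hat, c(0, 0.5, 1)), include.lowest = TRUE)
# for (g in levels(groups)) {
#   rdrobust::rdrobust(Y[eval][groups == g], R[eval][groups == g], c = cutoff)
# }
```

If the forest found real heterogeneity, the group-level RD estimates should differ across the low and high CATE groups.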
Doing many runs of the forest step with different tuning parameters is fair, in the sense that inference should still be valid for the RDD coefficients you estimate in the final step, since the tuned algorithm discovers the subgroups on a held-out data set.
Thanks, this is a great suggestion. Actually, I had already adopted something like this approach, so it's good to know it comes recommended! Holding out data is very important, as I think it's very easy to overfit in my situation (it's an RD, so observations near the cutoff carry a lot of weight).
Dear @erikcs and @corydeburd,
This thread is very helpful. Thank you. I am also trying to tune parameters in lm_forest() for RDD analysis. Could I have your advice?
- Should I split the data into training/evaluation sets of equal size?
- Are there recommended parameter spaces (min/max) that I should try?
Hi @yusukematsuyama, there's no fixed rule for the train/test split; 50/50 and 70/30 are just some common choices. Forests are usually robust with respect to tuning parameters, so it's hard to say which range of parameters is reasonable.
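If you do want to try a few settings, a simple grid over some of the standard grf tuning arguments could look like the sketch below. The grid values are arbitrary starting points, not recommendations, and `X.train`, `Y.train`, `W.train`, and `X.eval` are assumed to come from a split like the one discussed above.

```r
# Illustrative tuning sketch -- grid values are arbitrary, not recommended
# defaults; parameter names are standard grf arguments.
library(grf)

grid <- expand.grid(min.node.size = c(5, 20, 50),
                    sample.fraction = c(0.35, 0.5))

fits <- lapply(seq_len(nrow(grid)), function(i) {
  forest <- lm_forest(X.train, Y.train, W.train,
                      min.node.size = grid$min.node.size[i],
                      sample.fraction = grid$sample.fraction[i])
  # Score each fit on the held-out data, e.g. via the MSE criterion from
  # grf issue #1195 or the quantile-group check described earlier.
  predict(forest, X.eval)$predictions
})
```

Whichever score you use, it should be computed on the evaluation data only, so the tuning does not invalidate inference for the final group-level RDD estimates.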
Dear @erikcs,
Thank you for your advice. I will try that!