
Parameter tuning for RDD / lm_forest

Open corydeburd opened this issue 2 years ago • 5 comments

I wanted to check whether the parameter-tuning solution proposed in https://github.com/grf-labs/grf/issues/1195 would also be valid for the regression discontinuity case, paired with lm_forest() as in the example below. As with the method in that issue, lm_forest() does not currently have a setting to automatically tune parameters: https://grf-labs.github.io/grf/reference/lm_forest.html

Does the code / intuition in the original post still apply here? That is, with lm_forest() and non-binary "treatments" (i.e., the RD running variable slopes), is this MSE still the object to consider?
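For concreteness, here is a rough sketch of the criterion I have in mind, extended to lm_forest()'s multi-column treatment. The names `Y.hat`, `W.hat`, and `tau.hat` are placeholders for out-of-bag estimates of E[Y|X], E[W|X], and the forest's coefficient predictions, with `W` holding the RDD running-variable terms as columns:

```r
# Debiased-error-style MSE as in #1195, extended to a multi-column W:
# each row's residual is (Y - Y.hat) minus the fitted linear effect
# sum_k tau_k(X) * (W_k - W_k.hat).
debiased.mse <- mean(((Y - Y.hat) - rowSums(tau.hat * (W - W.hat)))^2)
```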

corydeburd avatar Feb 05 '23 02:02 corydeburd

Hi @corydeburd,

That MSE is a reasonable object to consider. Another approach is to treat the RDD forest purely as a data-driven algorithm for finding heterogeneous subgroups, i.e.:

Split the data into a training set and an evaluation set.

  1. On the training data, fit an RDD forest.
  2. On the evaluation data, predict treatment effects and form groups, for example based on which quantile of the RDD CATEs each unit belongs to. Then, in each of these groups, fit an RDD coefficient using your favorite RDD method. If step 1 was successful, you would ideally see different RDD estimates in the low and high groups, corresponding to low and high effects.
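The steps above could be sketched in R as follows. Variable names and the RDD parameterization are illustrative (here the "treatments" are a jump indicator at the cutoff plus a separate slope on each side, in the spirit of the lm_forest reference example), and `rdrobust` is just one choice of RDD estimator:

```r
library(grf)

cutoff <- 0
train <- sample(nrow(X), nrow(X) / 2)

# RDD "treatments": jump at the cutoff plus a slope on each side.
W <- cbind(D  = as.numeric(Z >= cutoff),
           Zl = pmin(Z - cutoff, 0),
           Zr = pmax(Z - cutoff, 0))

# Step 1: fit the forest on the training half.
forest <- lm_forest(X[train, ], Y[train], W[train, ])

# Step 2: predict the coefficient on the jump indicator (the RDD CATE)
# on the held-out half, and group held-out units by CATE quantile.
tau.hat <- predict(forest, X[-train, ])$predictions[, 1, 1]
group <- ifelse(tau.hat <= median(tau.hat), "low", "high")

# Within each group, fit an RDD coefficient with your favorite method, e.g.
# rdrobust::rdrobust(Y[-train][group == "high"],
#                    Z[-train][group == "high"], c = cutoff)
```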

Doing many runs of step 1 with different tuning parameters is fair in the sense that inference should still be valid for the RDD coefficients you estimate in step 2, since the (tuned) algorithm discovers the subgroups on a held-out data set.

erikcs avatar Mar 22 '23 17:03 erikcs

Thanks, this is a great suggestion. I had actually adopted something like this approach, so it's good to know it comes recommended! Holding out data is very important, as I think it's very easy to overfit in my situation (it's an RD, so observations near the cutoff carry a lot of weight).

corydeburd avatar Mar 22 '23 17:03 corydeburd

Dear @erikcs and @corydeburd ,

This thread is very helpful, thank you. I am also trying to tune parameters in lm_forest() for an RDD analysis. Could I have your advice?

  1. Should I split the data into training/evaluation sets of equal size?
  2. Are there recommended parameter ranges (min/max) that I should try?

yusukematsuyama avatar Dec 15 '23 05:12 yusukematsuyama

Hi @yusukematsuyama, there's no fixed rule for the train/test split; 50/50 and 70/30 are just some common choices. Forests are usually robust with respect to tuning parameters, so it's hard to say which range of parameters is reasonable.
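If you do want to search, a coarse grid over a few of lm_forest()'s parameters is usually enough. A minimal sketch (the values below are illustrative, not recommendations; `X`, `Y`, `W`, and the `train` index follow the earlier setup in this thread):

```r
# Coarse, illustrative grid; evaluate each fit on the held-out half,
# e.g. via the subgroup procedure described above or a debiased
# error criterion.
grid <- expand.grid(min.node.size   = c(5, 20, 50),
                    sample.fraction = c(0.35, 0.5))
fits <- lapply(seq_len(nrow(grid)), function(i) {
  lm_forest(X[train, ], Y[train], W[train, ],
            min.node.size   = grid$min.node.size[i],
            sample.fraction = grid$sample.fraction[i])
})
```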

erikcs avatar Dec 20 '23 06:12 erikcs

Dear @erikcs,

Thank you for your advice. I will try that!

yusukematsuyama avatar Dec 21 '23 00:12 yusukematsuyama