verstack
Add cross validation to LGBMTuner
Some sort of LGBMTunerCV with the option to customise the number of CV folds would be great.
Yeah, this is the second request on this matter in the past few months. I'm working on integrating the sklearn CV modules into the LGBMTuner pipeline.
One thing to keep in mind: if you don't have a specific need for a custom validation strategy and just want to make your tuning validation more robust, LGBMTuner already has this covered. Every hyperparameter optimisation trial is carried out on a new random train/valid split, which is similar in effect to the cross-validation approach you are seeking. With LGBMTuner's default parameters (200 trials), that means 200 different random train/valid splits during the tuning process.
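The per-trial random-split idea can be sketched like this (a minimal illustration, not verstack's actual implementation; `random_split` is a hypothetical helper that reshuffles the data with a trial-specific seed):

```python
import numpy as np

def random_split(n, valid_frac=0.25, seed=0):
    """Shuffle indices with a trial-specific seed, then cut into train/valid."""
    idx = np.random.default_rng(seed).permutation(n)
    cut = int(n * (1 - valid_frac))
    return idx[:cut], idx[cut:]

# each tuning trial gets its own random split, e.g. seed = trial number
train0, valid0 = random_split(100, seed=0)
train1, valid1 = random_split(100, seed=1)

# the two trials validate on different subsets of the data
print(sorted(valid0) != sorted(valid1))
```

Over 200 trials this exercises many different validation subsets, which is why the random-split scheme behaves much like cross-validation for the purpose of robust tuning, even though no fold structure is fixed in advance.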
Does that answer your question, or do you have specific reasons for a custom validation strategy?