
Allow Continuation of Training


It appears that the fit method for ensembles is also where the estimators are instantiated. It would be convenient (for example, for fine-tuning pretrained ensembles) if instantiation and training happened in separate steps. Would it be possible to decouple the instantiation and training steps so that training can be continued? Or is continuation of training already available in some other way?
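For concreteness, here is a rough sketch of the usage I have in mind. The `init_estimators` method is hypothetical (it does not exist in the current API), and `MyCNN` and `train_loader` are placeholders; only `VotingClassifier`, `fit`, and the `load_state_dict` inherited from `nn.Module` are existing names:

    import torch
    import torch.nn as nn
    from torchensemble import VotingClassifier

    class MyCNN(nn.Module):  # placeholder base learner for illustration
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(28 * 28, 10)

        def forward(self, x):
            return self.fc(x.flatten(1))

    ensemble = VotingClassifier(estimator=MyCNN, n_estimators=5, cuda=False)

    # Hypothetical: build the base estimators without training them.
    ensemble.init_estimators()

    # Load pretrained weights, then resume training from them.
    ensemble.load_state_dict(torch.load("pretrained_ensemble.pth"))
    ensemble.fit(train_loader, epochs=10)  # train_loader: an existing DataLoader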

jtpdowns avatar Mar 28 '23 19:03 jtpdowns

It seems like this might be straightforward to implement for any class where all of the estimators are initialized at once (i.e., I believe adversarial training, bagging, fusion, gradient boosting, soft gradient boosting, and voting), since they all instantiate the pool inside fit:

    # Instantiate a pool of base estimators, optimizers, and schedulers.
    estimators = []
    for _ in range(self.n_estimators):
        estimators.append(self._make_estimator())
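One way to do that (just a sketch of the idea, not the library's actual code) would be to guard this instantiation on the existing `self.estimators_` ModuleList, so that a second call to fit continues training rather than starting over:

    # Sketch: only instantiate when the ensemble is empty, so calling
    # fit() again continues training the existing estimators instead of
    # replacing them with freshly initialized ones.
    if len(self.estimators_) == 0:
        for _ in range(self.n_estimators):
            self.estimators_.append(self._make_estimator())
    estimators = list(self.estimators_)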

For fast geometric and snapshot ensembles, it seems you could still manage a list and continue training from its last element (instantiating the first estimator into an otherwise empty list when the ensemble is new). Something along these lines, again only a sketch:
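    import copy

    # Sketch: resume from the most recent estimator in the list, or
    # instantiate the first estimator into an otherwise empty list
    # when starting a new ensemble.
    if len(self.estimators_) > 0:
        estimator = copy.deepcopy(self.estimators_[-1])
    else:
        estimator = self._make_estimator()
        self.estimators_.append(estimator)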

jtpdowns avatar Mar 28 '23 19:03 jtpdowns

Hi @jtpdowns, I think you are right. It would be convenient if we could decouple the training part from the model initialization part. A pull request would be very much appreciated.

xuyxu avatar Mar 29 '23 14:03 xuyxu