
Integrating with PyTorch Lightning

Open jaywonchung opened this issue 2 years ago • 4 comments

Hi,

I'm exploring whether it's possible to integrate adaptdl with PyTorch Lightning (specifically for the Deepspeech2 open-source repo). Potential problems I see are:

  1. Would the adaptdl-specific dataloader and model be compatible with Lightning?
  2. Also, what should I do about the remaining_epochs_until iterator? Or, if I give up the remaining_epochs_until iterator and instead stop training once a specific validation metric is reached, would that work?

Thanks a lot.

jaywonchung avatar Apr 16 '22 23:04 jaywonchung

Hi! I think integrating with Lightning would be a great idea. For your two questions:

  1. Not sure about Lightning specifically, but from our experience integrating with other frameworks, this is a likely area of friction. The model can usually be integrated just fine, but data-loading logic varies widely between frameworks and often requires a few workarounds.
  2. I think the remaining_epochs_until iterator is still required, though in the future AdaptDL should relax this constraint. You should still be able to break out of the loop at any time, e.g. once a certain validation metric is reached.
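To illustrate, here is a minimal sketch of the loop shape described above: keep the remaining_epochs_until iterator, but break out early once a target validation metric is reached. The `remaining_epochs_until` below is a plain-Python stand-in (an assumption, not AdaptDL's implementation) with the same contract as `adaptdl.torch.remaining_epochs_until`, which in the real library also resumes from the last completed epoch after the job is rescaled and restarted. The metric values and target threshold are hypothetical placeholders.

```python
def remaining_epochs_until(total_epochs, start_epoch=0):
    # Stand-in for adaptdl.torch.remaining_epochs_until: yields epoch
    # indices. The real AdaptDL iterator restores start_epoch from a
    # checkpoint when the job is restarted on a new set of replicas.
    for epoch in range(start_epoch, total_epochs):
        yield epoch

def train_one_epoch(epoch):
    # Placeholder for the actual training step over the dataloader.
    pass

def validate(epoch):
    # Placeholder validation metric that improves each epoch.
    return 0.80 + 0.02 * epoch

TARGET_METRIC = 0.9  # hypothetical early-stopping threshold
stopped_at = None
for epoch in remaining_epochs_until(100):
    train_one_epoch(epoch)
    if validate(epoch) >= TARGET_METRIC:
        stopped_at = epoch
        break  # breaking early is fine; AdaptDL only needs the loop itself
```

The point is that early stopping on a validation metric composes with the iterator rather than replacing it: the `break` exits the loop cleanly, while the iterator remains in place so AdaptDL can manage checkpointing and restarts.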

aurickq avatar May 25 '22 19:05 aurickq

@jaywonchung Did you make it work and did you publish it? I am interested in this as well.

Maximilian6 avatar Apr 28 '24 20:04 Maximilian6

Nah, I just reverted to an older version of Deepspeech2 that didn't use PyTorch Lightning and integrated adaptdl there.

jaywonchung avatar Apr 28 '24 20:04 jaywonchung

Too bad, but thank you for the fast reply :)

Maximilian6 avatar Apr 28 '24 20:04 Maximilian6