Seppo Enarvi

Results: 88 comments by Seppo Enarvi

The purpose of `save_hyperparameters()` is to save the constructor arguments of the model class to the checkpoint, so that we can construct a model when we load the checkpoint. Usually...
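
A minimal sketch of the idea; the class name, layer sizes, and checkpoint path are just for illustration:

```python
import torch
from torch import nn
import lightning.pytorch as pl


class LitModel(pl.LightningModule):
    def __init__(self, hidden_size: int = 128, learning_rate: float = 1e-3):
        super().__init__()
        # Stores hidden_size and learning_rate in self.hparams and in the
        # "hyper_parameters" section of any checkpoint saved from this model.
        self.save_hyperparameters()
        self.layer = nn.Linear(32, self.hparams.hidden_size)


# Because the constructor arguments were saved, the class can be rebuilt
# from the checkpoint alone, without passing the arguments again:
# model = LitModel.load_from_checkpoint("path/to/checkpoint.ckpt")
```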

I started using Kubeflow after a long break and ran into this problem again with Kubeflow 1.8.22. I think this workaround should work:

- Make the default value of all Bool...

The workaround gets tricky if you want to pass a boolean pipeline parameter to the component. I thought it would be possible (although ugly) using two `dsl.Condition`s in the pipeline:...
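
A sketch of that shape, assuming the kfp v1 SDK (1.8.x) and a hypothetical `print_flag` component; each branch calls the component with a hard-coded boolean, so the pipeline parameter itself stays a string:

```python
from kfp import dsl
from kfp.components import create_component_from_func


def print_flag(flag: bool):
    # Hypothetical component used only to illustrate the branching.
    print(f"flag={flag}")


print_flag_op = create_component_from_func(print_flag)


@dsl.pipeline(name="bool-param-workaround")
def pipeline(use_feature: str = "False"):
    # Branch on the string value of the pipeline parameter and call the
    # component with a hard-coded boolean in each branch.
    with dsl.Condition(use_feature == "True"):
        print_flag_op(flag=True)
    with dsl.Condition(use_feature == "False"):
        print_flag_op(flag=False)
```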

> I think this is ok, but my doubt with forcing `use_buffers` to be true is what happens when a user has a module with buffers in it that are...
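
For context, a small sketch of what `use_buffers=True` does in `torch.optim.swa_utils.AveragedModel` (the toy model is just for illustration):

```python
import torch
from torch.optim.swa_utils import AveragedModel

model = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.BatchNorm1d(8))

# With use_buffers=True, buffers such as the BatchNorm running statistics are
# averaged together with the parameters, so no separate update_bn() pass over
# the training data is needed at the end of training.
averaged_model = AveragedModel(model, use_buffers=True)
averaged_model.update_parameters(model)
```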

Hi @cyanic-selkie! During training (stage=fit), the actual LightningModule is what we update using the optimizer (I call it the current model), and an AveragedModel is maintained in the background (I...
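
Roughly this pattern, sketched here with plain PyTorch; the toy model and loop are illustrative, not the callback's actual code:

```python
import torch
from torch.optim.swa_utils import AveragedModel

current_model = torch.nn.Linear(8, 2)          # updated by the optimizer
optimizer = torch.optim.SGD(current_model.parameters(), lr=0.1)
average_model = AveragedModel(current_model)   # maintained in the background

for step in range(100):
    optimizer.zero_grad()
    loss = current_model(torch.randn(4, 8)).sum()
    loss.backward()
    optimizer.step()
    # After each optimizer step, fold the current weights into the average.
    average_model.update_parameters(current_model)
```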

The user can now provide either the `update_on_step` or the `update_on_epoch` argument. (In theory also both.) It should be a function that takes the step/epoch number and returns `True` if...
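
A hedged usage sketch: the callback name `WeightAveraging`, its import path, and the step threshold are assumptions here, but the predicate shape follows the description above:

```python
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import WeightAveraging


def update_on_step(step_number: int) -> bool:
    # Start averaging after 1000 optimizer steps and update on every
    # step from then on (the threshold is arbitrary).
    return step_number >= 1000


trainer = Trainer(callbacks=[WeightAveraging(update_on_step=update_on_step)])
```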

I marked this as ready for review. There were no comments on whether it's a problem that we force `use_buffers=True`. Would it make sense to merge this now and perhaps introduce such...

> The other question I have (for the future) is related to fitting both models on GPU. It may make sense to give the ability to keep the AveragedModel on...
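
As an illustration of what that could look like using the `device` argument of `torch.optim.swa_utils.AveragedModel` (a sketch, not the callback's implementation):

```python
import torch
from torch.optim.swa_utils import AveragedModel

model = torch.nn.Linear(8, 2).to("cuda")

# Keeping the averaged copy on the CPU roughly halves the GPU memory spent
# on model weights; update_parameters() then copies the tensors across
# devices on every update.
average_model = AveragedModel(model, device=torch.device("cpu"))
average_model.update_parameters(model)
```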

> Hi! Thanks for this great PR. The current implementation only leverages the `avg_fn` argument; should it also consider the in-place version `multi_avg_fn`? I think we could just pass `**averaged_model_kwargs`....
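
If the extra keyword arguments were forwarded to `AveragedModel`, selecting the in-place EMA update might look like the sketch below; whether `WeightAveraging` accepts `multi_avg_fn` this way depends on the final API:

```python
from torch.optim.swa_utils import get_ema_multi_avg_fn
from lightning.pytorch.callbacks import WeightAveraging

# get_ema_multi_avg_fn() returns an in-place, multi-tensor EMA update function
# from torch.optim.swa_utils. Assuming the callback forwards extra keyword
# arguments to AveragedModel, it could be selected like this:
callback = WeightAveraging(multi_avg_fn=get_ema_multi_avg_fn(decay=0.999))
```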