towerpark

13 comments by towerpark

> +1
> Just discovered that I must have accidentally hit this button (don't know since when). It should be something that is only temporarily in effect. Right now it...

Hi, I have a question about the Step LR scheduler, which I'm making a PR for. My question is, say the initial learning rate and gamma used to create the...
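For reference, step decay is usually defined as lr = initial_lr * gamma^floor(iteration / step_size). Below is a minimal sketch of that formula in plain Rust (not Burn's actual scheduler code; the parameter names are just illustrative):

```rust
/// Minimal illustration of step LR decay (not Burn's actual scheduler):
/// the learning rate is multiplied by `gamma` once every `step_size` steps.
fn step_lr(initial_lr: f64, gamma: f64, step_size: usize, step: usize) -> f64 {
    initial_lr * gamma.powi((step / step_size) as i32)
}

fn main() {
    // With initial_lr = 0.1, gamma = 0.5, step_size = 10:
    // steps 0..9 -> 0.1, steps 10..19 -> 0.05, steps 20..29 -> 0.025, ...
    for step in [0, 9, 10, 20] {
        println!("step {step}: lr = {}", step_lr(0.1, 0.5, 10, step));
    }
}
```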

@rubenjr0 I do think following existing code for consistency is important, and that's why I'm here asking :)

> We could possibly add a `scheduler.lr()` function to the scheduler trait which would return the current LR. That way, the initial LR can be retrieved before the first `scheduler.step()`....

I second making all the `.step()` methods return the initial value on the first call if we're going to make the behavior consistent across all the schedulers, because:

- Pros: ...
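To make the proposed behavior concrete, here is a rough sketch using a toy trait (not Burn's real `LrScheduler` trait or its actual signatures): the first `.step()` call returns the initial LR, and later calls apply the decay.

```rust
/// Toy trait for illustration only; Burn's actual `LrScheduler` trait differs.
trait Scheduler {
    /// Returns the LR to use for the current step and advances internal state.
    fn step(&mut self) -> f64;
}

struct StepScheduler {
    initial_lr: f64,
    gamma: f64,
    step_size: usize,
    /// Number of times `step()` has been called so far.
    iteration: usize,
}

impl Scheduler for StepScheduler {
    fn step(&mut self) -> f64 {
        // The first call (iteration == 0) yields exactly `initial_lr`,
        // so callers never need to read the LR before calling `step()`.
        let lr = self.initial_lr * self.gamma.powi((self.iteration / self.step_size) as i32);
        self.iteration += 1;
        lr
    }
}

fn main() {
    let mut sched = StepScheduler { initial_lr: 0.1, gamma: 0.5, step_size: 2, iteration: 0 };
    // Prints 0.1, 0.1, 0.05, 0.05, 0.025
    for _ in 0..5 {
        println!("{}", sched.step());
    }
}
```

With this shape, a caller can get the initial value simply by calling `.step()` once, without needing a separate accessor before the first step.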

I would be happy to make a PR for this change :) I will start working on it in the next few days.

I have a question about what should be included when saving a scheduler into a record: Should I save all the struct fields or only fields that change across steps...

> Normally records are supposed to be combined with hyper-parameters, which are supposed to be training configs.

Thank you for addressing my question. Since I need to touch the `.to_record()`...
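For illustration, here is a hedged sketch of that split using made-up types (not Burn's actual `Config`/`Record` machinery): the hyper-parameters stay in the config, and `.to_record()` only serializes the state that changes across steps.

```rust
/// Hypothetical config holding the hyper-parameters (not saved in the record).
struct StepSchedulerConfig {
    initial_lr: f64,
    gamma: f64,
    step_size: usize,
}

/// Hypothetical record: only the state that changes across steps.
struct StepSchedulerRecord {
    iteration: usize,
}

struct StepScheduler {
    initial_lr: f64,
    gamma: f64,
    step_size: usize,
    iteration: usize,
}

impl StepSchedulerConfig {
    /// Builds a fresh scheduler from the hyper-parameters.
    fn init(&self) -> StepScheduler {
        StepScheduler {
            initial_lr: self.initial_lr,
            gamma: self.gamma,
            step_size: self.step_size,
            iteration: 0,
        }
    }
}

impl StepScheduler {
    /// Only the mutable state is serialized; hyper-parameters come from the config.
    fn to_record(&self) -> StepSchedulerRecord {
        StepSchedulerRecord { iteration: self.iteration }
    }

    /// Restores the mutable state on top of a freshly configured scheduler.
    fn load_record(mut self, record: StepSchedulerRecord) -> Self {
        self.iteration = record.iteration;
        self
    }
}

fn main() {
    let config = StepSchedulerConfig { initial_lr: 0.1, gamma: 0.5, step_size: 10 };
    let scheduler = config.init();
    // ... train for a while, then checkpoint only the step counter:
    let record = scheduler.to_record();
    // On resume, rebuild from the config and restore the mutable state:
    let resumed = config.init().load_record(record);
    println!("resumed at iteration {}", resumed.iteration);
}
```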

Thanks for adding that. I should have included the link in the post 😅

@nathanielsimard Would it be all right if I looked into this issue? I'm not sure whether it's fine to do so, since it has already been assigned.