pytorch_cnn_trainer
LR Finder
🚀 Feature
Try to integrate this LR finder repo. It is just one file: https://github.com/davidtvs/pytorch-lr-finder/blob/master/torch_lr_finder/lr_finder.py
We simply need to edit this file in a few places to support native AMP with torch 1.6, and also edit the places where it supports torch < 1.1, since the PyTorch CNN Trainer repo supports torch 1.6+.
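As a rough sketch of what "support native AMP" could look like, here is how the finder's training step might use `torch.cuda.amp` (available from torch 1.6) instead of the apex path. The model, optimizer, and data below are stand-ins, not names from either repo; `GradScaler(enabled=False)` degrades to a no-op on CPU, so the same code path works with AMP off:

```python
# Hedged sketch: a single AMP-aware training step as the edited lr_finder
# might run it. All names here (model, inputs, targets) are placeholders.
import torch
import torch.nn as nn

model = nn.Linear(4, 2)                        # stand-in for a model_factory model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

use_amp = torch.cuda.is_available()            # autocast needs CUDA to do real mixed precision
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

inputs = torch.randn(8, 4)
targets = torch.randint(0, 2, (8,))

optimizer.zero_grad()
with torch.cuda.amp.autocast(enabled=use_amp):
    loss = criterion(model(inputs), targets)
scaler.scale(loss).backward()                  # plain backward() when AMP is disabled
scaler.step(optimizer)
scaler.update()
```

With `enabled=use_amp`, the same step runs unchanged on CPU-only machines, which keeps the finder testable under pytest without a GPU.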
https://github.com/davidtvs/pytorch-lr-finder
Looks simple, can be done and tested too.
Hey! I'd like to take this up. I've just familiarized myself with deep learning concepts, and this seems like a good way to learn. Any suggestions?
You can certainly take this up. It's up for grabs!
The LR finder is a standard technique whose implementation I have pointed out; most of the work will be editing that code.
It seems simple, but make sure you write both the implementation and tests for it. Use pytest for the tests.
This should go in a separate file called lr_finder.py, since it is not exactly the trainer's role.
Have a go and I will review it. You can discuss your solution and implementation before Hacktoberfest.
Hey, so this repo: https://github.com/davidtvs/pytorch-lr-finder shows two ways the LR finder is used.
We take the model from /pytorch_cnn_trainer/model_factory.py and make it work with our implementation of the LR finder in this repository. That should work, right?
Yes, I think you get the point, but let me clarify what exactly needs to be done.
It has a class for LR finder on this line.
I want that to work with the models in this repository. I think LinearLR and ExponentialLR can be used directly from PyTorch itself; here is the source code from PyTorch.
It can be imported as from torch.optim.lr_scheduler import ExponentialLR, so we can avoid duplicating that code.
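For reference, using PyTorch's built-in scheduler is a one-line import; the optimizer and parameter below are dummies just to show the mechanics (each `scheduler.step()` multiplies the learning rate by `gamma`):

```python
# Minimal demo of PyTorch's own ExponentialLR, which the lr_finder port
# could reuse instead of the vendored scheduler classes.
import torch
from torch.optim.lr_scheduler import ExponentialLR

param = torch.nn.Parameter(torch.zeros(1))     # dummy parameter
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = ExponentialLR(optimizer, gamma=0.9)

optimizer.step()                               # torch >= 1.1: optimizer steps before scheduler
scheduler.step()
lr = optimizer.param_groups[0]["lr"]           # 0.1 * 0.9
```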
Make sure the implementation works with PyTorch 1.6+, as this repository is built for 1.6+ only.
Alright, got it. However, the arguments for LinearLR and ExponentialLR are different, as shown in https://github.com/davidtvs/pytorch-lr-finder/blob/master/torch_lr_finder/lr_finder.py#L578 and here: https://github.com/pytorch/pytorch/blob/master/torch/optim/lr_scheduler.py#L434
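The mismatch is bridgeable: the lr-finder's scheduler is parameterized by `(end_lr, num_iter)` while PyTorch's ExponentialLR takes a per-step `gamma`. One way to reconcile them (a sketch, with hypothetical sweep bounds) is to derive the per-step multiplier that sweeps `start_lr` to `end_lr` in `num_iter` steps:

```python
# Deriving PyTorch's per-step gamma from the lr-finder's (end_lr, num_iter)
# parameterization. Values below are hypothetical sweep bounds.
start_lr, end_lr, num_iter = 1e-7, 10.0, 100

# end_lr = start_lr * gamma ** num_iter  =>  solve for gamma:
gamma = (end_lr / start_lr) ** (1.0 / num_iter)

lr = start_lr
for _ in range(num_iter):
    lr *= gamma        # what ExponentialLR(optimizer, gamma) does each step
# lr is now approximately end_lr
```

So the finder can keep its `(end_lr, num_iter)` interface and internally construct `ExponentialLR(optimizer, gamma)` with the derived gamma.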
I would stick to PyTorch; this library has to work with torch. 😀 Give it a try and let's see if we hit bugs.