
LR Finder

Open oke-aditya opened this issue 5 years ago • 6 comments

🚀 Feature

Try to integrate this LR finder repo. It has just one file: https://github.com/davidtvs/pytorch-lr-finder/blob/master/torch_lr_finder/lr_finder.py

We simply need to edit this file in a few places to support native AMP with torch 1.6 (see the AMP sketch below), and to remove the places where it supports torch < 1.1, since the PyTorch CNN Trainer repo supports torch 1.6+ only.

https://github.com/davidtvs/pytorch-lr-finder

Looks simple, can be done and tested too.
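
For concreteness, the native AMP training step on torch 1.6 looks roughly like this (a sketch only; train_step is an illustrative helper, not a function from either repo):

```python
import torch

scaler = torch.cuda.amp.GradScaler()

def train_step(model, criterion, optimizer, inputs, targets):
    optimizer.zero_grad()
    # Run the forward pass in mixed precision.
    with torch.cuda.amp.autocast():
        loss = criterion(model(inputs), targets)
    # Scale the loss before backward to avoid fp16 gradient underflow.
    scaler.scale(loss).backward()
    scaler.step(optimizer)  # unscales gradients, then calls optimizer.step()
    scaler.update()         # adjusts the loss scale for the next iteration
    return loss.item()
```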

oke-aditya avatar Jul 23 '20 17:07 oke-aditya

Hey! I'd like to take this up. I've just familiarized myself with Deep Learning concepts, this seems like a good way to learn. Any suggestions?

artorias111 avatar Sep 12 '20 05:09 artorias111

You can surely take this up. It's up for grabs!

The LR finder is a standard technique whose implementation I have pointed out above. Most of the code is going to be an edit of that.

It seems simple, but make sure you write the implementation and tests for it as well. Use pytest for the tests.
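
A test could be as small as this (the LRFinder API below mirrors the davidtvs repo and is hypothetical until our lr_finder.py exists):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def test_range_test_populates_history():
    # Tiny synthetic setup so the test runs in milliseconds on CPU.
    model = nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-5)
    criterion = nn.CrossEntropyLoss()
    loader = DataLoader(
        TensorDataset(torch.randn(32, 4), torch.randint(0, 2, (32,))),
        batch_size=8,
    )

    from lr_finder import LRFinder  # hypothetical module under test
    finder = LRFinder(model, optimizer, criterion)
    finder.range_test(loader, end_lr=1.0, num_iter=10)

    # The range test should record one (lr, loss) pair per iteration.
    assert len(finder.history["lr"]) == 10
    assert len(finder.history["loss"]) == 10
```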

This goes in a separate file called lr_finder.py, since it is not exactly the role of the trainer.

Have a go. I will review it. You can discuss your solution and implementation before Hacktoberfest.

oke-aditya avatar Sep 12 '20 05:09 oke-aditya

Hey, so this: https://github.com/davidtvs/pytorch-lr-finder shows two ways the LR finder can be used.

We take the model from /pytorch_cnn_trainer/model_factory.py and make it work with our implementation of the LR finder in this repository. That should work, right?
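
Something like this, I imagine (the create_model call is a placeholder for whatever model_factory.py actually exposes, and LRFinder is the class we would write):

```python
import torch
import torch.nn as nn
from pytorch_cnn_trainer import model_factory  # assumed import path
from lr_finder import LRFinder                 # module to be written

# Placeholder factory signature; adjust to the real model_factory API.
model = model_factory.create_model("resnet18", num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-6)
criterion = nn.CrossEntropyLoss()

finder = LRFinder(model, optimizer, criterion)
finder.range_test(train_loader, end_lr=10, num_iter=100)  # train_loader: any DataLoader
```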

artorias111 avatar Oct 01 '20 07:10 artorias111

Yes, I guess you get the point, but let me clarify exactly what can be done.

It has a class for the LR finder on this line. I want that to work with the models in this repository. I think LinearLR and ExponentialLR can be used directly from PyTorch itself; here is the source code from PyTorch.

It can be used as from torch.optim.lr_scheduler import ExponentialLR, so we can avoid that code.
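
For example, a bare-bones exponential LR sweep with the stock scheduler (just a sketch of the idea):

```python
import torch
from torch.optim.lr_scheduler import ExponentialLR

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-5)  # starting LR
scheduler = ExponentialLR(optimizer, gamma=1.5)           # multiply LR by gamma each step

for step in range(20):
    # ... run one training batch here, record the loss ...
    optimizer.step()
    scheduler.step()
    print(step, scheduler.get_last_lr())
```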

Make sure the implementation works with PyTorch 1.6+, as this repository is built for 1.6+ only.

oke-aditya avatar Oct 01 '20 08:10 oke-aditya

Alright, got it. However, the arguments for LinearLR and ExponentialLR are different, as shown in https://github.com/davidtvs/pytorch-lr-finder/blob/master/torch_lr_finder/lr_finder.py#L578 and here: https://github.com/pytorch/pytorch/blob/master/torch/optim/lr_scheduler.py#L434
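
One way to bridge the gap might be deriving torch's per-step factor gamma from the (end_lr, num_iter) pair that the davidtvs scheduler takes (a sketch):

```python
import torch
from torch.optim.lr_scheduler import ExponentialLR

start_lr, end_lr, num_iter = 1e-5, 10.0, 100
# Choose gamma so that start_lr * gamma ** num_iter == end_lr.
gamma = (end_lr / start_lr) ** (1.0 / num_iter)

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=start_lr)
scheduler = ExponentialLR(optimizer, gamma=gamma)
```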

artorias111 avatar Oct 02 '20 03:10 artorias111

I would stick to PyTorch; this library has to work with torch. 😀 Give it a try and let's see if we hit any bugs.

oke-aditya avatar Oct 02 '20 06:10 oke-aditya