
How to use another optimization function, e.g. SGD

TedSIWEILIU opened this issue · 2 comments

Hi, I read through issues #22 and #23 but still couldn't figure out how to change the default optimizer from Adam to torch.optim.SGD. I tried

emodel = ExplicitFactorizationModel(n_iter=15,
                                    embedding_dim=32, #Spotlight default is 32
                                    use_cuda=False,
                                    loss='regression',
                                    l2=0.00005,
                                    optimizer_func=optim.SGD(lr=0.001, momentum=0.9))

but it returns TypeError: __init__() missing 1 required positional argument: 'params'. I know this is probably because I'm not passing self._net.parameters() to the optimizer. Could you suggest how to do that?

TedSIWEILIU · Feb 19 '19 21:02

Yeah, you have to create a function that receives the model's parameters() as its first argument and returns an instantiated PyTorch optimizer object. This test shows an example of how to use Adagrad instead of Adam:

    def adagrad_optimizer(model_params,
                          lr=1e-2,
                          weight_decay=1e-6):

        return torch.optim.Adagrad(model_params,
                                   lr=lr,
                                   weight_decay=weight_decay)

EthanRosenthal · Feb 19 '19 23:02

Thanks, Ethan!

maciejkula · Feb 20 '19 02:02