
question: which part of your code takes care of proxy_lr vs base_lr?

Open monajalal opened this issue 3 years ago • 3 comments

Hello, in your paper you also tune proxy_lr as a hyperparameter, but I was unsure where in your code proxy_lr is defined. Could you please guide me? https://github.com/euwern/proxynca_pp/blob/master/train.py

monajalal avatar Sep 22 '21 23:09 monajalal

See lines 307 (base_lr) and 311 (proxy_lr) of train.py. These values are set in the config file (e.g. cub.json, line 43 (proxy_lr) and line 46 (base_lr)).
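For readers unfamiliar with the repo's config mechanism, a hypothetical illustration of how two separate learning rates could live in a JSON config such as cub.json (the real key names, nesting, and values in the repo may differ):

```python
import json

# Hypothetical config fragment; the actual cub.json layout may differ.
config_text = """
{
  "proxy_lr": 0.4,
  "base_lr": 0.001
}
"""
config = json.loads(config_text)
print(config["proxy_lr"], config["base_lr"])  # the two tuned hyperparameters
```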

euwern avatar Sep 23 '21 03:09 euwern

thanks for your response.

Did you mean that line 311 is the base_lr, since it has the word "base" in it?

(screenshot: Screen Shot 2021-09-23 at 9 00 52 PM)

(screenshot: Screen Shot 2021-09-23 at 9 01 18 PM)

Also, I am not sure where proxy_lr is used in the code. I found this, but it just says lr. Could you please point me to it?

(screenshot: Screen Shot 2021-09-23 at 9 02 14 PM)

monajalal avatar Sep 24 '21 01:09 monajalal

It is used by the optimizer (train.py, line 290: `config['opt']['type']`), which is defined in the config file (cub.json, line 33: `torch.optim.Adam`). Do check out the PyTorch documentation on how to use an optimizer. The proxy layer has a different learning rate than the rest of the layers.

I hope it helps.
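To make the per-layer learning-rate idea concrete, here is a minimal sketch (not the repo's exact code) of PyTorch's per-parameter-group mechanism, which is how one learning rate can be applied to the backbone and a different one to the proxies. The module shapes and lr values below are illustrative assumptions, not values from cub.json.

```python
import torch

backbone = torch.nn.Linear(16, 8)                 # stands in for the embedding network
proxies = torch.nn.Parameter(torch.randn(10, 8))  # stands in for the proxy layer

base_lr = 1e-3   # would come from the config (illustrative value)
proxy_lr = 4e-1  # ditto; proxies often use a much larger lr

# Each dict is a separate parameter group with its own learning rate.
opt = torch.optim.Adam([
    {"params": backbone.parameters(), "lr": base_lr},
    {"params": [proxies], "lr": proxy_lr},
])

# The optimizer keeps a distinct lr per group:
print(opt.param_groups[0]["lr"], opt.param_groups[1]["lr"])
```

Any `torch.optim` optimizer accepts this list-of-dicts form, so the same pattern works regardless of which optimizer type the config selects.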

euwern avatar Sep 24 '21 02:09 euwern