Switchable-Whitening

How to early-stop the update of branch weights, as described in the paper?

Open JiyueWang opened this issue 4 years ago • 3 comments

JiyueWang avatar Apr 18 '20 01:04 JiyueWang

You can split the parameters into two groups in the optimizer, as in the attached screenshots (issue1, issue2). Then you can set the learning rate of the SW parameters to zero in the middle of training.
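A minimal sketch of that suggestion in PyTorch. The module names (`sw`, `backbone`) and the name-matching predicate are illustrative placeholders, not taken from the repo; in practice you would match the actual SW branch-weight parameter names:

```python
import torch

# Toy model: one "SW" submodule and one "backbone" submodule.
model = torch.nn.ModuleDict({
    "sw": torch.nn.Linear(4, 4),
    "backbone": torch.nn.Linear(4, 2),
})

def is_sw_weight(name):
    # Placeholder predicate: match the SW branch-weight names in your model.
    return name.startswith("sw")

sw_params = [p for n, p in model.named_parameters() if is_sw_weight(n)]
other_params = [p for n, p in model.named_parameters() if not is_sw_weight(n)]

# Two param groups so the two learning rates can diverge later.
optimizer = torch.optim.SGD(
    [{"params": other_params, "lr": 0.1},
     {"params": sw_params, "lr": 0.1}],
    momentum=0.9,
)

# ... at the chosen point in training, freeze the SW branch weights
# by zeroing only that group's learning rate:
optimizer.param_groups[1]["lr"] = 0.0
```

With the SW group's learning rate at zero, its weights receive no further updates while the rest of the network keeps training.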

XingangPan avatar Apr 20 '20 15:04 XingangPan

Can you show some results indicating how serious the overfitting problem is?

JiyueWang avatar Apr 22 '20 07:04 JiyueWang

@JiyueWang The initial version of SW is implemented with SVD, where omitting early stopping led to a drop of about 0.5 points. The current version of SW uses Newton's iteration, which is naturally stochastic, so overfitting is alleviated. We observe that the effect of early stopping is negligible in this case.
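For context, the Newton's-iteration approach replaces the exact eigendecomposition/SVD with a Newton-Schulz iteration that approximates the inverse square root of the covariance matrix. A minimal sketch, assuming a plain symmetric positive-definite `sigma` (the function name and iteration count are illustrative, not from the repo):

```python
import torch

def inv_sqrt_newton(sigma, n_iters=5, eps=1e-5):
    """Approximate sigma^{-1/2} via coupled Newton-Schulz iteration."""
    d = sigma.size(0)
    sigma = sigma + eps * torch.eye(d)   # small ridge for numerical stability
    norm = sigma.norm(p="fro")           # scale so the iteration converges
    Y = sigma / norm                     # Y_k -> (sigma/norm)^{1/2}
    Z = torch.eye(d)                     # Z_k -> (sigma/norm)^{-1/2}
    I = torch.eye(d)
    for _ in range(n_iters):
        T = 0.5 * (3.0 * I - Z @ Y)
        Y = Y @ T
        Z = T @ Z
    # Undo the scaling: sigma^{-1/2} = (sigma/norm)^{-1/2} / sqrt(norm)
    return Z / norm.sqrt()
```

Because only matrix multiplications are involved (no SVD), the op is GPU-friendly, and truncating the iteration gives an approximate whitening matrix, which is the source of the stochasticity mentioned above.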

XingangPan avatar May 01 '20 15:05 XingangPan