
Best way to add L2 norm to the cost?


Hi devs,

What is the right way to add the squared sum of the values of some input layers to the cost? Please give me a hint, and I'll try to make an implementation.

buriy avatar Dec 12 '15 20:12 buriy

Could you give me some additional details of what you are trying to do?

By input layers, do you mean the input to the network? Or do you just mean layers other than the final cost layer? And by values, do you mean the activations, or the parameters (weights) themselves?

I imagine you are trying to do some kind of regularization by penalizing the magnitude of the weights -- is that correct?

apark263 avatar Jan 14 '16 19:01 apark263

Yes, I'm trying to do L2 penalization. By "input" I mean hidden / non-final layers, and by values I mean the weights themselves. Something like http://lasagne.readthedocs.org/en/latest/modules/regularization.html. Ideally, I could sum several different loss terms over the same or different layers and they would work automagically.

Also, I'd like to have weight decay and learning-rate schedules that aren't tied to GradientDescentMomentumWeightDecay; I guess that's in progress for v1.2.

buriy avatar Jan 14 '16 20:01 buriy
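For context, the Lasagne pattern buriy links to treats each penalty as just another scalar term summed into the loss. A minimal sketch of that pattern (the layer sizes and the 1e-4 coefficient are illustrative, not from the thread):

```python
import lasagne
import theano.tensor as T
from lasagne.layers import InputLayer, DenseLayer, get_output
from lasagne.regularization import regularize_layer_params, l2

# Minimal network: input -> hidden -> softmax output.
x = T.matrix('x')
y = T.ivector('y')
l_in = InputLayer((None, 784), input_var=x)
l_hid = DenseLayer(l_in, num_units=100)
l_out = DenseLayer(l_hid, num_units=10,
                   nonlinearity=lasagne.nonlinearities.softmax)

# Data term of the cost.
loss = lasagne.objectives.categorical_crossentropy(get_output(l_out), y).mean()

# The L2 penalty on the hidden layer's weights is just another scalar
# expression, so penalties on different layers (or with different norms)
# can be summed freely into the same cost.
loss = loss + 1e-4 * regularize_layer_params(l_hid, l2)
```

Because every penalty reduces to a scalar added to the cost, arbitrary combinations across layers "work automagically", which is the behavior buriy is asking for in neon.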

I want to do regularization by penalizing the magnitude of the weights too.

But I didn't find any way to do that in Neon.

Has this issue been fixed?

mw66 avatar Apr 17 '17 23:04 mw66
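As a hedged note for later readers: in the neon 1.x line, L2 weight decay is applied through the optimizer rather than through the cost, via the wdecay argument, and Schedule decouples learning-rate changes from the optimizer class. A sketch assuming that API, with illustrative values:

```python
from neon.optimizers import GradientDescentMomentum, Schedule

# Assumption: neon 1.x optimizer API. wdecay adds wdecay * W to each
# weight gradient, which is equivalent to an L2 penalty of
# (wdecay / 2) * ||W||^2 on the cost. The Schedule below multiplies the
# learning rate by 0.1 at epochs 10 and 20 (illustrative values).
schedule = Schedule(step_config=[10, 20], change=0.1)
opt = GradientDescentMomentum(learning_rate=0.01,
                              momentum_coef=0.9,
                              wdecay=0.0005,
                              schedule=schedule)
```

If per-layer settings are needed, as buriy asks, neon's MultiOptimizer maps layer names to separate optimizer instances, which is the closest analogue I know of to Lasagne's per-layer penalties.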