
L2 regularization problem

Open heibanke opened this issue 8 years ago • 2 comments

Hello, andersbll:

Thanks for your code, it has been very useful for me. I read through it and would like to ask a question.

Line 68 in layers.py reads:

self.dW = np.dot(self.last_input.T, output_grad)/n - self.weight_decay*self.W

For L2 regularization, I think this should be changed to:

self.dW = np.dot(self.last_input.T, output_grad)/n + self.weight_decay*self.W

Could you explain why you use "- self.weight_decay*self.W"?
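
For reference, here is a quick numerical check of the sign (a standalone sketch, assuming the loss includes a penalty 0.5*weight_decay*||W||^2 and that the weights are updated as W -= learn_rate*dW; I have not checked how the repository applies dW):

import numpy as np

# Standalone sketch: gradient of an L2-penalized squared loss for a linear layer.
# loss(W) = 0.5*||X.dot(W) - y||^2 / n + 0.5*weight_decay*||W||^2
np.random.seed(0)
X = np.random.randn(8, 3)          # plays the role of self.last_input
y = np.random.randn(8, 1)
W = np.random.randn(3, 1)
weight_decay = 0.1
n = X.shape[0]

def loss(W):
    return 0.5*np.sum((X.dot(W) - y)**2)/n + 0.5*weight_decay*np.sum(W**2)

output_grad = X.dot(W) - y
dW = np.dot(X.T, output_grad)/n + weight_decay*W   # '+' sign as proposed above

# Finite-difference check of dW.
eps = 1e-6
num_dW = np.zeros_like(W)
for i in range(W.size):
    Wp, Wm = W.copy(), W.copy()
    Wp[i] += eps
    Wm[i] -= eps
    num_dW[i] = (loss(Wp) - loss(Wm))/(2*eps)

print(np.allclose(dW, num_dW, atol=1e-5))  # True

With the '+' sign the analytic gradient matches the finite-difference gradient; with the '-' sign it does not.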

B.R heibanke

heibanke avatar Jun 15 '16 10:06 heibanke

Another problem, in helpers.py:

def tanh_d(x):
    e = np.exp(2*x)
    return (e-1)/(e+1)

should be modified to the following:

def tanh_d(x):
    # derivative of tanh: d/dx tanh(x) = 1 - tanh(x)**2
    e = tanh(x)
    return 1 - e**2
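
A quick standalone check (my own sketch, using numpy directly) confirms that the current expression reproduces tanh itself, while 1 - tanh(x)**2 matches a numerical derivative:

import numpy as np

x = np.linspace(-3, 3, 7)

# Current helpers.py expression: (e-1)/(e+1) with e = exp(2x) is just tanh(x).
e = np.exp(2*x)
print(np.allclose((e - 1)/(e + 1), np.tanh(x)))   # True

# Proposed fix: 1 - tanh(x)**2 agrees with a finite-difference derivative.
eps = 1e-6
num_d = (np.tanh(x + eps) - np.tanh(x - eps))/(2*eps)
print(np.allclose(1 - np.tanh(x)**2, num_d))      # True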

B.R heibanke

heibanke avatar Jun 16 '16 09:06 heibanke

I was wondering about the minus-sign, too.

I am also confused about the division by n, although it probably does not matter since it only rescales the effective learning rate.
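
As a sanity check (my own sketch, not code from the repository), averaging the gradient over the batch instead of summing it only scales the update by a constant factor, so it can be absorbed into the learning rate:

import numpy as np

np.random.seed(1)
last_input = np.random.randn(16, 4)    # batch of 16 inputs
output_grad = np.random.randn(16, 2)
n = last_input.shape[0]
learn_rate = 0.1

summed = np.dot(last_input.T, output_grad)         # summed over the batch
averaged = np.dot(last_input.T, output_grad)/n     # averaged over the batch

# The same update results from using the summed gradient with learn_rate/n.
print(np.allclose(learn_rate*averaged, (learn_rate/n)*summed))  # True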

As for tanh_d, the current

def tanh_d(x):
    e = np.exp(2*x)
    return (e-1)/(e+1)

seems to be the same as tanh itself, so I think you are right.

One of the reasons tanh and sigmoid are popular activation functions is that their derivatives can be computed from the output of the forward pass (e.g. tanh'(x) = 1 - tanh(x)**2) without evaluating an expensive function again, but that is not done here.
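
A minimal sketch of what that could look like (my own illustration, not the layer structure used in this repository):

import numpy as np

class Tanh:
    # Illustration only: cache the forward output and reuse it in the
    # backward pass instead of recomputing tanh.
    def fprop(self, x):
        self.last_output = np.tanh(x)
        return self.last_output

    def bprop(self, output_grad):
        return output_grad * (1 - self.last_output**2)

layer = Tanh()
x = np.random.randn(4, 3)
y = layer.fprop(x)
grad = layer.bprop(np.ones_like(y))
print(np.allclose(grad, 1 - np.tanh(x)**2))  # True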

983 avatar Nov 20 '16 17:11 983