
Implement Gradient

Open · eggie5 opened this issue 9 years ago · 2 comments

I notice you use a 3rd-party module to evaluate the gradient of your cost function in your GD routine. What was the reasoning behind this, and why not implement it yourself?

eggie5 avatar Nov 20 '16 00:11 eggie5

  1. Simplicity. There are dozens of functions in the deep learning module, and I prefer to keep things simple so that everyone can understand the concepts behind them.
  2. Flexibility. People can play around with custom cost functions without having to differentiate them by hand.

But we can definitely get rid of this library in the linear models.

rushter avatar Nov 20 '16 01:11 rushter
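The flexibility rushter describes can also be sketched without an autodiff library, by approximating the gradient numerically with central differences. The helper below is hypothetical (not part of the MLAlgorithms codebase) and assumes the cost is a scalar function of a weight vector:

```python
import numpy as np

def numerical_grad(cost, w, eps=1e-6):
    """Central-difference gradient of an arbitrary scalar cost.

    Works for any cost(w) -> float, so no manual differentiation
    is needed. Hypothetical helper for illustration only.
    """
    grad = np.zeros_like(w, dtype=float)
    for i in range(w.size):
        step = np.zeros_like(w, dtype=float)
        step[i] = eps
        # Perturb one coordinate at a time in both directions.
        grad[i] = (cost(w + step) - cost(w - step)) / (2 * eps)
    return grad

# Example: for f(w) = sum(w**2) the exact gradient is 2*w.
w = np.array([1.0, -2.0, 3.0])
g = numerical_grad(lambda v: np.sum(v ** 2), w)
```

This trades speed and precision for generality, which is roughly the trade-off an autodiff library resolves: it gives exact gradients while still accepting arbitrary user-defined functions.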

For completeness' sake: the closed-form derivative of MSE (i.e., the cost function used in linear regression) is fairly straightforward. The problem, however, is computing gradients for more complicated cost functions, like the ones seen in neural networks.

mynameisvinn avatar Dec 27 '17 21:12 mynameisvinn
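As a sketch of that closed form: assuming the cost is mean squared error, (1/n)·‖Xw − y‖², its gradient is (2/n)·Xᵀ(Xw − y). The snippet below (with made-up example data) checks this analytic gradient against a central-difference approximation:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # example design matrix
y = rng.normal(size=50)        # example targets
w = rng.normal(size=3)         # example weights
n = X.shape[0]

def mse(v):
    return np.mean((X @ v - y) ** 2)

# Closed-form MSE gradient: (2/n) * X^T (Xw - y).
analytic = (2.0 / n) * X.T @ (X @ w - y)

# Central-difference approximation, one coordinate at a time.
eps = 1e-6
numeric = np.zeros_like(w)
for i in range(w.size):
    d = np.zeros_like(w)
    d[i] = eps
    numeric[i] = (mse(w + d) - mse(w - d)) / (2 * eps)
```

The two gradients agree to within finite-difference error, which is exactly why linear models do not need an autodiff dependency, while deeper compositions of nonlinearities make the closed form impractical to derive by hand.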