MLAlgorithms
Implement Gradient
I notice you use a 3rd party module to evaluate the gradient of your cost function in your GD routine. What was the reasoning behind this, and why not implement it yourself?
- There are dozens of functions in the deep learning module. I prefer simplicity, so that everyone can understand the concepts behind them.
- Flexibility. People can play around with custom cost functions without having to differentiate them by hand (see the sketch below).
But we can definitely get rid of this library in the linear models.
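To illustrate the flexibility point, here is a minimal sketch of how an automatic-differentiation library such as autograd can drive a plain gradient descent loop on a custom cost function. The library choice, the regularized-MSE cost, and all names and hyperparameters below are illustrative assumptions, not the project's actual code.

```python
# Sketch only: assumes the 3rd-party gradient module is something like autograd.
import autograd.numpy as np  # thin wrapper around numpy that autograd can trace
from autograd import grad

def custom_cost(w, X, y):
    # Any differentiable cost works here, e.g. MSE plus an L2 penalty.
    residuals = np.dot(X, w) - y
    return np.mean(residuals ** 2) + 0.1 * np.sum(w ** 2)

# grad() differentiates custom_cost with respect to its first argument (w),
# so there is no need to work out the derivative by hand.
cost_grad = grad(custom_cost)

# Toy data (hypothetical): 100 samples, 3 features, known weights.
X = np.random.randn(100, 3)
y = np.dot(X, np.array([1.0, -2.0, 0.5]))

w = np.zeros(3)
for _ in range(200):
    w -= 0.1 * cost_grad(w, X, y)  # plain gradient descent step

print(w)  # close to the true weights, shrunk slightly by the L2 penalty
```

Swapping in a different cost only means editing `custom_cost`; the gradient descent loop itself never changes.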
For completeness' sake: the closed-form derivative of MSE (i.e. the cost function used in linear regression) is fairly straightforward. The problem, however, is computing gradients for more complicated cost functions like the ones seen in neural networks.
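For reference, a minimal sketch of that closed-form gradient for linear regression, assuming the cost is $J(w) = \frac{1}{n}\lVert Xw - y\rVert^2$ so the gradient is $\frac{2}{n}X^\top(Xw - y)$. The function name and the toy data are made up for illustration.

```python
import numpy as np

def mse_gradient(w, X, y):
    """Closed-form gradient of J(w) = (1/n) * ||Xw - y||^2,
    which works out to (2/n) * X^T (Xw - y)."""
    n = X.shape[0]
    return (2.0 / n) * X.T.dot(X.dot(w) - y)

# Hypothetical usage: a single gradient-descent step for linear regression.
X = np.random.randn(100, 3)
y = X.dot(np.array([1.0, -2.0, 0.5]))
w = np.zeros(3)
w -= 0.1 * mse_gradient(w, X, y)
```

This works because MSE is a simple quadratic in `w`; for a multi-layer network the cost is a deep composition of nonlinearities, and deriving and maintaining such expressions by hand is exactly what the automatic differentiation library avoids.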