Michael Truell

Results 216 comments of Michael Truell

Still have to do a storage test for Backpropagation though

Thank you for the feedback! @jpetso

I do agree that the initializers do need to be more self documenting, but is the builder design pattern common in C++? I have mostly seen it used with Java.

@jpetso. Yes, but if we went with the "making everything a setter design scheme," I believe that all current models would be valid with preset constants, which removes the need...

@joshuagruenstein @FlyingGraysons. A getter and setter design scheme would look like this:

``` cpp
net::NeuralNet neuralNetwork;
neuralNetwork.setNumInputs(1);
neuralNetwork.setNumOutputs(1);
neuralNetwork.setNumHiddenLayers(2);
neuralNetwork.setNumNeuronsPerHiddenLayer(4);
neuralNetwork.setActivationFunction("sigmoid");
```

Or we could have each setter return the object, allowing for one-liners.
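A minimal sketch of that idea, assuming the setter names from the example above (the class body here is hypothetical, not the actual Fido implementation): each setter returns a reference to `*this`, so calls can be chained into a single expression.

``` cpp
#include <string>

namespace net {
class NeuralNet {
public:
    // Each setter returns *this so calls can be chained.
    NeuralNet& setNumInputs(int n)                { numInputs = n; return *this; }
    NeuralNet& setNumOutputs(int n)               { numOutputs = n; return *this; }
    NeuralNet& setNumHiddenLayers(int n)          { numHiddenLayers = n; return *this; }
    NeuralNet& setNumNeuronsPerHiddenLayer(int n) { neuronsPerHiddenLayer = n; return *this; }
    NeuralNet& setActivationFunction(const std::string& name) { activation = name; return *this; }

    // Hypothetical members, shown public only to keep the sketch short.
    int numInputs = 0, numOutputs = 0, numHiddenLayers = 0, neuronsPerHiddenLayer = 0;
    std::string activation;
};
}

// Usage: the whole configuration becomes a one-liner.
// net::NeuralNet n = net::NeuralNet().setNumInputs(1).setNumOutputs(1)
//     .setNumHiddenLayers(2).setNumNeuronsPerHiddenLayer(4)
//     .setActivationFunction("sigmoid");
```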

@FlyingGraysons Do you need any help?

Adadelta done

It's super easy to add new trainers now. Check out the backpropagation source: https://github.com/FidoProject/Fido/blob/master/src/Backpropagation.cpp. The main method to override is just getChangeInWeight().
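To illustrate the extension point being described: only the method name `getChangeInWeight` comes from the comment above; the base-class name and the signature here are assumptions for the sketch, not the actual Fido API.

``` cpp
#include <cmath>

// Hypothetical trainer interface: a new trainer only has to say how a
// single weight should change given its error gradient.
class Trainer {
public:
    virtual ~Trainer() = default;
    // Returns the update to apply to one weight.
    virtual double getChangeInWeight(double gradient) = 0;
};

// Example subclass: plain gradient descent, stepping against the gradient.
class VanillaGradientDescent : public Trainer {
public:
    explicit VanillaGradientDescent(double learningRate) : lr(learningRate) {}
    double getChangeInWeight(double gradient) override {
        return -lr * gradient;
    }
private:
    double lr;
};
```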

This page details the math of the trainers: http://caffe.berkeleyvision.org/tutorial/solver.html.
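For reference, the per-weight update behind the Adadelta trainer mentioned above follows Zeiler's 2012 rule (running averages of squared gradients and squared updates); the variable and function names below are illustrative, not taken from the Fido source.

``` cpp
#include <cmath>

// Running state kept per weight for Adadelta.
struct AdadeltaState {
    double avgSqGradient = 0;  // decaying average E[g^2]
    double avgSqUpdate = 0;    // decaying average E[dx^2]
};

// Returns the change in one weight given its gradient.
double adadeltaChangeInWeight(AdadeltaState& s, double gradient,
                              double rho = 0.95, double epsilon = 1e-6) {
    s.avgSqGradient = rho * s.avgSqGradient + (1 - rho) * gradient * gradient;
    double update = -std::sqrt(s.avgSqUpdate + epsilon)
                    / std::sqrt(s.avgSqGradient + epsilon) * gradient;
    s.avgSqUpdate = rho * s.avgSqUpdate + (1 - rho) * update * update;
    return update;
}
```

Note that Adadelta needs no hand-tuned learning rate: the ratio of the two running averages sets the step size automatically.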