MachineLearning
From linear regression towards neural networks...
As specified in the README, there are some dubious design issues (there are some 'historical' reasons for them, but that doesn't matter). The code should be refactored to be more intuitive. This is...
For now only the most important ones are implemented: identity, sigmoid and relu. Some more could be added, such as hyperbolic tangent, leaky relu, softplus and so on... Postponed until...
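
Just to fix the ideas, a minimal sketch of the candidate activations mentioned above (tanh, leaky ReLU, softplus), each with its derivative. The struct names and the static `f`/`df` interface here are made up for illustration, not the classes used in the project:

```cpp
#include <cmath>

// tanh activation and its derivative 1 - tanh^2(x)
struct Tanh {
    static double f(double x)  { return std::tanh(x); }
    static double df(double x) { const double t = std::tanh(x); return 1.0 - t * t; }
};

// leaky relu: identity for positive inputs, a small slope for negative ones
struct LeakyRelu {
    static constexpr double alpha = 0.01;
    static double f(double x)  { return x > 0 ? x : alpha * x; }
    static double df(double x) { return x > 0 ? 1.0 : alpha; }
};

// softplus: a smooth approximation of relu, its derivative is the sigmoid
struct Softplus {
    static double f(double x)  { return std::log1p(std::exp(x)); } // log(1 + e^x)
    static double df(double x) { return 1.0 / (1.0 + std::exp(-x)); }
};
```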
Something like deep Q-learning. Perhaps an example related to physics. Pole balancing is a typical subject.
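
For reference, the temporal-difference target that deep Q-learning would fit with the network (the textbook DQN formulation, none of this exists in the code yet):

$$ y_t = r_t + \gamma \max_{a'} Q(s_{t+1}, a'; \theta^-) $$

where $\gamma$ is the discount factor and $\theta^-$ are the parameters of a periodically frozen copy of the network (the target network).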
For now only L2, L1 and binary cross-entropy losses are implemented. Should be enough for simple examples/cases... but more could be added (such as Huber and so on). This is...
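
A minimal sketch of the Huber loss mentioned above: quadratic near zero (like L2), linear in the tails (like L1), so it is less sensitive to outliers while staying differentiable at zero. The `HuberLoss` name and the `loss`/`grad` interface are hypothetical, not the project's:

```cpp
#include <cmath>

// Huber loss: 0.5*d^2 for |d| <= delta, delta*(|d| - 0.5*delta) otherwise
struct HuberLoss {
    double delta = 1.0; // threshold between the quadratic and the linear regime

    double loss(double predicted, double target) const {
        const double a = std::abs(predicted - target);
        return a <= delta ? 0.5 * a * a : delta * (a - 0.5 * delta);
    }

    // derivative with respect to the prediction
    double grad(double predicted, double target) const {
        const double d = predicted - target;
        return std::abs(d) <= delta ? d : (d > 0 ? delta : -delta);
    }
};
```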
It wasn't needed for the multilayer perceptron, but if I want to go much further with this, it's worth having it implemented. It's not such a big deal to...
It's useful in many circumstances, for example for https://github.com/aromanro/MachineLearning/issues/21
This could be done for EMNIST or something else. Train an autoencoder (encoder + decoder) to learn an efficient encoding/feature representation in an unsupervised manner, then throw away the decoder,...
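
A self-contained toy sketch of the idea, using nothing from the repository (plain C++, a single sigmoid hidden layer, squared-error reconstruction, made-up one-hot data): train the encoder + decoder on reconstruction, then keep only `encode()` as the learned feature extractor:

```cpp
#include <cmath>
#include <cstdlib>
#include <iostream>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

struct Autoencoder {
    Mat W1, W2; // encoder and decoder weights
    Vec b1, b2; // encoder and decoder biases

    Autoencoder(int in, int hidden) : W1(hidden, Vec(in)), W2(in, Vec(hidden)), b1(hidden), b2(in) {
        for (auto& row : W1) for (auto& w : row) w = 0.1 * (std::rand() / (double)RAND_MAX - 0.5);
        for (auto& row : W2) for (auto& w : row) w = 0.1 * (std::rand() / (double)RAND_MAX - 0.5);
    }

    Vec encode(const Vec& x) const {
        Vec h(W1.size());
        for (size_t j = 0; j < W1.size(); ++j) {
            double s = b1[j];
            for (size_t i = 0; i < x.size(); ++i) s += W1[j][i] * x[i];
            h[j] = sigmoid(s);
        }
        return h;
    }

    Vec decode(const Vec& h) const {
        Vec y(W2.size());
        for (size_t k = 0; k < W2.size(); ++k) {
            double s = b2[k];
            for (size_t j = 0; j < h.size(); ++j) s += W2[k][j] * h[j];
            y[k] = sigmoid(s);
        }
        return y;
    }

    // one SGD step on the squared reconstruction error for a single sample
    void trainStep(const Vec& x, double lr) {
        const Vec h = encode(x);
        const Vec y = decode(h);
        // output deltas: (y - x) * sigmoid'(s) = (y - x) * y * (1 - y)
        Vec dOut(y.size());
        for (size_t k = 0; k < y.size(); ++k) dOut[k] = (y[k] - x[k]) * y[k] * (1.0 - y[k]);
        // hidden deltas, backpropagated through the decoder weights
        Vec dHid(h.size(), 0.0);
        for (size_t j = 0; j < h.size(); ++j) {
            for (size_t k = 0; k < y.size(); ++k) dHid[j] += W2[k][j] * dOut[k];
            dHid[j] *= h[j] * (1.0 - h[j]);
        }
        // decoder update
        for (size_t k = 0; k < y.size(); ++k) {
            for (size_t j = 0; j < h.size(); ++j) W2[k][j] -= lr * dOut[k] * h[j];
            b2[k] -= lr * dOut[k];
        }
        // encoder update
        for (size_t j = 0; j < h.size(); ++j) {
            for (size_t i = 0; i < x.size(); ++i) W1[j][i] -= lr * dHid[j] * x[i];
            b1[j] -= lr * dHid[j];
        }
    }
};

int main() {
    Autoencoder ae(8, 3); // 8 inputs compressed to 3 features
    const Mat data = { {1,0,0,0,0,0,0,0}, {0,1,0,0,0,0,0,0}, {0,0,1,0,0,0,0,0}, {0,0,0,1,0,0,0,0} };
    for (int epoch = 0; epoch < 5000; ++epoch)
        for (const auto& x : data) ae.trainStep(x, 0.5);
    // after training, the decoder can be thrown away; encode() yields the learned features
    for (const auto& x : data) {
        for (double f : ae.encode(x)) std::cout << f << ' ';
        std::cout << '\n';
    }
}
```

For EMNIST the same structure would just be wider/deeper and trained on the images, with the encoder output then fed to whatever comes next (e.g. a classifier).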