MachineLearning
Add autoencoders
This could be done on EMNIST or a similar dataset.
Train an autoencoder (encoder + decoder) to learn an efficient encoding/feature representation in an unsupervised manner. Then throw away the decoder, freeze the encoder weights, add one or more layers on top (perhaps a single softmax layer would suffice), and do supervised learning on those layers only (or maybe fine-tune the whole network) to classify digits/letters, perhaps using only a subset of the training data.
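A minimal numpy sketch of that pipeline, just to make the steps concrete (this is an illustration, not a proposed implementation): synthetic Gaussian blobs stand in for EMNIST, the sizes and learning rates are arbitrary, and everything is trained with plain gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic stand-in data (EMNIST would go here) ---
n, d, k = 300, 20, 3                      # samples, input dim, classes
centers = rng.normal(size=(k, d)) * 3     # one center per class
labels = rng.integers(0, k, size=n)
X = centers[labels] + rng.normal(size=(n, d))

# --- Step 1: unsupervised autoencoder training ---
h = 5                                     # bottleneck size
We = rng.normal(scale=0.1, size=(d, h)); be = np.zeros(h)   # encoder
Wd = rng.normal(scale=0.1, size=(h, d)); bd = np.zeros(d)   # decoder

lr = 0.01
for _ in range(500):
    H = np.tanh(X @ We + be)              # encode
    R = H @ Wd + bd                       # reconstruct
    G = 2 * (R - X) / n                   # d(MSE)/dR
    gWd, gbd = H.T @ G, G.sum(0)
    GH = (G @ Wd.T) * (1 - H**2)          # backprop through tanh
    gWe, gbe = X.T @ GH, GH.sum(0)
    Wd -= lr * gWd; bd -= lr * gbd
    We -= lr * gWe; be -= lr * gbe

# --- Step 2: throw away decoder, freeze encoder, train a softmax head ---
H = np.tanh(X @ We + be)                  # fixed features from frozen encoder
Wc = np.zeros((h, k)); bc = np.zeros(k)
Y = np.eye(k)[labels]                     # one-hot targets
for _ in range(500):
    Z = H @ Wc + bc
    P = np.exp(Z - Z.max(1, keepdims=True))
    P /= P.sum(1, keepdims=True)          # softmax probabilities
    G = (P - Y) / n                       # cross-entropy gradient
    Wc -= 0.5 * (H.T @ G); bc -= 0.5 * G.sum(0)

acc = (np.argmax(H @ Wc + bc, 1) == labels).mean()
print(f"frozen-encoder softmax accuracy: {acc:.2f}")
```

In a real version the same structure carries over: pretrain on the full unlabeled set, then fit only the head (or fine-tune everything) on a labeled subset.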
This would be a fairly easy and nice example of unsupervised learning, encoder/decoder architectures, transfer learning, unsupervised pretraining and some other buzzwords :)