
Backquery - MNIST demo

shiffman opened this issue on Apr 12, 2017 · 2 comments

I'd like to demonstrate a "back query" (feeding a desired output backwards through the network and getting pixels as a result) with this example. In @makeyourownneuralnetwork, the back query concept is explained and a sample image is provided (for the digit 0):

[Screenshot: sample back query output for the digit 0, from the book]

Perhaps I'm too impatient, but after running through 10,000 training images, my back query for zero looks like:

[Screenshot: my back query output for zero after 10,000 training images]

I think I'm missing something. Code in progress is in the backquery branch:

https://github.com/shiffman/Neural-Network-p5/blob/backquery/nn.js#L127
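For context, here's a minimal sketch of what I understand the back query to be, following the approach from the book: invert the output activation with a logit, push the values back through the transposed weight matrices, and rescale into (0.01, 0.99) at each layer. The helper functions and the weight-matrix layout (`wih` = input→hidden, `who` = hidden→output) are assumptions for illustration, not the actual nn.js API.

```js
// Inverse of the sigmoid activation.
function logit(x) {
  return Math.log(x / (1 - x));
}

// Multiply the transpose of matrix m (rows x cols) by vector v.
function transposeTimes(m, v) {
  const cols = m[0].length;
  const out = new Array(cols).fill(0);
  for (let i = 0; i < m.length; i++) {
    for (let j = 0; j < cols; j++) {
      out[j] += m[i][j] * v[i];
    }
  }
  return out;
}

// Rescale a vector into (0.01, 0.99) so the logit stays finite.
function rescale(v) {
  const min = Math.min(...v);
  const max = Math.max(...v);
  const range = (max - min) || 1;
  return v.map(x => 0.01 + 0.98 * (x - min) / range);
}

// targets: desired output, e.g. 0.99 for the "0" node and 0.01 elsewhere.
// wih, who: trained weight matrices (hypothetical names for this sketch).
function backquery(targets, wih, who) {
  // Invert the output activation, push back through the output weights.
  const finalInputs = targets.map(logit);
  const hiddenOutputs = rescale(transposeTimes(who, finalInputs));

  // Invert the hidden activation, push back through the input weights.
  const hiddenInputs = hiddenOutputs.map(logit);
  return rescale(transposeTimes(wih, hiddenInputs)); // 784 values, one per MNIST pixel
}
```

To draw the result, the 784 returned values can be mapped onto a 28×28 grid of grayscale pixels.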

shiffman · Apr 12, 2017

I'll take a look at the code tomorrow (but I'm no JavaScript expert).

What seems immediately suspect is that the learning doesn't appear to have completed sufficiently; over time the shape should become "smooth" across the different colours. The top image looks like the result of a Gaussian blur; the bottom one doesn't yet.

Having said that, even 10,000 training examples should be sufficient to have formed a good shape, so something is off. Do 10,000 training examples (with whatever other parameters, e.g. number of nodes, learning rate, etc.) give you an overall performance of around 95% on the MNIST 10,000-image test set? If yes, this is puzzling.
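For what it's worth, here's a rough sketch of the accuracy check I mean. The `nn.query(pixels)` call is an assumed predict-style method, not necessarily the exact method name in nn.js.

```js
// Run the MNIST test set through the trained network and count how often
// the strongest output node matches the label.
function testAccuracy(nn, testImages, testLabels) {
  let correct = 0;
  for (let i = 0; i < testImages.length; i++) {
    const outputs = nn.query(testImages[i]);              // 10 scores, one per digit
    const guess = outputs.indexOf(Math.max(...outputs));  // index of the highest score
    if (guess === testLabels[i]) correct += 1;
  }
  return correct / testImages.length;                     // expect roughly 0.95
}
```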

Here is someone else's animated visualisation of the network learning; it gets smooth pretty quickly:

https://www.youtube.com/watch?v=1vYaBCM2WOQ