jsNet
WebAssembly 3.4.1 version on npm doesn't randomize weights
Output of npm list:
-- jsnet@3.4.1
Weights in the network [2, 3, 1] (XOR example) at the second layer stay the same:
[ { bias: 1, weights: [ -0.4384047377812256, 0.71476953983941 ] },
{ bias: 1, weights: [ 0.836493930022522, 0.7845843352178457 ] },
{ bias: 1, weights: [ -0.8790343793408859, -1.049725288141738 ] } ]
This produces the same results after 5 (or any number of) epochs of training.
Interestingly, the JavaScript version does not have this problem.
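A quick way to confirm the weights really aren't moving is to snapshot a layer before and after training and compare the serialized copies. This is a generic sketch with dummy data standing in for `net.layers[1].neurons`, not jsNet-specific code:

```javascript
// Deep-copy a layer's neurons so later training updates don't alias the snapshot.
function snapshotWeights(neurons) {
  return neurons.map(n => ({ bias: n.bias, weights: [...n.weights] }));
}

// True if any weight or bias differs between the two snapshots.
function weightsChanged(before, after) {
  return JSON.stringify(before) !== JSON.stringify(after);
}

// Dummy data: in practice, snapshot net.layers[1].neurons before training,
// train, then compare against the live neurons.
const before = snapshotWeights([{ bias: 1, weights: [-0.43, 0.71] }]);
const after  = [{ bias: 0.97, weights: [-0.41, 0.73] }];
console.log(weightsChanged(before, after)); // true → the layer trained
```

If this returns false after a full training run, the WASM backend is never writing the updated weights back.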
Sorry for the delayed reply, I just tried this out, and I wasn't actually able to replicate it.
What I did was npm install v3.4.1, navigate to node_modules/jsnet/nodejsDemo.js, and run it, printing out the weights in layer [1] in a callback as console.log(net.layers[1].neurons.map(n => n.weights)).
Is this the same as what you did?
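For reference, that inspection boils down to the following. The stub object here is hypothetical and only mirrors the layer structure shown above; in the real check, `net` is the trained network from nodejsDemo.js:

```javascript
// Hypothetical stub shaped like the network's layer structure above.
const net = {
  layers: [
    null, // input layer, no incoming weights
    { neurons: [
      { bias: 1, weights: [-0.438, 0.715] },
      { bias: 1, weights: [0.836, 0.785] },
    ] },
  ],
};

// Same expression as in the callback: one weight array per neuron in layer [1].
const layerWeights = net.layers[1].neurons.map(n => n.weights);
console.log(layerWeights); // → [[-0.438, 0.715], [0.836, 0.785]]
```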
If this is still an issue, and you made any changes, could you perhaps share what they were? If there were no other changes, then this is somewhat strange, and will need further exploring.
Did the inference look plausible at the end of the training, for you?
I was initially converting the whole network to JSON and then printing the weights, but even with your method the weights stayed the same.
I had tested it only in Windows, but I just tried it out in a fresh VM with Ubuntu and got the same results.
It is very weird! In nodejsDemo.js there are a JS and a WASM version, kinda tangled together. Did you isolate the WASM version?
My best guess is that the random values got stuck in the model after compilation, maybe?
I'm using Windows too, so that's probably not it. And both versions are loaded and executed one after the other to show a speed comparison, but they are not otherwise interlinked.
Are the values at the end correct-ish for XOR? If they are, then the network trained, so the weight values should have changed; but if not, then, as though the learning rate were 0, the weight values for some reason don't get updated.
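One way to make that "correct-ish" check concrete: after training, the four XOR inputs should map close to [0, 1, 1, 0]. A generic sketch, where `forward` stands in for whatever inference call the network exposes and the 0.3 tolerance is arbitrary:

```javascript
// XOR truth table: input pair → target output.
const xorCases = [
  { input: [0, 0], target: 0 },
  { input: [0, 1], target: 1 },
  { input: [1, 0], target: 1 },
  { input: [1, 1], target: 0 },
];

// "Correct-ish": every output lands within 0.3 of its target.
function looksTrained(forward) {
  return xorCases.every(({ input, target }) =>
    Math.abs(forward(input) - target) < 0.3);
}

// An untrained network tends to output ~0.5 everywhere → fails the check.
console.log(looksTrained(() => 0.5)); // false
// A network that learned XOR passes.
console.log(looksTrained(([a, b]) => a ^ b)); // true
```

If this fails while the JS version passes on the same data, that points at the WASM training path specifically.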
Are you using Node at least v8?