mx-lsoftmax
Why are the directions of the digits in the visualization exactly the same?
Not only in your README, but also in my own training results, the directions of the digits in the visualization are exactly the same. I would expect the directions to depend on the random initialization to some extent. Is there something in your code that causes this?
Thx~
Hello? Really wondering...
Oh, got it. MXNet handles random seeding differently from NumPy: NumPy seeds from fresh entropy on each run, whereas MXNet uses the same fixed seed by default...
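A quick way to see the difference (sketched here with NumPy as a stand-in; in MXNet the counterpart would be setting `mx.random.seed(...)` explicitly): with a fixed seed the draws are identical every run, so the initial weight directions come out the same every time.

```python
import numpy as np

# Without an explicit seed, each generator is seeded from fresh OS entropy,
# so weight initializations differ between runs.
a = np.random.default_rng().standard_normal(3)
b = np.random.default_rng().standard_normal(3)

# With a fixed seed (analogous to MXNet's fixed default seed), the draws
# are identical every time -- and so are the initial weight directions.
c = np.random.default_rng(42).standard_normal(3)
d = np.random.default_rng(42).standard_normal(3)

print(np.allclose(c, d))   # fixed seed: same initialization
print(np.allclose(a, b))   # fresh entropy: almost surely different
```

So if the visualization always shows the same directions, a fixed default seed in the framework would explain it.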
Still interesting, though: each training run uses a different margin m, and yet this constraint doesn't change the final directions.
The margin won't change the directions. I think the weight initialization is what determines them.