Snowkylin Lazarus

23 comments by Snowkylin Lazarus

Generally speaking, MANN does not converge as easily as other RNN models do, and a blind combination can result in severe training instability. I take a lot...

That is only part of the data set; the other part is [images_evaluation.zip](https://github.com/brendenlake/omniglot/blob/master/python/images_evaluation.zip), which contains 659 classes. You can add the two parts together to get the whole data set and divide...

For TensorFlow 2, the NTM/MANN models are implemented in `ntm/ntm_cell_v2.py` and `ntm/mann_cell_v2.py`. You can train the model by running `copy_task_v2.py`. Sorry, no saving/testing functionality is implemented in the v2...

Well, it would be weird to do inference on a single image with this model. The sequence is a "learning process" for meta-learning, which is required in...

Yes, you need to feed at least one image of each class into the sequence (you can call it the "train phase") so the model has a chance to know...

No need for 50 sequences. After training, the model should be able to classify the latter part of the images in a sequence based on the preceding part of the sequence....
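The episode setup described above (label every image with its class, but only after the model has had a chance to see that class earlier in the sequence) can be sketched as follows. Note this is an illustrative helper, not code from the repository; the one-step label offset follows the MANN paper's meta-learning setup.

```python
import random

def make_episode(class_images, shots_per_class=10):
    """Build one episode: a shuffled stream of (image, previous_label) pairs.

    class_images: dict mapping class id -> list of images (e.g. arrays).
    Following the MANN setup, the label fed as input is offset by one
    time step, so the model can only answer correctly for a class after
    it has seen at least one labelled example of it.
    """
    stream = [(img, cls)
              for cls, imgs in class_images.items()
              for img in imgs[:shots_per_class]]
    random.shuffle(stream)

    inputs, targets = [], []
    prev_label = None  # no label is available at the very first step
    for img, cls in stream:
        inputs.append((img, prev_label))  # image paired with *previous* label
        targets.append(cls)               # the class the model must predict
        prev_label = cls
    return inputs, targets
```

Because the labels are offset, the early part of the sequence acts as the "train phase" and the later occurrences of each class test the model's one-shot memory.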

It seems that you have already got the idea. In Table 1 of the MANN paper, "1st" - "10th" means "if an image of the same class is shown for the k-th...
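A minimal sketch of how the per-occurrence accuracies in that table can be computed from an episode's targets and predictions (the `kth_instance_accuracy` helper is hypothetical, not part of the repository):

```python
from collections import defaultdict

def kth_instance_accuracy(targets, predictions, max_k=10):
    """Accuracy grouped by how many times the target class has appeared
    so far in the episode (1st sighting, 2nd sighting, ...)."""
    seen = defaultdict(int)     # class -> number of occurrences so far
    correct = defaultdict(int)  # k -> number of correct predictions
    total = defaultdict(int)    # k -> number of predictions made
    for y, p in zip(targets, predictions):
        seen[y] += 1
        k = seen[y]
        if k <= max_k:
            total[k] += 1
            correct[k] += int(p == y)
    return {k: correct[k] / total[k] for k in sorted(total)}
```

On the first sighting of a class the model can only guess, so "1st" accuracy is near chance, while "2nd" and later sightings measure how well the external memory recalls the earlier example.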

If you just want to replicate the result on page 11 of my slides, you can use smaller parameters: `memory_size` and `memory_vector_dim` can be set to 20 and 8.
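A small sketch of applying those overrides to a training configuration. Only the `memory_size` and `memory_vector_dim` values come from the comment; the `shrink_config` helper and the shape of the config dict are illustrative assumptions, since the actual script may take these as command-line flags instead.

```python
def shrink_config(defaults):
    """Return a copy of the training config with the smaller memory
    settings suggested for replicating the slide result."""
    cfg = dict(defaults)  # leave the caller's defaults untouched
    cfg.update(memory_size=20, memory_vector_dim=8)
    return cfg
```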

Please create the directory `./save/copy_task/NTM` so that the checkpoint files have a place to be saved. Sorry that I did not mention it in the readme file.
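The directory can also be created programmatically before training starts; a minimal sketch using the path from the comment:

```python
import os

# Create the checkpoint directory expected by the training script so
# that saving does not fail with a missing-path error.
ckpt_dir = os.path.join(".", "save", "copy_task", "NTM")
os.makedirs(ckpt_dir, exist_ok=True)  # no-op if it already exists
```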

I am not very sure about this, but generally NTM is harder to get to converge than other DL models, which is a major drawback of the model.