MatchingNetworks
This repo provides PyTorch code that replicates the results of the Matching Networks for One Shot Learning paper on the Omniglot and miniImageNet datasets.
In the source code, the author calculates the cosine distance as follows:

```python
sum_support = torch.sum(torch.pow(support_image, 2), 1)
support_manitude = sum_support.clamp(eps, float("inf")).rsqrt()
dot_product = input_image.unsqueeze(1).bmm(support_image.unsqueeze(2)).squeeze()
cosine_similarity = dot_product * support_manitude *...
```
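For reference, the snippet above can be reconstructed as a runnable sketch. The tensor shapes here are assumptions (`input_image` and `support_image` each of shape `(batch, embedding_dim)`), and note that it divides only by the support magnitude, so it is not a true cosine similarity unless the input embeddings are unit-normalized:

```python
import torch

eps = 1e-10
B, D = 4, 64
input_image = torch.randn(B, D)    # assumed shape: (batch, embedding_dim)
support_image = torch.randn(B, D)  # assumed shape: (batch, embedding_dim)

# Squared L2 norm of each support embedding, clamped for numerical stability,
# then inverted via rsqrt to get 1 / ||support||.
sum_support = torch.sum(torch.pow(support_image, 2), 1)
support_magnitude = sum_support.clamp(eps, float("inf")).rsqrt()

# Batched dot product: (B,1,D) @ (B,D,1) -> (B,1,1) -> (B,)
dot_product = input_image.unsqueeze(1).bmm(support_image.unsqueeze(2)).squeeze()

# Normalized only by the support norm, not the input norm.
cosine_similarity = dot_product * support_magnitude
```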
Thanks for your great work! I have a question about how to run the code on the miniImagenet dataset. Does it mean that I should download the full dataset "ILSVRC2012_img_train" and...
First, thank you for your code. But I am puzzled by your "full context embeddings" implementation. In the paper, the processes for g' and f' are different, but you just...
First, many thanks for your PyTorch implementation of Matching Networks. I have followed your setup to run the miniImagenet example; the training accuracy reaches about 100%, but the val and test...
The training objective in the original paper is defined over theta. Should the loss value be that objective? Why is the loss in your program a cross-entropy loss?
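One way to see why cross-entropy appears here: the paper's objective maximizes the log-likelihood log P(y | x, S), where the predictive distribution is a softmax (attention) over similarities to the support set, so minimizing the negative log-likelihood is exactly cross-entropy on the raw similarities. A toy sketch (the numbers and shapes are illustrative, not from the repo):

```python
import torch
import torch.nn.functional as F

# Similarities between one query and 5 support examples, one per class.
similarities = torch.tensor([[2.0, 0.5, -1.0, 0.3, 1.2]])
target = torch.tensor([0])  # true class of the query

# Attention over support labels = softmax over similarities.
probs = F.softmax(similarities, dim=1)

# Negative log-likelihood of the true class under that distribution...
nll = -torch.log(probs[0, target[0]])

# ...equals cross-entropy applied directly to the similarities.
ce = F.cross_entropy(similarities, target)
```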
On [this line](https://github.com/gitabcworld/MatchingNetworks/blob/master/models/AttentionalClassify.py#L28) you're applying a softmax to the similarities. Then [later](https://github.com/gitabcworld/MatchingNetworks/blob/master/models/MatchingNetwork.py#L88) you apply `cross_entropy`, which is a log softmax + NLL loss. I think you probably want to remove...
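The issue above is easy to demonstrate in isolation: `F.cross_entropy` applies `log_softmax` internally, so it expects raw logits. Feeding it probabilities that were already passed through a softmax applies the softmax twice, which flattens the distribution and inflates the loss (toy values below are illustrative):

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([[3.0, 1.0, 0.2]])
target = torch.tensor([0])

# Correct: cross_entropy on raw logits (log_softmax + NLL internally).
loss_correct = F.cross_entropy(logits, target)

# Double softmax: softmax first, then cross_entropy's internal log_softmax.
loss_double = F.cross_entropy(F.softmax(logits, dim=1), target)
```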
Thank you for sharing your code. I have a question about the Omniglot dataset test, where the support set is encoded with a convolutional neural network, followed by the...
I have followed your setup to run the Omniglot example; test accuracy is about 99.6%. In the original paper it is about 98.9%, so I wonder whether I ran your code incorrectly or...