MT-net
Code accompanying the ICML-2018 paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace"
```python
gumbel_hard = tf.cast(tf.equal(gumbel_softmax, tf.reduce_max(gumbel_softmax, 1, keep_dims=True)), tf.float32)
mask = tf.stop_gradient(gumbel_hard - gumbel_softmax) + gumbel_softmax
```

For the above code, it seems to choose the maximal index marked with "1" and all...
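For context, these two lines are the standard straight-through estimator: the forward pass uses the hard one-hot mask, while gradients flow through the soft Gumbel-softmax sample because the `(hard - soft)` term is wrapped in `tf.stop_gradient`. A minimal NumPy sketch of the forward-pass behavior (NumPy has no gradients, so only the forward value can be checked here):

```python
import numpy as np

def straight_through_forward(gumbel_softmax):
    # One-hot vector at the row-wise maximum (the "hard" sample).
    gumbel_hard = (gumbel_softmax == gumbel_softmax.max(axis=1, keepdims=True)).astype(np.float32)
    # In TF, stop_gradient makes (hard - soft) a constant, so the forward
    # value equals gumbel_hard while gradients go through gumbel_softmax.
    mask = (gumbel_hard - gumbel_softmax) + gumbel_softmax
    return gumbel_hard, mask

soft = np.array([[0.1, 0.7, 0.2],
                 [0.6, 0.3, 0.1]], dtype=np.float32)
hard, mask = straight_through_forward(soft)
# In the forward pass, mask coincides with the one-hot hard sample.
```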
Hello, I tried to reproduce the T-net results on the 5-way 1-shot miniImagenet experiment. I set the parameters as

```
--metatrain_iterations=60000 --meta_batch_size=4 --update_batch_size=1 --num_updates=5 --logdir=logs/miniimagenet5way --update_lr=.01 --meta_lr=0.001 --resume=True --num_filters=32 --max_pool=True --use_T=True
```

But I...
Dear Authors, in the code:

```python
sampled_character_folders = random.sample(folders, self.num_classes)
random.shuffle(sampled_character_folders)
labels_and_images = get_images(sampled_character_folders, range(self.num_classes), nb_samples=self.num_samples_per_class, shuffle=False)
```

One of the random labels (class 1 to 5) is assigned to each folder of...
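The effect of that snippet is that labels are positional: after sampling and shuffling, class label `i` is given to whichever folder lands at position `i`, so the same character folder gets a different label in each task. A minimal sketch of that sampling logic using only the standard library (the `char_*` folder names are hypothetical placeholders):

```python
import random

def sample_task(folders, num_classes=5):
    # Pick num_classes character folders, then shuffle so the label
    # assigned to a given folder varies from task to task.
    sampled = random.sample(folders, num_classes)
    random.shuffle(sampled)
    # Label i goes to whichever folder landed at position i.
    return {folder: label for label, folder in enumerate(sampled)}

folders = ["char_%d" % i for i in range(20)]
mapping = sample_task(folders)
# mapping assigns each of the 5 sampled folders a distinct label 0..4
```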
It's annoying, but `print` needs parentheses in Python 3; you have a few calls without them. In MAML.py, you have `for k, v in weights.iteritems():`. You should update it to...
Hello, I'm trying to reproduce the result reported in your paper. However, by running the default script for Omniglot 20-way 1-shot,

```
python main.py --datasource=omniglot --metatrain_iterations=40000 --meta_batch_size=16 --update_batch_size=1 --num_classes=20 --num_updates=1...
```