learn2learn
A PyTorch Library for Meta-learning Research
Hi there, why is this example not working? Why does `cross_entropy` fail?

```python
import learn2learn as l2l
import pytorch_lightning as pl

tasksets = l2l.vision.benchmarks.get_tasksets('omniglot', root="D:/datasets/omniglot/")
features = l2l.vision.models.OmniglotCNN()
protonet = LightningPrototypicalNetworks(features)
```
...
Added a meta_lr_final parameter to indicate the final value of the meta learning rate. By setting meta_lr_final to a smaller value than meta_lr, the meta learning rate is effectively decayed during...
In the following Reptile example script, https://github.com/learnables/learn2learn/blob/master/examples/vision/reptile_miniimagenet.py:111, the meta-learning rate has a bug: it remains constant and does not decay.

```python
new_lr = frac_done * meta_lr + (1 - frac_done) *...
```
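A minimal sketch of the intended linear decay, assuming `meta_lr_final` is the target value the schedule should reach when `frac_done` hits 1 (the snippet in the issue is truncated, so the exact fix in the repository may differ):

```python
def decayed_meta_lr(frac_done, meta_lr, meta_lr_final):
    """Linearly interpolate from meta_lr (frac_done=0) to meta_lr_final (frac_done=1).

    The bug described above amounts to using meta_lr in both terms,
    which collapses the whole expression to a constant meta_lr.
    """
    return frac_done * meta_lr_final + (1 - frac_done) * meta_lr
```

With `meta_lr=1.0` and `meta_lr_final=0.1`, the schedule starts at 1.0, passes through 0.55 halfway, and ends at 0.1.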
### Description Learning Per-Layer Per-Step Learning Rates (LSLR, MAML++). These GBML transforms can be used to reproduce this MAML++ functionality for MAML, MetaSGD and other algorithms which can be reproduced...
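As a rough, framework-free sketch of the LSLR idea (a separately learned learning rate per layer and per inner-loop step; all names below are illustrative, not the learn2learn API):

```python
def lslr_adapt(params, grads_fn, lrs, num_steps):
    """Inner-loop adaptation with a separate LR per layer, per step.

    params:   dict of layer name -> parameter (scalars for simplicity)
    grads_fn: function mapping params -> dict of gradients, same keys
    lrs:      lrs[step][layer] is the learning rate for that layer at that step
    """
    for step in range(num_steps):
        grads = grads_fn(params)
        params = {name: value - lrs[step][name] * grads[name]
                  for name, value in params.items()}
    return params

# Toy usage: minimise value**2 per layer, so the gradient is 2 * value.
quad_grads = lambda p: {name: 2 * value for name, value in p.items()}
adapted = lslr_adapt(
    params={"conv": 1.0, "head": 1.0},
    grads_fn=quad_grads,
    lrs=[{"conv": 0.1, "head": 0.5}, {"conv": 0.1, "head": 0.5}],
    num_steps=2,
)
```

In MAML++ these per-layer, per-step rates are themselves meta-learned in the outer loop rather than fixed as here.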
I got this bizarre error:

```
File "/home/miranda9/miniconda3/envs/metalearning_gpu/lib/python3.9/site-packages/learn2learn/algorithms/maml.py", line 169, in adapt
    self.module = maml_update(self.module, self.lr, gradients)
UnboundLocalError: local variable 'gradients' referenced before assignment
```

seems weird since it's coming...
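For context, this class of `UnboundLocalError` usually means a variable was assigned on only one code path. A minimal, library-free reproduction of the failure shape (the names are illustrative, not the actual `maml.py` logic):

```python
def adapt(loss, allow_unused=False):
    # 'gradients' is bound only inside this branch; when the branch is
    # skipped, Python still treats the name as local, so the reference
    # below raises UnboundLocalError rather than NameError.
    if allow_unused:
        gradients = [0.0, 0.0, 0.0]  # stand-in for torch.autograd.grad(...)
    return gradients
```

If the real code follows this pattern, the error suggests the gradient-computing branch was silently skipped (e.g. an exception swallowed or a condition not met) before `maml_update` was called.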
Hi, I was wondering if checkpoints could be made available for some of the meta-learning datasets. I am in particular interested in mini-imagenet checkpoints trained with MAML. Though, honestly, the...
I want a few-shot learning dataset that works similarly to Meta-Dataset (as a first step toward that), i.e. sample a dataset first, then create an n-way,...
Hi, I wanted to see the actual image class names (ideally as strings), e.g. cat, dog, etc., for mini-ImageNet tasks. Is it possible to do that? How does one do it?
With PyTorch's [latest update](https://pytorch.org/blog/pytorch-1.11-released/), the introduction of the functorch library allows, via composable function transforms, for ["efficiently batching together tasks in the inner-loop of MAML"](https://pytorch.org/functorch/stable/notebooks/whirlwind_tour.html#vmap). There seems to be a...
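As a framework-free illustration of what that vectorization targets, here is the per-task inner loop written as an explicit Python loop over tasks; `vmap` would batch the body of `inner_adapt` across the task dimension instead of iterating (toy quadratic losses, no functorch or torch involved):

```python
def inner_adapt(theta, task_target, lr=0.1, steps=5):
    """One task's inner loop: gradient descent on (theta - task_target)**2."""
    for _ in range(steps):
        grad = 2 * (theta - task_target)
        theta = theta - lr * grad
    return theta

def meta_batch(theta, task_targets):
    # The explicit per-task loop that functorch's vmap would replace
    # with a single batched computation over all tasks at once.
    return [inner_adapt(theta, target) for target in task_targets]
```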
### Description The MAML toy example `examples/maml_toy.py` does not run because a sigma parameter is sampled from a Normal distribution. That sigma is used to parameterise another Normal distribution,...
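The failure can be reproduced with the standard library alone: a sigma drawn from a Normal may be negative, and a negative scale is rejected when it parameterises another Normal. One common fix, sketched here with `statistics.NormalDist` rather than the torch distributions the toy example actually uses, is to pass the raw sample through a positivity-enforcing transform such as `abs` or `exp`:

```python
from statistics import NormalDist

def safe_normal(mu, raw_sigma):
    """Build a Normal whose scale came from an unconstrained sample.

    NormalDist (like torch's Normal) rejects a negative sigma, so we
    enforce positivity with abs(); the small epsilon guards against an
    exactly-zero scale.
    """
    return NormalDist(mu, abs(raw_sigma) + 1e-8)

# A raw sample of -0.5 would make NormalDist(0.0, -0.5) raise
# StatisticsError; the wrapped version yields a valid distribution.
dist = safe_normal(0.0, -0.5)
```

Whether the example should clamp, `abs`, or exponentiate the sampled sigma is a design choice for the maintainers; this only shows the shape of the fix.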