This would allow users to limit the number of pairs/triplets returned by a miner. The ```triplets_per_anchor``` flag would be removed from TripletMarginLoss and MarginLoss. See #192 for related discussion.
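A minimal sketch of how such a limit might work, assuming the index formats the library's miners already return; `limit_miner_output` and `max_tuples` are hypothetical names:

```python
import torch

def limit_miner_output(indices_tuple, max_tuples):
    # Hypothetical helper: randomly subsample a miner's output down to
    # max_tuples. Triplet miners return (anchors, positives, negatives);
    # pair miners return (anchor1, positives, anchor2, negatives), where
    # the positive and negative pair lists can differ in length, so they
    # are subsampled independently.
    def subsample(tensors):
        n = tensors[0].size(0)
        if n <= max_tuples:
            return tensors
        keep = torch.randperm(n)[:max_tuples]
        return tuple(t[keep] for t in tensors)

    if len(indices_tuple) == 3:
        return subsample(indices_tuple)
    a1, p, a2, n = indices_tuple
    a1, p = subsample((a1, p))
    a2, n = subsample((a2, n))
    return a1, p, a2, n
```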
BaseMetricLossFunction can split it into a tuple if it's a tensor.
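One way the split could look, assuming the tensor is rows of triplet or pair indices (this shape convention is an assumption, not current library behavior):

```python
import torch

def to_indices_tuple(indices):
    # Hypothetical conversion: if indices arrive as a (N, 3) or (N, 4)
    # tensor rather than a tuple, unbind its columns into the usual
    # (anchors, positives, negatives) / (a1, p, a2, n) tuple format.
    if torch.is_tensor(indices):
        assert indices.dim() == 2 and indices.size(1) in (3, 4)
        return indices.unbind(dim=1)
    return indices
```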
Currently, there is an option to reduce the dimensionality of the embeddings using PCA. To allow more flexibility, this should be changed to a "dim reducer" input, which is expected...
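As a rough sketch, the new input could be any object exposing scikit-learn's `fit_transform` interface, with PCA as just one possible choice; `reduce_embeddings` and `dim_reducer` below are illustrative names, not the current API:

```python
import numpy as np
from sklearn.decomposition import PCA

def reduce_embeddings(embeddings, dim_reducer=None):
    # Hypothetical "dim reducer" input: any object with a fit_transform
    # method (e.g. a scikit-learn transformer), applied to the accumulated
    # embeddings before accuracy is computed.
    if dim_reducer is None:
        return embeddings
    return dim_reducer.fit_transform(embeddings)

embeddings = np.random.randn(1000, 512).astype(np.float32)
reduced = reduce_embeddings(embeddings, dim_reducer=PCA(n_components=64))
```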
[As mentioned here](https://github.com/KevinMusgrave/pytorch-metric-learning/issues/98#issuecomment-629239677), this is a possible improvement to make it easier for people to log the attributes in the "record_these" lists.
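A rough sketch of the kind of helper this could mean (the function name is hypothetical; `record_these` is the attribute the issue refers to):

```python
def get_recordable_attributes(obj):
    # Hypothetical helper: gather everything an object's "record_these"
    # list says should be logged, so trainers and loggers don't have to
    # look up attributes by hand.
    return {name: getattr(obj, name) for name in getattr(obj, "record_these", [])}
```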
It would be nice to have a trainer for [MoCo](https://arxiv.org/abs/1911.05722). It would be similar to [UnsupervisedEmbeddingsUsingAugmentations](https://github.com/KevinMusgrave/pytorch-metric-learning/blob/master/src/pytorch_metric_learning/trainers/unsupervised_embeddings_using_augmentations.py) but would need to use [CrossBatchMemory](https://github.com/KevinMusgrave/pytorch-metric-learning/blob/master/src/pytorch_metric_learning/losses/cross_batch_memory.py) for the queue. Also, since the queue has...
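A rough sketch of the MoCo-specific ingredients such a trainer would need (names and hyperparameter values below are illustrative, not settled API):

```python
import torch
from pytorch_metric_learning import losses

# CrossBatchMemory already provides a fixed-size embedding queue, so it
# can stand in for MoCo's dictionary of keys:
loss_fn = losses.CrossBatchMemory(
    loss=losses.NTXentLoss(temperature=0.07),
    embedding_size=128,
    memory_size=65536,  # MoCo's default queue length
)

# The other MoCo ingredient is a momentum-updated key encoder:
@torch.no_grad()
def momentum_update(query_encoder, key_encoder, m=0.999):
    for q, k in zip(query_encoder.parameters(), key_encoder.parameters()):
        k.data.mul_(m).add_(q.data, alpha=1 - m)
```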
Because that is the actual functionality of the flag.
Currently, the only practical way of using your own config files in addition to the default ones provided is to download the config files into some folder, set --root_config_folder to...
It should be possible to extend the Bayesian optimization bounds when it makes sense, specifically when the current best-performing hyperparameters are at the very edge of the current bounds.
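A minimal sketch of the bound-extension rule, assuming bounds are stored as (low, high) pairs per hyperparameter (all names here are illustrative):

```python
def maybe_expand_bounds(best_params, bounds, stretch=0.5, tol=1e-6):
    # Hypothetical: if the best value found for a hyperparameter sits at
    # (or numerically near) one of its bounds, push that bound outward by
    # `stretch` times the current range, so the next round of Bayesian
    # optimization can search past the old edge.
    new_bounds = {}
    for name, (low, high) in bounds.items():
        best, span = best_params[name], high - low
        if abs(best - low) <= tol:
            low -= stretch * span
        elif abs(best - high) <= tol:
            high += stretch * span
        new_bounds[name] = (low, high)
    return new_bounds

# Example: the best learning rate hit the upper bound, so that bound
# gets extended; the margin's bounds are left alone.
bounds = {"lr": (1e-5, 1e-2), "margin": (0.01, 1.0)}
best = {"lr": 1e-2, "margin": 0.4}
print(maybe_expand_bounds(best, bounds))
```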
Specifically when trying to resume training.