Kevin Musgrave

174 comments by Kevin Musgrave

This is a pretty cool idea, so I tried implementing it. If anyone's interested, you can check it out [here](https://kevinmusgrave.github.io/pytorch-metric-learning/losses/#crossbatchmemory).
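For anyone curious about the idea itself, here's a toy sketch of a cross-batch memory queue in plain Python. The names are hypothetical and this is not the library's actual implementation, just the core concept: keep embeddings from recent batches in a fixed-size buffer so a loss can compare the current batch against a larger pool.

```python
from collections import deque

class ToyCrossBatchMemory:
    """Toy illustration: keeps (embedding, label) pairs from recent
    batches so a loss could compare against a larger pool."""

    def __init__(self, memory_size):
        # oldest entries are evicted automatically once the queue is full
        self.memory = deque(maxlen=memory_size)

    def update(self, batch_embeddings, batch_labels):
        # store the current batch's (embedding, label) pairs
        self.memory.extend(zip(batch_embeddings, batch_labels))

    def pool(self):
        # everything accumulated across batches, oldest first
        return list(self.memory)

mem = ToyCrossBatchMemory(memory_size=4)
mem.update([[0.1], [0.2], [0.3]], [0, 1, 0])
mem.update([[0.4], [0.5]], [1, 1])
print(len(mem.pool()))  # 4: the oldest embedding was evicted
```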

Not sure if it's related, but I was getting ```sbatch: error: Batch job submission failed: Invalid wckey specification```, and setting `wckey=""` fixed it.
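In case it helps anyone hitting the same error, the workaround amounts to passing an empty wckey at submission time (script name is a placeholder):

```shell
# "Batch job submission failed: Invalid wckey specification"
# went away after submitting with an empty wckey:
sbatch --wckey="" my_job.sh
```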

> This is technically a bug with `useSortable`, but indeed as mentioned by @lueenavarro, you can solve this by using `DragOverlay` for the time being until this is resolved @clauderic...

Access to the final coordinates of `active` would be really useful!

I've created this Google Colab notebook that you can run: https://colab.research.google.com/drive/109TaIdhLGu7LLFvm5gZ401tYYsKJrnNP?usp=sharing It's kind of like the existing [example notebooks](https://github.com/KevinMusgrave/pytorch-adapt/tree/main/examples), but with fewer helper functions. That is, there's more boilerplate, but...

I think it's because you haven't initialized the datasets. Change this:

```python
src_dataset = MNIST
target_dataset = SVHN
```

to this:

```python
src_dataset = MNIST(root="mnist", download=True)
target_dataset = SVHN(root="svhn", download=True)
```

...

Good idea! I've updated the notebook to run on MNIST and SVHN: https://colab.research.google.com/drive/109TaIdhLGu7LLFvm5gZ401tYYsKJrnNP?usp=sharing You might want to start with a model that is already trained on the source dataset, instead...

You have to pass in a separate list of optimizers for the discriminator and generator steps. Every optimizer in each list gets called at each iteration. Here's how to use...
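The scheme can be sketched in plain Python. The optimizer class below is a dummy stand-in (not the actual pytorch-adapt API); it just shows that every optimizer in each list gets stepped once per iteration:

```python
class DummyOptimizer:
    """Stand-in for a torch optimizer; just counts step() calls."""
    def __init__(self, name):
        self.name = name
        self.steps = 0
    def step(self):
        self.steps += 1

# separate optimizer lists for the discriminator and generator steps
d_opts = [DummyOptimizer("discriminator")]
g_opts = [DummyOptimizer("generator"), DummyOptimizer("feature_extractor")]

for iteration in range(3):
    # discriminator step: every optimizer in d_opts is called
    for opt in d_opts:
        opt.step()
    # generator step: every optimizer in g_opts is called
    for opt in g_opts:
        opt.step()

print([opt.steps for opt in d_opts + g_opts])  # [3, 3, 3]
```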

This notebook might also help: https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/getting_started/PaperImplementationsAsHooks.ipynb It shows how a bunch of different hooks are initialized.