Tongzhou Wang

Results: 15 comments by Tongzhou Wang

In a sense this is because OmegaConf infers types from **values** rather than from type annotations. I wonder if there is a way to use type annotations instead...
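To make the distinction concrete, here is a minimal pure-Python sketch (not OmegaConf's actual implementation): a value-typed config locks each field to the type of its initial value, while an annotation-driven config reads types from dataclass hints. `ValueTypedConfig` and `AnnotatedConfig` are hypothetical names for illustration.

```python
from dataclasses import dataclass
from typing import get_type_hints

class ValueTypedConfig:
    """Infers each field's type from its initial value (value-based typing)."""
    def __init__(self, **values):
        object.__setattr__(self, "_types", {k: type(v) for k, v in values.items()})
        for k, v in values.items():
            object.__setattr__(self, k, v)

    def __setattr__(self, key, value):
        expected = self._types[key]
        if not isinstance(value, expected):
            raise TypeError(f"{key} must be {expected.__name__}")
        object.__setattr__(self, key, value)

@dataclass
class AnnotatedConfig:
    lr: float = 0.1  # here the annotation, not the value, carries the type

cfg = ValueTypedConfig(lr=0.1)
try:
    cfg.lr = "fast"          # rejected: type was inferred from the value 0.1
except TypeError:
    pass

hints = get_type_hints(AnnotatedConfig)  # annotation-based: type is explicit
```

The annotation-based approach would let a field typed `float` keep that type even if its default were `None` or absent, which is exactly what value-based inference cannot do.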

The entire testing code is here: https://github.com/SsnL/dataset-distillation/blob/master/main.py#L244-L356

Conceptually, a couple of strategies can be used: 1. distribute different steps to different GPUs; 2. use gradient checkpointing to recompute the graphs of early steps rather than storing them. Neither is directly...
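Strategy 2 can be sketched without any framework: checkpoint every k-th state of an unrolled computation and replay forward from the nearest checkpoint when an intermediate state is needed, trading compute for memory. The step function and names below are toy stand-ins, not code from this repo.

```python
def step(x):
    return 2 * x + 1  # stand-in for one unrolled training step

def run_naive(x0, n_steps):
    """Store every intermediate state (O(n) memory)."""
    states = [x0]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    return states

def run_checkpointed(x0, n_steps, k):
    """Store only every k-th state (O(n/k) memory)."""
    checkpoints = {0: x0}
    x = x0
    for t in range(1, n_steps + 1):
        x = step(x)
        if t % k == 0:
            checkpoints[t] = x
    return checkpoints

def recompute(checkpoints, t, k):
    """Recover state t by replaying forward from the nearest checkpoint."""
    base = (t // k) * k
    x = checkpoints[base]
    for _ in range(t - base):
        x = step(x)
    return x

full = run_naive(3, 8)
ckpts = run_checkpointed(3, 8, k=4)
assert all(recompute(ckpts, t, 4) == full[t] for t in range(9))
```

In PyTorch the same trade-off is what `torch.utils.checkpoint` provides for autograd graphs: segments are re-run in the backward pass instead of keeping their activations alive.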

The images may be optimized to jointly give some gradient. If you want to be order-/batch-agnostic, you can try modifying the distillation procedure to apply the images in randomly...
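Since the comment is truncated, one plausible reading is: shuffle the distilled images on every inner-loop pass so the learned images cannot rely on a fixed order or batching. A minimal sketch of that idea, where `apply_image` is a hypothetical stand-in for one gradient step on one distilled image:

```python
import random

def apply_image(state, image):
    return state + image  # placeholder update rule, commutative for the demo

def inner_loop(state, distilled_images, rng):
    order = list(range(len(distilled_images)))
    rng.shuffle(order)                      # fresh random order every pass
    for i in order:
        state = apply_image(state, distilled_images[i])
    return state

rng = random.Random(0)
final = inner_loop(0, [1, 2, 3, 4], rng)
assert final == 10  # the toy update is order-invariant, so any order agrees
```

With a real (non-commutative) update, training under random orders pushes the distilled images toward being useful in any order, which is the agnosticism the comment describes.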

You expressed it well, and I understood exactly what you meant. What I was saying is that if you want the images to be able to be applied in a certain...

@bhat-prashant If you are interested in the trained distilled dataset, you may obtain one using the code in this repo.

cc @ngimel

On Sat, Jul 7, 2018 at 06:56 Jerry Ma wrote:
> Repro
>
> - Apply #381 on this repo
> - cd to the ImageNet folder...

Also, we should update the doc on `no_grad` before the next release.