Rosario Scalise

Results: 32 comments by Rosario Scalise

If this isn't likely to be supported anytime soon, does anyone have an appropriate workaround? I'm also stuck on this.

Nice! Thanks @graingert! It looks like @adamchainz (OP) got tired of the problem and created this shortly after filing the issue 😆. For the record, I ended up adding...

Hi @ashleve, this is all very relevant to the efforts @omry and I have been making. We would love to have your input and help! The easiest way to chat about...

Sorry about that, my fellow Rosario! Hahaha

Before merging, I just want to point out that because of the way PyTorch structures its files (and `__init__.py` for certain modules), the optimizer registration can be done like this:...
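
The comment is truncated, but a minimal sketch of that kind of registration, assuming Hydra's `ConfigStore` (the `optimizer` group name and the `_target_`-dict nodes are illustrative assumptions, not the original code):

```python
# Sketch only: because torch.optim re-exports every optimizer class in
# its __init__.py, the full set can be discovered by iterating over the
# module and registering one config node per optimizer.
import inspect

import torch.optim
from hydra.core.config_store import ConfigStore

cs = ConfigStore.instance()

for name, obj in vars(torch.optim).items():
    if (
        inspect.isclass(obj)
        and issubclass(obj, torch.optim.Optimizer)
        and obj is not torch.optim.Optimizer
    ):
        # hydra.utils.instantiate can later build the optimizer
        # from this _target_ string.
        cs.store(
            group="optimizer",
            name=name.lower(),
            node={"_target_": f"torch.optim.{name}"},
        )
```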

@briankosw If you have a draft PR, I'm happy to check it out! =] Doesn't need to be finished.

> > High level feedback:
> >
> > We will probably have multiple examples for distributed data parallel, with different limitations and advantages.
> >
> > It's good to group them together...

Don't users have control by determining which 'module' they import? What was your alternative proposal?
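
For reference, the import-based control mentioned here presumably looks something like this sketch, where registration happens as a side effect of importing a module (the module path and group name are made-up):

```python
# file: myproject/configs/optimizers.py  (hypothetical module)
# Importing this module is what performs the registration, so users
# control what is registered by choosing their imports.
from hydra.core.config_store import ConfigStore

cs = ConfigStore.instance()
cs.store(
    group="optimizer",
    name="adam",
    node={"_target_": "torch.optim.Adam", "lr": 1e-3},
)
```

```python
# file: train.py
# The side-effect import below is the user's opt-in:
import myproject.configs.optimizers  # noqa: F401
```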

Ok, I think this is probably a better option. The automatic strategy would likely leave users confused about what _is_ and what _isn't_ currently registered. Better to be explicit and...
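
A sketch of the explicit alternative, using a hypothetical `register_optimizer_configs()` helper (the name and the registered nodes are assumptions): nothing enters the `ConfigStore` until the user calls it, so what is and isn't registered stays obvious.

```python
from hydra.core.config_store import ConfigStore


def register_optimizer_configs() -> None:
    # Hypothetical helper: registers a known, fixed set of optimizer
    # configs so the contents of the ConfigStore are never a surprise.
    cs = ConfigStore.instance()
    cs.store(
        group="optimizer",
        name="adam",
        node={"_target_": "torch.optim.Adam", "lr": 1e-3},
    )
    cs.store(
        group="optimizer",
        name="sgd",
        node={"_target_": "torch.optim.SGD", "lr": 1e-2},
    )


# Explicit opt-in at the call site, rather than an import side effect:
register_optimizer_configs()
```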

I think there are at least 3 meaningful examples to work on:

1.) Barebones, configuring just one thing (like the optimizer); a sketch of this one follows below
2.) Full MNIST configuration.
3.) Training framework like Lightning...
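
A rough sketch of what example 1.) could look like: a training script where only the optimizer is configurable. The config shape and inline defaults are assumptions, and `version_base=None` assumes Hydra ≥ 1.2:

```python
import hydra
import torch
from hydra.core.config_store import ConfigStore
from omegaconf import DictConfig

# Register a top-level config whose only knob is the optimizer.
cs = ConfigStore.instance()
cs.store(
    name="config",
    node={"optimizer": {"_target_": "torch.optim.SGD", "lr": 0.01}},
)


@hydra.main(config_path=None, config_name="config", version_base=None)
def main(cfg: DictConfig) -> None:
    model = torch.nn.Linear(10, 1)
    # Build the optimizer from config; the model parameters are
    # supplied at call time rather than stored in the config.
    optimizer = hydra.utils.instantiate(cfg.optimizer, params=model.parameters())
    print(optimizer)


if __name__ == "__main__":
    main()
```

Running `python train.py optimizer.lr=0.1` would then override just that one field from the command line.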