Justus Schock

Results: 90 comments of Justus Schock

@skshetry I see, that's a good point, thanks. Would scanning for files in advance slow things down? I guess it wouldn't (theoretically when the files are known, the hashing could...

+1000 for this :D I think if there are specifications where this doesn't work with the cluster env, we should rather think about implementing these missing cluster envs :)

ddp-spawn works virtually everywhere (except for interactive envs), right? Is there any case where subprocess ddp doesn't work? Because IIRC we chose spawn due to its compatibility with almost every...

@pruthvistony It cannot be merged before we are able to verify PL with AMD GPUs (see https://github.com/Lightning-AI/lightning/issues/13609#issuecomment-1204851165 for reference).

@carmocca during the first accelerator refactor we introduced the term `mixed` (before, it was always 16) because we were actually planning to deprecate the alias of 16 for AMP...

That's not yet decided. For a true half precision training the current way to go would be to use a custom `PrecisionPlugin` similar to what we do for [double precision](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/plugins/precision/double.py)
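A minimal sketch of what such a plugin could look like, by analogy with the linked double-precision plugin: cast the module's parameters (and inputs) to `torch.half` instead of `torch.double`. The class and method names here are illustrative, not Lightning's actual `PrecisionPlugin` API.

```python
import torch


class HalfPrecisionSketch:
    """Hypothetical sketch: cast module parameters and inputs to
    torch.half, analogous to how the double-precision plugin casts
    everything to torch.double. Not Lightning's real API."""

    def convert_module(self, module: torch.nn.Module) -> torch.nn.Module:
        # Cast all parameters and buffers to float16 in place.
        return module.half()

    def convert_input(self, tensor: torch.Tensor) -> torch.Tensor:
        # Cast incoming batch tensors to float16 as well.
        return tensor.half()


plugin = HalfPrecisionSketch()
model = plugin.convert_module(torch.nn.Linear(4, 2))
print(model.weight.dtype)  # torch.float16
```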

@awaelchli I like that approach! However, this implies that when having `precision="mixed"` it will always use native amp, right? What do we do if new precisions come up? Do we...

@carmocca do we really want to append "-true" to all non-mixed precision stuff? Intuitively, when I see precision="16", I would assume that this is always the case and there would...

I got a similar issue without tensorboardX and `torch.nn.Parameter`. I simply use a `torch.Tensor` (dtype=float64) and try to set some values in it. I even tried to use the `scatter_`...
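For reference, this is how setting values in a float64 tensor with `scatter_` is supposed to work (a small self-contained example, independent of the issue being reported): the index tensor must be of dtype `long`, and the source values must match the target dtype.

```python
import torch

# Target tensor with dtype=float64, as in the comment above.
t = torch.zeros(5, dtype=torch.float64)

# Indices to write to (must be a LongTensor) and the values to write.
idx = torch.tensor([1, 3])
src = torch.tensor([2.5, 7.0], dtype=torch.float64)

# In-place scatter along dim 0: t[idx[i]] = src[i]
t.scatter_(0, idx, src)
print(t)  # tensor([0.0000, 2.5000, 0.0000, 7.0000, 0.0000], dtype=torch.float64)
```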

Question: why do we need this? Can't we wrap all sources in a single dataset that returns a dict?
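The suggestion above can be sketched as follows (a hypothetical illustration; the class name and keyword-based constructor are my own, not from any existing library): one dataset wraps several equally sized sources and its `__getitem__` returns one dict per index.

```python
class DictDataset:
    """Hypothetical sketch: wrap several data sources of equal length
    into one dataset whose items are dicts keyed by source name."""

    def __init__(self, **sources):
        lengths = {len(s) for s in sources.values()}
        assert len(lengths) == 1, "all sources must have the same length"
        self.sources = sources

    def __len__(self):
        return len(next(iter(self.sources.values())))

    def __getitem__(self, idx):
        # Pull the idx-th element from every source and bundle them.
        return {name: source[idx] for name, source in self.sources.items()}


ds = DictDataset(image=[10, 11, 12], label=[0, 1, 0])
print(ds[1])  # {'image': 11, 'label': 1}
```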