pytorch_tabular
Unable to use DDP
I wasn't able to find any propagation of the PyTorch Lightning `Trainer` `accelerator` argument, so `ddp_spawn` is selected instead:

```
UserWarning: You requested multiple GPUs but did not specify a backend, e.g. Trainer(accelerator="dp"|"ddp"|"ddp2"). Setting accelerator="ddp_spawn" for you.
```
Only some of the `Trainer` parameters were propagated to PyTorch Lightning. I'm changing that in the next release. The latest code on GitHub already allows this via `trainer_kwargs` in `TrainerConfig`, which lets you specify practically any parameter of the PyTorch Lightning `Trainer`.
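A minimal sketch of what the reply describes, assuming the GitHub version of pytorch_tabular where `TrainerConfig` accepts `trainer_kwargs` (the exact keys are passed through to the PyTorch Lightning `Trainer`, so which key selects DDP depends on your Lightning version):

```python
from pytorch_tabular.config import TrainerConfig

# trainer_kwargs are forwarded to pytorch_lightning.Trainer, so DDP can be
# requested explicitly instead of the ddp_spawn default that the warning describes.
trainer_config = TrainerConfig(
    gpus=-1,  # use all available GPUs
    # On older Lightning this is accelerator="ddp"; on newer Lightning the
    # equivalent is strategy="ddp" with accelerator="gpu".
    trainer_kwargs={"accelerator": "ddp"},
)
```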