pytorch_block_sparse
Does this package allow multi-GPU training and distributed training?
If so, could you provide an example? Thanks.
Hi, I tried it with PyTorch Lightning using DDP and it works as expected.
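The comment above doesn't include code, so here is a minimal sketch of what that setup could look like. It uses `BlockSparseLinear` from this library as a drop-in replacement for `nn.Linear` (with an extra `density` argument controlling the fraction of non-zero weight blocks), wrapped in a `LightningModule` and trained with the DDP strategy. The model name, the toy dataset, and the hyperparameters are made up for illustration, and the exact `Trainer` flags (`accelerator`/`devices`/`strategy`) vary across Lightning versions:

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset
from pytorch_block_sparse import BlockSparseLinear


class SparseNet(pl.LightningModule):  # hypothetical example model
    def __init__(self):
        super().__init__()
        # Block-sparse layer: same (in_features, out_features) call as
        # nn.Linear, plus `density` = fraction of non-zero weight blocks.
        self.fc1 = BlockSparseLinear(1024, 256, density=0.1)
        self.fc2 = torch.nn.Linear(256, 10)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    # Toy random data, just to exercise the training loop.
    x = torch.randn(512, 1024)
    y = torch.randint(0, 10, (512,))
    loader = DataLoader(TensorDataset(x, y), batch_size=64)

    # DDP across 2 GPUs; Lightning spawns one process per device.
    trainer = pl.Trainer(accelerator="gpu", devices=2,
                         strategy="ddp", max_epochs=1)
    trainer.fit(SparseNet(), loader)
```

Since Lightning handles the `DistributedDataParallel` wrapping and the distributed sampler itself, the block-sparse layers don't need any special treatment here; their parameters are synchronized across ranks like any other module's.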