Parallel Dataloader failing when using num_workers > 0
Hi,
I am trying to increase the number of workers used by the dataloader but have been running into issues. I saw issues 625 and 626, which mention the warning message, but I cannot find an example vignette showing how to properly set up a parallel dataloader. Would it be possible to have a brief example for this?
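For reference, below is roughly what I have been trying, based on my reading of the `dataset()` and `dataloader()` documentation. The `toy_dataset` and its contents are just my own illustration, so I may well be misusing the API somewhere:

```r
library(torch)

# A toy dataset that builds each (x, y) pair on the fly in .getitem().
# My understanding is that with num_workers > 0 the dataset is copied to
# background R processes, so .getitem() must only rely on fields set in
# initialize() (or on objects passed via worker_globals / worker_packages).
toy_dataset <- dataset(
  name = "toy_dataset",
  initialize = function(n = 1000) {
    self$n <- n
  },
  .getitem = function(i) {
    list(
      x = torch_randn(10),
      y = torch_tensor(i %% 2, dtype = torch_long())
    )
  },
  .length = function() {
    self$n
  }
)

dl <- dataloader(
  toy_dataset(),
  batch_size = 32,
  shuffle = TRUE,
  num_workers = 2  # > 0 should load batches in parallel worker processes
)

coro::loop(for (batch in dl) {
  # batch$x should be a [32, 10] float tensor, batch$y a [32] long tensor
})
```

This works for me with `num_workers = 0`, but I am not sure whether this is the intended pattern once workers are enabled, so a short official example would be very helpful.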