Deep-Compression-PyTorch
problem in dataloader
When the num_workers argument of DataLoader is not zero, an error occurs. Some people say this is because multiprocessing on Windows uses spawn instead of fork. I am not familiar with computer science; could you please tell me what I should do to avoid this error while still getting the benefits of multiple workers?
You need to set num_workers = 0; then your code will work.
> You need to set num_workers = 0; then your code will work.

It works!!!! Thanks a lot. In pruning.py, line 53, set num_workers=0.
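For anyone who still wants the speed-up from multiple workers on Windows: spawn re-imports your script in every worker process, so the code that creates the DataLoader must be guarded by `if __name__ == "__main__":`. Below is a minimal sketch of that pattern; the dataset, batch size, and worker count are placeholders, not the actual values in pruning.py.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def main():
    # Placeholder dataset/transform; substitute whatever pruning.py uses.
    train_data = datasets.MNIST(
        "data", train=True, download=True,
        transform=transforms.ToTensor())

    # num_workers > 0 is fine on Windows as long as the DataLoader is
    # created under the __main__ guard below, because spawn re-imports
    # this file in each worker process.
    train_loader = DataLoader(train_data, batch_size=64,
                              shuffle=True, num_workers=4)

    for images, labels in train_loader:
        pass  # training step goes here

if __name__ == "__main__":
    # Without this guard, each spawned worker re-executes the
    # module-level code and raises the error described above.
    main()
```

With this guard in place you can leave num_workers at a value greater than zero instead of falling back to 0.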