Deep-Compression-PyTorch

problem in dataloader

Open xupengzheng opened this issue 4 years ago • 2 comments

When num_workers of the DataLoader is not zero, there is an error. Some people say it is because Windows starts worker processes with spawn instead of fork. I am not familiar with computer science; could you please tell me what I should do to avoid this error while still getting the benefits of multiple workers?

xupengzheng avatar Sep 26 '20 05:09 xupengzheng
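
A common way to keep multi-worker loading on Windows (where DataLoader workers are started with spawn rather than fork) is to put DataLoader creation and iteration under an `if __name__ == '__main__':` guard. Below is a minimal sketch of that pattern; the dataset, batch size, and function names are illustrative assumptions, not the repository's actual code.

```python
# Hypothetical sketch: guarding the entry point so spawned DataLoader
# workers can re-import this module safely on Windows.
import torch
from torch.utils.data import DataLoader, TensorDataset

def build_loader():
    # Placeholder dataset; the real project loads its own dataset instead.
    data = TensorDataset(torch.randn(1000, 10), torch.randint(0, 2, (1000,)))
    # num_workers > 0 can work on Windows as long as DataLoader creation
    # and iteration happen under the __main__ guard below.
    return DataLoader(data, batch_size=32, shuffle=True, num_workers=4)

def train():
    loader = build_loader()
    for inputs, targets in loader:
        pass  # the training step would go here

if __name__ == '__main__':
    # On Windows, worker processes are spawned and re-import this file;
    # without this guard the import re-executes the training code and crashes.
    train()
```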

You need to set num_workers = 0; then your code should work.

YvonneDL avatar Jun 24 '22 02:06 YvonneDL

> You need to set num_workers = 0; then your code should work.

It works! Thanks a lot. In pruning.py, line 53, set num_workers=0.

QuantumLuckin avatar May 04 '23 18:05 QuantumLuckin
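
For reference, the workaround the commenters describe amounts to constructing the loader with num_workers=0, so all batches are produced in the main process. A minimal sketch follows; the dataset and arguments are assumptions, not the exact code at pruning.py line 53.

```python
# Hypothetical sketch of the commenters' workaround: num_workers=0 keeps
# all data loading in the main process, which avoids the spawn issue on
# Windows at the cost of losing parallel loading.
import torch
from torch.utils.data import DataLoader, TensorDataset

data = TensorDataset(torch.randn(1000, 10), torch.randint(0, 2, (1000,)))
loader = DataLoader(data, batch_size=32, shuffle=True, num_workers=0)

for inputs, targets in loader:
    pass  # batches are produced synchronously in the main process
```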