
gpu_nums> 1

Open lianqing11 opened this issue 6 years ago • 0 comments

If you want to run on multiple GPUs, the parameters used when `self.shared` is forwarding should be held in an `nn.ModuleList` (like `self._w_h`, which is already a ModuleList). Otherwise the forward pass raises `RuntimeError: tensors are on different GPUs`, because parameters stored in a plain Python list are not registered with the module and therefore are not replicated to the other GPUs.
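A minimal sketch of the difference (the class and layer names here are illustrative, not the ones in ENAS-pytorch): layers kept in a plain Python list are invisible to `nn.Module`, so `nn.DataParallel` cannot replicate their weights to the other devices, while layers kept in an `nn.ModuleList` are registered and replicated normally.

```python
import torch
import torch.nn as nn

class SharedModel(nn.Module):
    """Hypothetical shared model illustrating the ModuleList fix."""

    def __init__(self, num_layers=4, hidden=8):
        super().__init__()
        # BROKEN on multi-GPU: a plain list hides the parameters from
        # nn.Module, so DataParallel leaves them on the original device:
        # self._w_h = [nn.Linear(hidden, hidden) for _ in range(num_layers)]

        # WORKS: nn.ModuleList registers every layer, so the parameters
        # appear in model.parameters() and get replicated to each GPU.
        self._w_h = nn.ModuleList(
            nn.Linear(hidden, hidden) for _ in range(num_layers)
        )

    def forward(self, x):
        for layer in self._w_h:
            x = torch.relu(layer(x))
        return x

model = SharedModel()
# All ModuleList parameters are visible here; this registration is what
# nn.DataParallel relies on when broadcasting weights across devices.
n_params = sum(p.numel() for p in model.parameters())
```

With the plain-list version, `model.parameters()` would be empty and a `nn.DataParallel(model)` forward on two GPUs would fail with the "tensors are on different GPUs" error, because each replica's forward would still reference the un-replicated weights on the first device.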

lianqing11 avatar Apr 12 '18 06:04 lianqing11