
Hi does your roi align layer support multi gpu?

Open mks0601 opened this issue 6 years ago • 6 comments

Hi, does your RoI Align layer support multiple GPUs? I get a strange result when I use your RoI Align layer. For example, with 2 GPUs the batch dimension of the output tensor is half the expected size.

Can you check?

mks0601 avatar Mar 05 '18 14:03 mks0601

This may be caused by a wrong box_ind. Your input boxes are split into two tensors when you use 2 GPUs, but the box_ind values in the second tensor are not adjusted automatically.
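A minimal illustration of the split (plain Python, no GPUs needed; the batch size and GPU count are made up for the example, and an even split is assumed):

```python
# With a batch of 4 images on 2 GPUs, DataParallel splits the batch
# into chunks of 2 along dim 0.
batch_size, num_gpus = 4, 2
chunk = batch_size // num_gpus

# box_ind holding *global* batch indices, one box per image.
box_ind = [0, 1, 2, 3]

# DataParallel slices box_ind the same way it slices the images:
per_gpu = [box_ind[i * chunk:(i + 1) * chunk] for i in range(num_gpus)]

# The second replica receives [2, 3], but its feature chunk only
# contains images 0..1, so those indices are out of range on that GPU.
assert per_gpu[1] == [2, 3]
```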

longcw avatar Mar 19 '18 02:03 longcw

Then how can I make box_ind correct?

mks0601 avatar Mar 19 '18 03:03 mks0601

You can make the values in box_ind correspond to each GPU's chunk when you generate the RoIs.
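One possible sketch of that remapping (the helper name is hypothetical, and it assumes the batch splits evenly across GPUs):

```python
def to_local_box_ind(global_ind, batch_size, num_gpus):
    """Map global batch indices to indices local to each GPU's chunk.

    Hypothetical helper: assumes DataParallel splits the batch evenly,
    so a global index i lands at position i % chunk within its replica.
    """
    chunk = batch_size // num_gpus
    return [i % chunk for i in global_ind]

# Global indices 2 and 3 become local indices 0 and 1 on the second GPU.
assert to_local_box_ind([0, 1, 2, 3], batch_size=4, num_gpus=2) == [0, 1, 0, 1]
```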

longcw avatar Mar 19 '18 03:03 longcw

@longcw it didn't work for me

boxes_index = boxes_index.cuda(input.data.get_device())

It still fails, saying the tensors are on different GPUs. It only works when using the GPU with index 0, so it seems that, for some reason I don't understand, the GPU is hardcoded somewhere. I've been digging through the code but unfortunately I couldn't find it...

paucarre avatar Mar 22 '18 16:03 paucarre

@paucarre Hi, did you make it work with multiple GPUs?

stomachacheGE avatar Jul 10 '19 02:07 stomachacheGE

@stomachacheGE @paucarre I found this command works for me: `roi = roi.to(input.device)`. The idea is that the features and the RoIs should be on the same device.
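A minimal sketch of that fix in context (the tensor shapes and names here are made up; on CPU both tensors already share a device, but the same call fixes the mismatch on multi-GPU setups):

```python
import torch

# Feature map and RoIs, possibly created on different devices.
features = torch.randn(2, 8, 16, 16)           # e.g. (N, C, H, W)
rois = torch.tensor([[0., 0., 0., 8., 8.]])    # (batch_idx, x1, y1, x2, y2)

# Move the RoIs to whichever device the features live on before
# calling the RoIAlign layer, so no cross-device access occurs.
rois = rois.to(features.device)
assert rois.device == features.device
```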

frostinassiky avatar Oct 30 '19 18:10 frostinassiky