RoIAlign-RoIPool-pytorch

Question about grad_out of roi_pooling

Open GriffinLiang opened this issue 7 years ago • 2 comments

Thanks for your code. I have a question: why do you add `grad_out = grad_out.contiguous() if not grad_out.is_contiguous() else grad_out` at roi_pool.py L32? Thanks.

GriffinLiang avatar May 09 '18 01:05 GriffinLiang

This is to avoid a non-contiguity problem (most operations in PyTorch return contiguous tensors, but not all). There is a contiguity check in the C++ code, so this line "removes" the non-contiguity error before the tensor reaches it. (`tensor.contiguous()` works like this: if the tensor is non-contiguous, it returns a contiguous copy; if it is already contiguous, it returns the original tensor unchanged. --- I did not test whether deleting this line would work as well.)
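The behavior of `.contiguous()` described above can be checked with a small sketch (a transpose is used here as one common way to produce a non-contiguous view):

```python
import torch

x = torch.arange(6).reshape(2, 3)
t = x.t()  # transpose is a view; strides are no longer row-major

assert not t.is_contiguous()

c = t.contiguous()  # non-contiguous input -> returns a contiguous copy
assert c.is_contiguous()

# already-contiguous input -> returns the original tensor (same storage)
assert x.contiguous().data_ptr() == x.data_ptr()
```

So the guard in roi_pool.py only pays the cost of a copy when `grad_out` actually arrives non-contiguous.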

AceCoooool avatar May 09 '18 02:05 AceCoooool

Thanks. I think it is essential: in my project, without this line the gradient may be wrong.
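A plausible explanation for the wrong gradient: a C++/CUDA kernel that indexes the raw storage with row-major arithmetic will read elements in memory order, which for a non-contiguous tensor differs from the logical element order. A minimal sketch of that mismatch (the `as_strided` view here just exposes the underlying storage order and is only for illustration):

```python
import torch

t = torch.arange(6).reshape(2, 3).t()  # non-contiguous; logical order 0,3,1,4,2,5

# What a layout-naive kernel would see: the raw storage order.
storage_order = torch.as_strided(t, (6,), (1,))   # 0,1,2,3,4,5
# What the kernel should see: the logical (row-major) order.
logical_order = t.contiguous().view(-1)           # 0,3,1,4,2,5

assert not torch.equal(storage_order, logical_order)
```

Calling `.contiguous()` first makes storage order and logical order coincide, so the kernel's indexing assumptions hold.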

GriffinLiang avatar May 09 '18 02:05 GriffinLiang