Question about grad_out of roi_pooling
Thanks for your code. I have a question: why do you add
`grad_out = grad_out.contiguous() if not grad_out.is_contiguous() else grad_out` at roi_pool.py L32? Thanks.
It is there to avoid non-contiguity problems (most operations in PyTorch produce contiguous tensors, but some, e.g. transposes, do not). There is a contiguity check in the C++ code, so this line exists to avoid tripping that error. `tensor.contiguous()` returns a contiguous copy if the tensor is non-contiguous, and returns the original tensor if it is already contiguous. (I did not test whether deleting this line would still work.)
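For illustration, here is a minimal sketch of the pattern being discussed: a custom `torch.autograd.Function` whose backward would hand `grad_out` to a C++/CUDA kernel. `RoIPoolLike` and the `* 2` op are hypothetical stand-ins, not the actual roi_pool.py code; only the contiguity guard mirrors the line in question.

```python
import torch

class RoIPoolLike(torch.autograd.Function):
    # Hypothetical stand-in for a custom op whose backward feeds a
    # C++/CUDA kernel; the kernel itself is replaced by `* 2` here.
    @staticmethod
    def forward(ctx, x):
        return x * 2

    @staticmethod
    def backward(ctx, grad_out):
        # Autograd can deliver a non-contiguous grad_out (e.g. when an
        # upstream transpose's backward re-transposes the gradient).
        # A kernel that walks grad_out.data_ptr() assumes contiguous
        # memory, so make a contiguous copy only when needed.
        grad_out = grad_out.contiguous() if not grad_out.is_contiguous() else grad_out
        # ... a real extension would pass grad_out to the C++/CUDA
        # backward kernel here ...
        return grad_out * 2

x = torch.randn(3, 4, requires_grad=True)
y = RoIPoolLike.apply(x).t()  # the transpose makes the incoming grad non-contiguous
y.sum().backward()
```

Note that since `contiguous()` already returns the tensor itself when it is contiguous, the explicit `is_contiguous()` check is redundant but harmless; it just makes the intent explicit.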
Thanks. I think it is essential: in my project, without this line the gradients may come out wrong.