Thatboy7
> batchsize is 8

Excuse me, have you ever solved this problem?

> batchsize is 8

I'm running into the same issue.
> @ZhangJT0127 One dimension of the attention map in the paper is H+W-1 because each pixel attends to itself twice, so one copy has to be subtracted. In the code, the `INF` function generates negative infinity and adds it to `energy_H`, so the softmax cancels the effect of computing the pixel itself twice.

Is there another method? `torch.diag` isn't supported by the ONNX exporter, and when I use `torch.eye` instead, my TensorRT doesn't support it either.
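One possible workaround, sketched below under the assumption that H and W are fixed at export time: build the diagonal mask with only `arange`, broadcasting comparison, and `expand` (all well supported by the ONNX exporter and TensorRT), and approximate -inf with a large negative constant. The function name `neg_inf_diag_mask` is hypothetical, a stand-in for the original `INF` helper.

```python
import torch

def neg_inf_diag_mask(B: int, H: int, W: int) -> torch.Tensor:
    """Export-friendly stand-in for the CCNet `INF` helper.

    Returns a (B*W, H, H) tensor with a large negative value on each
    diagonal and zeros elsewhere, built without `torch.diag` or
    `torch.eye` -- only `arange`, comparison, and broadcasting.
    """
    idx = torch.arange(H)
    # (H, H) boolean diagonal via broadcasting comparison
    diag = (idx.unsqueeze(0) == idx.unsqueeze(1)).float()
    # -1e9 behaves like -inf under softmax and avoids inf-related
    # issues in some TensorRT versions
    mask = diag * -1e9
    return mask.unsqueeze(0).expand(B * W, H, H)
```

Since the mask depends only on shapes, another option is to precompute it once in `__init__` (even with the original `torch.diag`) and store it via `register_buffer`; the exported graph then contains only a constant tensor, with no `diag`/`eye` op to support.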
> OK, thank you for your response. I also found some other layers, such as C3_n4 in the backbone, and cls_conv[2], obj_conv[2], and reg_preds[2] in the head, that may produce a seemingly correct result. In addition, I found that...