
Point Transformer (Engel et al.)

Open engelnico opened this issue 3 years ago • 5 comments

Hello,

thank you for your implementation. I'm the author of one of the point transformer methods, and I'm wondering why your results differ so much from the results reported in our paper.

One thing I immediately noticed is that in your implementation you initialize one SortNet and copy it M times (https://github.com/qq456cvb/Point-Transformers/blob/master/models/Nico/model.py#L48). This means all M copies share the same weights. However, we employ M separate SortNets.

Therefore,

self.sortnets = nn.ModuleList([SortNet(d_model, k=k)] * m)

should be replaced with

self.sortnets = nn.ModuleList([SortNet(d_model, k=k) for _ in range(m)])
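A minimal sketch of why this matters (using nn.Linear as a stand-in for SortNet, whose definition is not shown here): multiplying a single-element list repeats references to the same module object, so all entries share parameters, whereas a list comprehension constructs m independent modules.

```python
import torch.nn as nn

# Hypothetical stand-in for SortNet; any nn.Module shows the same effect.
d_model, m = 64, 4

shared = nn.ModuleList([nn.Linear(d_model, d_model)] * m)                  # one module, m references
separate = nn.ModuleList([nn.Linear(d_model, d_model) for _ in range(m)])  # m independent modules

# Every entry of `shared` points to the same weight tensor ...
print(shared[0].weight is shared[1].weight)        # True
# ... while each entry of `separate` has its own weights.
print(separate[0].weight is separate[1].weight)    # False

# parameters() deduplicates shared tensors, so the shared list also
# reports m times fewer trainable parameters.
print(sum(p.numel() for p in shared.parameters()))
print(sum(p.numel() for p in separate.parameters()))
```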

I will keep investigating what else could cause the performance difference. Also, we plan on publishing the original code ourselves in the near future.

Again, thank you for your code contribution. Best regards, Nico Engel

engelnico avatar Aug 27 '21 14:08 engelnico

Hi, thanks for your great work on transformers. Yes, they should be separate SortNets, and I have updated the code. I will retrain the whole network in the following days.

qq456cvb avatar Aug 28 '21 13:08 qq456cvb

@qq456cvb Any update on the performance after the change?

sidml avatar Oct 26 '21 04:10 sidml

@sidml Actually, after retraining with separate SortNets, I got slightly worse performance than with the single shared one. This is a bit strange, and I currently have no idea why. Maybe it requires a different learning rate schedule.

qq456cvb avatar Oct 29 '21 02:10 qq456cvb

This is indeed strange. When I have the time I will have a look as well and compare it with the official implementation.

engelnico avatar Oct 29 '21 06:10 engelnico

Could you please share the code for testing point cloud segmentation based on Point Transformer?

tfQi01 avatar Apr 17 '23 12:04 tfQi01