Point-Transformers
Point Transformer (Engel et al.)
Hello,
thank you for your implementation. I'm the author of one of the point transformer methods, and I'm wondering why your results differ so much from the results reported in our paper.
One thing I immediately noticed is that your implementation initializes one SortNet and copies it M times (https://github.com/qq456cvb/Point-Transformers/blob/master/models/Nico/model.py#L48), which also copies the weights. However, we employ M separate SortNets.
Therefore,
self.sortnets = nn.ModuleList([SortNet(d_model, k=k)] * m)
should be replaced with
self.sortnets = nn.ModuleList([SortNet(d_model, k=k) for _ in range(m)])
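For clarity, here is a minimal sketch of the difference (using a hypothetical stand-in for SortNet, reduced to a single linear layer): list multiplication repeats references to one module instance, so all entries share weights, while the comprehension builds m independent modules.

import torch.nn as nn

class SortNet(nn.Module):
    # Hypothetical stand-in for the real SortNet, just to illustrate module sharing.
    def __init__(self, d_model, k):
        super().__init__()
        self.fc = nn.Linear(d_model, k)

    def forward(self, x):
        return self.fc(x)

d_model, k, m = 64, 16, 4

# Bug: list multiplication repeats references to ONE module instance.
shared = nn.ModuleList([SortNet(d_model, k=k)] * m)
# Fix: the comprehension constructs m independent modules.
separate = nn.ModuleList([SortNet(d_model, k=k) for _ in range(m)])

print(shared[0] is shared[1])      # True  -> all entries share the same weights
print(separate[0] is separate[1])  # False -> each entry has its own weights

# Parameter counts reflect this (parameters() deduplicates shared modules):
print(sum(p.numel() for p in shared.parameters()))    # params of a single SortNet
print(sum(p.numel() for p in separate.parameters()))  # m times as many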
I will keep investigating what else could cause the performance difference. Also, we plan on publishing the original code ourselves in the near future.
Again, thank you for your code contribution. Best regards, Nico Engel
Hi, thanks for your great work on transformers. Yes, they should be separate SortNets, and I have updated the code. I will retrain the whole network in the coming days.
@qq456cvb Any update on the performance after the change?
@sidml Actually, after retraining with separate SortNets, I got slightly worse performance than with a single shared one. This is a bit strange, and I currently have no idea why. Maybe it requires a different learning rate schedule.
This is indeed strange. When I have the time, I will take a look as well and compare it with the official implementation.
Could you please share the code for testing point cloud segmentation based on the point transformer?