Charles Gaydon
With these modifications:
- ModelNet: 85%-88% accuracy after the ~50th epoch, with only two DilatedResidualBlocks instead of 4 (target in the [leaderboard](https://modelnet.cs.princeton.edu/): >=90%).
- ShapeNet: Loss: 0.6704 Train...
I realized that the flow direction of the MessagePassing class responsible for summarizing the local neighborhood was inverted. I got an immediate improvement on the segmentation task once it was fixed. ShapeNet...
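For context, here is a minimal plain-tensor sketch of what the inverted flow does (a toy stand-in for PyG's `MessagePassing` `flow` argument; the graph and feature values are made up for illustration). With `source_to_target`, each center point sums the features of its neighbors; inverting the flow instead scatters each center's own features onto its neighbors, which is the wrong summarization for local feature aggregation.

```python
import torch

# Toy graph: edge_index[0] = source (neighbor) nodes, edge_index[1] = target nodes.
# Local feature aggregation should summarize each point's neighborhood
# INTO that point, i.e. gather over sources and index by targets.
edge_index = torch.tensor([[1, 2, 0],   # sources j (neighbors)
                           [0, 0, 1]])  # targets i (center points)
x = torch.tensor([[1.0], [2.0], [3.0]])  # one feature per point

def aggregate(x, edge_index, flow="source_to_target"):
    j, i = (edge_index[0], edge_index[1]) if flow == "source_to_target" \
        else (edge_index[1], edge_index[0])
    out = torch.zeros_like(x)
    out.index_add_(0, i, x[j])  # sum neighbor features at each center point
    return out

correct = aggregate(x, edge_index, "source_to_target")   # node 0 gets x[1] + x[2]
inverted = aggregate(x, edge_index, "target_to_source")  # neighbors get x[0] instead
```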
https://github.com/pyg-team/pytorch_geometric/pull/5117/commits/1e544ca19300ac07d4b62713e132067e9cc82b25 : I also fixed the knn operation: the first LFA aggregated only over neighborhood points, which meant that the second LFA was working on a mixture of input features (for...
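To illustrate the kind of argument-order pitfall involved (a toy sketch with `torch.cdist`, not the actual PR code; `torch_cluster.knn(x, y, k)` follows the same "search in x, query with y" convention): after decimation, each kept point should gather its k nearest neighbors among *all* input points, not only among the kept subset.

```python
import torch

# Toy 1D point cloud and a decimated (subsampled) query set.
points = torch.tensor([[0.0], [1.0], [2.0], [10.0]])
queries = points[::2]                       # kept points: [0.] and [2.]

# For each query, search the FULL input set, not just the kept subset.
d = torch.cdist(queries, points)            # (num_queries, num_points)
knn_idx = d.topk(2, largest=False).indices  # 2 nearest input points per query
```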
I think one last thing that I have to change is linked to the last upsampling :) In the paper, this diagram seems to mean that the output of the...
And voilà :100: ! We reach performance comparable to PointNet++ on ShapeNet's Airplane class after a few epochs :) ShapeNet: Loss: 0.2036 **Train Acc: 92%, Test IoU: 82.7%** at epoch 30...
Thank you very much for sharing your implementation @saedrna. It was really helpful for giving the final touches of simplification and better readability. I cherry-picked the following elements: - SharedMLP...
@saedrna As a side comment, if you still want to use your implementation with a loop, I have the following comments: - Authors of RandLA-Net used `LeakyReLU(negative_slope=0.2)` instead of standard...
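Putting these remarks together, a hedged sketch of what such a SharedMLP block could look like (names and layer layout are illustrative, not the merged PR code): the same Linear/BatchNorm/LeakyReLU stack is applied to every point independently, with `negative_slope=0.2` as in the paper.

```python
import torch

# Illustrative sketch only: a "shared" MLP applies the same
# Linear -> BatchNorm -> LeakyReLU stack to every point, so the
# weights are shared across the point dimension.
class SharedMLP(torch.nn.Sequential):
    def __init__(self, in_channels, out_channels):
        super().__init__(
            torch.nn.Linear(in_channels, out_channels),
            torch.nn.BatchNorm1d(out_channels),
            # RandLA-Net authors use LeakyReLU(0.2) rather than plain ReLU.
            torch.nn.LeakyReLU(negative_slope=0.2),
        )

x = torch.randn(1024, 8)      # 1024 points with 8 features each
out = SharedMLP(8, 32)(x)     # same weights applied to every point
```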
> Looks good to me Nice. :) > ~ Just one more thing: one of the beautiful things for RandLA is that, although it's called the selection of random decimation...
> I found an issue, the `momentum` parameter in [PyTorch](https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm1d.html) and [Tensorflow](https://www.tensorflow.org/api_docs/python/tf/keras/layers/BatchNormalization) is different. So it should be 0.01 rather than 0.99. Indeed. Good catch, thank you, I'll update it...
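To make the convention difference concrete (a small self-contained check, not code from the PR): PyTorch's `momentum` is the weight given to the *new* batch statistic, `running = (1 - momentum) * running + momentum * batch`, whereas Keras's `momentum` weights the *running* statistic, so TF's 0.99 corresponds to PyTorch's 0.01.

```python
import torch

# PyTorch: running_mean <- (1 - momentum) * running_mean + momentum * batch_mean
# (Keras weights the running statistic instead, so TF 0.99 == PyTorch 0.01.)
bn = torch.nn.BatchNorm1d(1, momentum=0.01)
bn.train()
x = torch.full((4, 1), 10.0)  # batch mean is 10.0
bn(x)
# running_mean starts at 0, so after one step: 0.99 * 0 + 0.01 * 10 = 0.1
running = bn.running_mean.item()
```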
@rusty1s You asked to be kept posted, so here is a gentle bump for review :).