GAPointNet
MLP and CNNs
Hello!
Sorry to bother you. I have been working with your code for a couple of days now. I am very excited about it, and I found a couple of things that might be relevant.
Here is another one. In the paper you mention Multi-Layer Perceptrons as the basis for the transformer/attention layers. So far so good. In the implementation, however, there are a lot of 2D convolutions with kernel size (1, 1). Since a (1, 1) convolution mixes only the channels at each position, I think this effectively turns the convolutions into plain dense layers that share weights across all points. Example: https://github.com/FrankCAN/GAPointNet/blob/master/models/network.py#L64
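For what it's worth, the equivalence is easy to check numerically. Below is a minimal NumPy sketch (the shapes and variable names are just illustrative, following the usual `(batch, num_points, 1, channels)` layout for point clouds): a (1, 1) convolution is a matmul over the channel axis, which is exactly a shared dense layer applied to every point independently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy point-cloud feature map: batch of 2, 8 points, width 1, 16 channels,
# i.e. the (B, N, 1, C_in) layout commonly used in TF point-cloud code.
x = rng.standard_normal((2, 8, 1, 16))
w = rng.standard_normal((16, 32))   # shared weights, C_in=16 -> C_out=32
b = rng.standard_normal(32)

# A (1, 1) convolution: each output position depends only on the channels
# at that same position, so it reduces to a matmul over the channel axis.
conv_1x1 = np.einsum('bhwc,cd->bhwd', x, w) + b

# The same computation as a plain dense layer applied per point.
dense = (x.reshape(-1, 16) @ w + b).reshape(2, 8, 1, 32)

assert np.allclose(conv_1x1, dense)
```

So nothing is wrong with the code, the (1, 1) convolutions are just an efficient way to express a point-wise MLP on GPU.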
What do you think?