
questions regarding coord2radial

Open · pengzhangzhi opened this issue 2 years ago · 1 comment

Hi. I noticed that the way you calculate the radial differs from other similar works, e.g., EGNN. In your strategy, the radial is the matrix of inner products of the coordinate differences across channels:


import torch
import torch.nn.functional as F

def coord2radial(edge_index, coord):
    row, col = edge_index
    coord_diff = coord[row] - coord[col]  # [n_edge, n_channel, d]
    # pairwise inner products between channels
    radial = torch.bmm(coord_diff, coord_diff.transpose(-1, -2))  # [n_edge, n_channel, n_channel]
    # normalize along the edge ("batch") dimension
    radial = F.normalize(radial, dim=0)  # [n_edge, n_channel, n_channel]
    return radial, coord_diff
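As a side note, a minimal self-contained sketch (node count, channel count, and edges chosen arbitrarily for illustration) confirming that this inner-product radial is invariant under a global rotation of the coordinates:

```python
import torch
import torch.nn.functional as F

def coord2radial(edge_index, coord):
    row, col = edge_index
    coord_diff = coord[row] - coord[col]                          # [n_edge, n_channel, d]
    radial = torch.bmm(coord_diff, coord_diff.transpose(-1, -2))  # [n_edge, n_channel, n_channel]
    radial = F.normalize(radial, dim=0)
    return radial, coord_diff

torch.manual_seed(0)
coord = torch.randn(5, 3, 3)  # 5 nodes, 3 channels, 3-D coordinates
edge_index = (torch.tensor([0, 1, 2]), torch.tensor([1, 2, 3]))

# random orthogonal matrix via QR decomposition
Q, _ = torch.linalg.qr(torch.randn(3, 3))

radial, _ = coord2radial(edge_index, coord)
radial_rot, _ = coord2radial(edge_index, coord @ Q)  # rotate all coordinates

# the Gram matrix of coordinate differences is unchanged by the rotation
print(torch.allclose(radial, radial_rot, atol=1e-5))  # True
```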

EGNN's strategy: the radial is the squared distance between the two nodes.

    def coord2radial(self, edge_index, coord):
        row, col = edge_index
        coord_diff = coord[row] - coord[col]  # [n_edge, d]
        radial = torch.sum(coord_diff**2, 1).unsqueeze(1)  # squared distance, [n_edge, 1]

        if self.normalize:
            # scale coord_diff to unit length; detach to stop gradients through the norm
            norm = torch.sqrt(radial).detach() + self.epsilon
            coord_diff = coord_diff / norm

        return radial, coord_diff
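For comparison, a small self-contained sketch of the EGNN branch above (the epsilon value, coordinates, and edges are assumed here for illustration), checking that the radial equals the squared Euclidean distance and that the normalized coord_diff has roughly unit length:

```python
import torch

epsilon = 1e-8                      # assumed value for illustration
coord = torch.randn(5, 3)           # single-channel coordinates, [n_node, d]
row, col = torch.tensor([0, 1, 2]), torch.tensor([1, 2, 3])

coord_diff = coord[row] - coord[col]               # [n_edge, d]
radial = torch.sum(coord_diff**2, 1).unsqueeze(1)  # [n_edge, 1]

# radial is the squared Euclidean distance between the endpoints
dist2 = torch.norm(coord[row] - coord[col], dim=1) ** 2
print(torch.allclose(radial.squeeze(1), dist2, atol=1e-5))  # True

# the normalize branch rescales coord_diff to (nearly) unit vectors
norm = torch.sqrt(radial).detach() + epsilon
unit = coord_diff / norm
print(torch.norm(unit, dim=1))  # each entry is close to 1
```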

I think your radial captures the relative orientation of two multi-channel residues, whereas EGNN's radial represents only the distance. Is this interpretation reasonable? What do you think it represents, and what was your motivation for defining it this way instead of following EGNN? The way you normalize the radial is also quite interesting: you normalize it along the n_edge dimension (similar to a "batch" dimension). Why? Have you tried removing the normalization?

Best, Zhangzhi

pengzhangzhi · Feb 12 '23 13:02

Thanks for the insightful questions.

  1. Actually, we adopt the inner-product radial for its representational completeness. EGNN is designed for single-channel nodes, and it is unclear whether the norm-based radial can fit arbitrary orthogonality-equivariant functions on multi-channel nodes. In contrast, the representational completeness of the inner-product radial has already been explored in "Equivariant graph mechanics networks with constraints" (Section 3.2).
  2. As for the normalization, we found it necessary for numerical stability: if it is removed, training becomes unstable and easily produces NaN in the loss. That said, we did not explore the benefits of different forms of normalization in depth. I have just tried the normalization strategy from EGNN, and the performance on CDR design did not change significantly, so either is probably fine.
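For concreteness, a small sketch (tensor sizes chosen arbitrarily for illustration) of what normalizing along dim=0 does: each (channel_i, channel_j) entry of the radial becomes a unit vector across the edge ("batch") dimension, which bounds the magnitudes regardless of the raw coordinate scale:

```python
import torch
import torch.nn.functional as F

# large-magnitude inner products, [n_edge, n_channel, n_channel]
radial = torch.randn(6, 4, 4) * 100.0
normed = F.normalize(radial, dim=0)

# the L2 norm over the edge dimension is 1 for every channel pair,
# so entries stay bounded even when raw coordinates are large
print(torch.allclose(normed.norm(dim=0), torch.ones(4, 4), atol=1e-5))  # True
```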

kxz18 · Feb 16 '23 11:02