
What is the meaning of the 'latent codes'?

Estrellama opened this issue 2 years ago · 0 comments

1. What is the meaning of the 'latent codes'? The paper gives no detailed introduction to their origin or meaning.

I tried to find them in the code. In pcdet/models/backbones_3d/vfe/voxelset.py, it appears the latent codes are absorbed into an nn.Linear() layer. Why is this possible? The relevant code is shown below.

    # the learnable latent codes can be absorbed by the linear projection
    self.score = nn.Linear(dim, n_latents)

    def forward(self, inp, inverse, coords, bev_shape):
        x = self.pre_mlp(inp)

        # encoder
        attn = torch_scatter.scatter_softmax(self.score(x), inverse, dim=0)
        dot = (attn[:, :, None] * x.view(-1, 1, self.dim)).view(-1, self.dim*self.k)
        x_ = torch_scatter.scatter_sum(dot, inverse, dim=0)
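My understanding of the "absorbed" comment, sketched in numpy (the names `latents` and `W_k` are my own assumptions, not from the repo): if the attention logits between explicit latent codes L and projected point features were computed as (x @ W_k.T) @ L.T, the product L @ W_k is itself just one matrix of shape (n_latents, dim), so a single bias-free linear layer can learn it directly.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_latents, n_points = 8, 4, 10

x = rng.standard_normal((n_points, dim))          # per-point features
latents = rng.standard_normal((n_latents, dim))   # hypothetical explicit latent codes
W_k = rng.standard_normal((dim, dim))             # hypothetical key projection

# explicit formulation: project points to keys, then dot each key with each latent
scores_explicit = (x @ W_k.T) @ latents.T         # (n_points, n_latents)

# absorbed formulation: one linear map whose weight matrix is the product L @ W_k,
# i.e. what nn.Linear(dim, n_latents, bias=False) could learn as self.score
W_absorbed = latents @ W_k                        # (n_latents, dim)
scores_absorbed = x @ W_absorbed.T

assert np.allclose(scores_explicit, scores_absorbed)
```

If this reading is right, the rows of `self.score.weight` play the role of the (projected) latent codes.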

2. A follow-up question: since the latent codes are absorbed into the linear layer, how were they visualized on page 8 of the paper?
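My guess at what could be visualized, sketched with a numpy stand-in for torch_scatter.scatter_softmax (the `scatter_softmax` helper and all tensor shapes below are my assumptions): since `attn = scatter_softmax(self.score(x), inverse)` has shape (n_points, n_latents), column j gives latent j's attention weight over every point, which could be rendered by coloring each point by that weight.

```python
import numpy as np

def scatter_softmax(scores, inverse):
    """Softmax over points sharing the same voxel index -- an illustrative
    numpy stand-in for torch_scatter.scatter_softmax along dim=0."""
    out = np.empty_like(scores)
    for v in np.unique(inverse):
        mask = inverse == v
        e = np.exp(scores[mask] - scores[mask].max(axis=0, keepdims=True))
        out[mask] = e / e.sum(axis=0, keepdims=True)
    return out

rng = np.random.default_rng(0)
n_points, dim, n_latents = 6, 8, 4
x = rng.standard_normal((n_points, dim))
W = rng.standard_normal((n_latents, dim))   # rows = the "absorbed" latent codes
inverse = np.array([0, 0, 0, 1, 1, 1])      # voxel id of each point

attn = scatter_softmax(x @ W.T, inverse)    # (n_points, n_latents)

# attn[:, j] is latent j's per-point attention map; within each voxel the
# weights for a given latent sum to 1
assert np.allclose(attn[inverse == 0].sum(axis=0), np.ones(n_latents))
```

So even with the codes absorbed, the attention maps themselves are still computable from `self.score`, which may be what the figure on page 8 shows.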

This has been bothering me for a long time; I sincerely hope the author can offer some guidance.

Estrellama · Jul 23 '22