
The value of KL divergence is always 0

Open BJHYZJ opened this issue 2 years ago • 0 comments

After finishing training of ONet, the KL divergence reported in the evaluation results is always 0. I think this is because in the function infer_z, self.encoder_latent is always None. Is this expected behavior?
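For context, a minimal sketch of why the KL term vanishes in this situation: if no latent encoder is configured (as in the default deterministic ONet setup, where the latent dimension is 0), the approximate posterior is a Normal over an empty (zero-dimensional) latent, and summing the KL over that empty dimension yields exactly 0. The tensor shapes below are illustrative assumptions, not the repository's exact code.

```python
import torch
import torch.distributions as dist

batch_size = 4

# With encoder_latent = None, the latent code has dimension 0,
# so mean and log-std are empty (batch_size, 0) tensors.
mean_z = torch.empty(batch_size, 0)
logstd_z = torch.empty(batch_size, 0)

# Approximate posterior q(z|x) and prior p(z), both over an empty latent.
q_z = dist.Normal(mean_z, torch.exp(logstd_z))
p0_z = dist.Normal(torch.zeros(0), torch.ones(0))

# KL has shape (batch_size, 0); summing over the empty last
# dimension gives a zero per batch element.
kl = dist.kl_divergence(q_z, p0_z).sum(dim=-1)
print(kl)  # tensor([0., 0., 0., 0.])
```

So a KL of exactly 0 is consistent with the model simply having no latent variable, rather than with a training failure.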

BJHYZJ, Jan 14 '23 09:01