DIM
How to understand mutual information in deterministic deep networks?
Dear Professor R Devon Hjelm,
Is the mutual information between the inputs and the global representations constant, given that the network is deterministic? If so, how should we interpret the mutual information used in the loss function?
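For concreteness, here is my rough understanding of the term I am asking about, written as a minimal PyTorch sketch of a Jensen-Shannon-based lower bound on I(X; Z), which is what I believe the loss maximizes. The `critic` function and all variable names are my own placeholders, not taken from the repo, and this is only an illustration of my reading, not the actual implementation:

```python
import torch
import torch.nn.functional as F

def jsd_mi_lower_bound(critic, x, z):
    """Sketch of a JSD-based lower bound on I(X; Z) from one batch.

    critic : a network scoring (x, z) pairs; higher means "looks like a real pair"
    x      : inputs or encoder features, shape (B, ...)
    z      : global representations of the same batch, shape (B, d)
    """
    batch_size = x.size(0)
    # Positive pairs: each input with its own representation.
    pos_scores = critic(x, z)
    # Negative pairs: each input with a representation drawn from another
    # sample, approximating the product of marginals.
    z_shuffled = z[torch.randperm(batch_size)]
    neg_scores = critic(x, z_shuffled)
    # JSD bound: E_P[-softplus(-T(x, z))] - E_N[softplus(T(x, z'))]
    e_pos = -F.softplus(-pos_scores).mean()
    e_neg = F.softplus(neg_scores).mean()
    # Maximize this quantity, i.e. use its negative as the loss.
    return e_pos - e_neg
```

My confusion is that this estimator is computed between X and Z = f(X) with f deterministic, so I am not sure how to reconcile it with the usual notion of mutual information.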
Thanks