FastFlow
Why is the loss negative?

The loss is theoretically the negative log-likelihood. The likelihood is computed as (1/sqrt(2*pi))^n * exp(-z^T z / 2) * |det J|, so the negative log-likelihood is: n/2 * log(2*pi) + z^T z / 2 - log|det J|.
- The likelihood is a probability density, not a probability, so it can exceed 1. That alone allows the true negative log-likelihood to be negative.
- The constant term n/2 * log(2*pi) is dropped in the actual computation, so the reported loss is lower than the true negative log-likelihood. Refer to the loss in fastflow.py.
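The two points above can be illustrated numerically. Below is a minimal sketch (not the actual FastFlow code) comparing the loss as typically implemented, with the constant dropped, against the full negative log-likelihood; the function and variable names are illustrative, and `log_jac_det` stands for log|det J| produced by the flow:

```python
import numpy as np

def implemented_loss(z, log_jac_det):
    """Loss as typically implemented: the constant n/2 * log(2*pi)
    is omitted, so the value is lower than the true NLL and is
    often negative for a well-trained flow."""
    return 0.5 * np.sum(z ** 2) - log_jac_det

def true_nll(z, log_jac_det):
    """Full negative log-likelihood, including the constant term."""
    n = z.size
    return 0.5 * n * np.log(2 * np.pi) + 0.5 * np.sum(z ** 2) - log_jac_det

# Example: a well-trained flow maps normal inputs near z = 0
# and can have a positive log-Jacobian determinant.
z = np.zeros(4)      # latent close to the Gaussian mode
log_jac_det = 2.0    # assumed log|det J| for illustration

print(implemented_loss(z, log_jac_det))  # -2.0: negative
print(true_nll(z, log_jac_det))          # ~1.68: higher by n/2 * log(2*pi)
```

With a larger `log_jac_det` even the full NLL goes negative, since the density itself can exceed 1; the dropped constant only makes negative values more common.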