Getting rid of `posterior_nn`, `likelihood_nn`,...
Currently, our networks are built like this:
```python
density_estimator_builder = posterior_nn("maf")
inference = NPE(density_estimator=density_estimator_builder)
```
I think this is very suboptimal, because `posterior_nn` has very little flexibility. For example, it is unclear to me how one would use it to modify the neural networks used by MNPE (see here).
How do we fix this? We actually have another, much more flexible entry point:
```python
density_estimator = build_maf(theta, x)
```
However, this `density_estimator` cannot currently be passed to `NPE`; at the moment, it can only be used with the "Training interface".
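For context, using it via the training interface currently looks roughly like this (a sketch, assuming `theta` and `x` are training tensors; exact import paths and method signatures may differ across `sbi` versions):

```python
from torch.optim import Adam

# Import path is an assumption; build_maf has moved between sbi versions.
from sbi.neural_nets.net_builders import build_maf

# theta, x: assumed training tensors of shape (num_samples, dim_theta) / (num_samples, dim_x).
density_estimator = build_maf(theta, x)

optimizer = Adam(density_estimator.parameters(), lr=5e-4)
for _ in range(100):
    optimizer.zero_grad()
    # The density estimator exposes a per-sample loss; we minimize its mean.
    loss = density_estimator.loss(theta, x).mean()
    loss.backward()
    optimizer.step()
```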
I suggest that (maybe only for v1.0) we deprecate `posterior_nn` and instead require that the `density_estimator` passed to `NPE` is a `DensityEstimator`, not a `Callable` that builds a `DensityEstimator`. Alternatively, we could allow both (`Union[Callable, DensityEstimator]`).
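A minimal sketch of how the `Union` variant could be dispatched inside `NPE.__init__` (attribute names and the import path are just illustrative, not the actual internals):

```python
from typing import Callable, Union

# Import path is an assumption; DensityEstimator lives in different modules
# across sbi versions.
from sbi.neural_nets.density_estimators import DensityEstimator


class NPE:
    def __init__(self, density_estimator: Union[Callable, DensityEstimator] = "maf"):
        if isinstance(density_estimator, DensityEstimator):
            # A finished estimator: use it as-is and skip the builder machinery.
            self._neural_net = density_estimator
            self._build_neural_net = None
        else:
            # Current behavior: keep the builder (or string shorthand) and call
            # it later, once training data (theta, x) is available.
            self._build_neural_net = density_estimator
            self._neural_net = None
```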
I agree; the variable naming in particular is quite confusing here, and we should add support for passing a `density_estimator` directly as well.
However, one reason for passing the builder function is that we add the z-scoring transformations at run time, using the training data. We would give up control over that if we allowed density estimators to be passed directly.
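To make that trade-off concrete, here is roughly what the builder pattern buys us (a simplified sketch; the import path and `build_maf`'s exact signature may differ):

```python
from torch import Tensor, nn

# Import path is an assumption; it has moved between sbi versions.
from sbi.neural_nets.net_builders import build_maf


def make_builder(**maf_kwargs):
    """Simplified stand-in for what posterior_nn("maf") returns."""

    def builder(theta: Tensor, x: Tensor) -> nn.Module:
        # The builder is only called once training data exists, so build_maf
        # can compute the z-scoring transforms from theta and x and bake them
        # into the returned network. A pre-built DensityEstimator passed
        # directly to NPE would have to handle this itself.
        return build_maf(theta, x, **maf_kwargs)

    return builder
```

If we allowed direct passing, the user (or some helper we expose) would have to take care of that standardization before constructing the estimator.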