sbi
Add example / how-to guide for hyperparameter tuning
It would be nice to have a short guide on how to tune hyperparameters in sbi, e.g., the neural network architecture. One approach: hold out a test set of ~1000 (theta, x) pairs, use the negative log probability of theta under the posterior as a metric for comparing architectures, and run a grid search (e.g., via MLFlow or WandB) to find a suitable architecture.
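A minimal sketch of the grid-search loop described above. The `train_and_evaluate` function here is a hypothetical stand-in: in a real guide it would train an sbi density estimator with the given architecture and return the mean negative log probability of the true parameters under the posterior on the held-out (theta, x) pairs. The hyperparameter names and ranges are assumptions for illustration, not sbi defaults.

```python
from itertools import product

def train_and_evaluate(hidden_features, num_transforms):
    # Hypothetical stand-in for: train a posterior estimator with this
    # architecture, then compute the mean negative log prob of theta
    # under the posterior on a held-out set of ~1000 (theta, x) pairs.
    # Toy surrogate here so the sketch runs: pretend a mid-sized
    # architecture scores best (lower is better).
    return abs(hidden_features - 64) / 64 + abs(num_transforms - 5) / 5

# Candidate architectures (assumed ranges; adjust to your problem).
grid = {
    "hidden_features": [32, 64, 128],
    "num_transforms": [3, 5, 8],
}

results = []
for hidden_features, num_transforms in product(*grid.values()):
    nll = train_and_evaluate(hidden_features, num_transforms)
    results.append(((hidden_features, num_transforms), nll))

# Lower held-out negative log prob is better.
best_config, best_nll = min(results, key=lambda r: r[1])
print(best_config, best_nll)
```

Each (config, metric) pair could also be logged to an experiment tracker such as MLFlow or WandB instead of being collected in a plain list, which makes comparing runs across machines easier.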