Eight schools notebook failing in daily test
Running examples/toy/distribution-eight-schools.ipynb ... ERROR
{'J': 8, 'y': [28, 8, -3, 7, -1, 1, 18, 12], 'sigma': [15, 10, 16, 11, 9, 11, 10, 18]}
Running...
Using Hamiltonian Monte Carlo
Generating 4 chains.
Running in sequential mode.
Iter. Eval. Accept. Accept. Accept. Accept. Time m:s
0 4 0 0 0 0 0:00.0
1 84 0.333 0.333 0.333 0.333 0:00.0
2 164 0.5 0.5 0.5 0.5 0:00.0
3 244 0.6 0.6 0.6 0.6 0:00.0
100 8004 0.98 0.941 0.98 0.98 0:01.4
200 16004 0.990099 0.926 0.990099 0.990099 0:02.7
Traceback (most recent call last):
File "<string>", line 112, in <module>
File "/home/runner/work/pints/pints/pints/_mcmc/__init__.py", line 668, in run
fxs = evaluator.evaluate(xs)
File "/home/runner/work/pints/pints/pints/_evaluation.py", line 106, in evaluate
return self._evaluate(positions)
File "/home/runner/work/pints/pints/pints/_evaluation.py", line 398, in _evaluate
scores[k] = self._function(x, *self._args)
File "/home/runner/work/pints/pints/pints/toy/_eight_schools.py", line 121, in evaluateS1
log_prior = pints.GaussianLogPrior(mu, tau)
File "/home/runner/work/pints/pints/pints/_log_priors.py", line 462, in __init__
raise ValueError('sd parameter must be positive')
ValueError: sd parameter must be positive
Ha, I'm glad we recently changed pints.GaussianLogPrior(mu, tau) in #1256 -- before this was failing silently! I will take a look. Thanks.
This is actually an interesting one: more interesting than it might first appear! It also impinges on how we'd want hierarchical models to work.
In a non-hierarchical model, we'd probably want to raise an error if someone tried to create a Gaussian prior with a negative SD. I.e.:
pints.GaussianLogPrior(0, -1)
should raise an error (as it now does -- it didn't before update #1256).
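For reference, the post-#1256 behaviour looks like this (a minimal check, using only the constructor and error message shown in the traceback above):

import pints

# After #1256, a non-positive sd raises at construction time:
try:
    pints.GaussianLogPrior(0, -1)
except ValueError as e:
    print(e)  # 'sd parameter must be positive'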
In a hierarchical model, however, the parameters of that Gaussian prior change as the sampler runs. To code up the eight schools model -- a hierarchical model -- in PINTS, we therefore create a GaussianLogPrior dynamically each time the overall model is called. This means we do something like:
def __call__(self, x):
    # x[0] and x[1] are the hyperparameters (e.g. mean and sd) proposed by the sampler
    log_prior = pints.GaussianLogPrior(x[0], x[1])
    ...
In this case, we want log_prior to still instantiate when a negative sd is proposed, but to return -np.inf when it is evaluated, so that the sampler naturally rejects that proposal.
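To make that concrete, here is a hypothetical sketch (not the current pints.GaussianLogPrior) of a prior that accepts any sd at construction but evaluates to -np.inf when the sd is non-positive:

import numpy as np

class LenientGaussianLogPrior:
    # Hypothetical prior: accepts any sd at construction, but evaluates
    # to -inf when sd is non-positive, so a sampler simply rejects the proposal.
    def __init__(self, mean, sd):
        self._mean = mean
        self._sd = sd

    def __call__(self, x):
        # x is a sequence of length 1, following the PINTS LogPDF calling convention
        if self._sd <= 0:
            return -np.inf
        return (-0.5 * np.log(2 * np.pi) - np.log(self._sd)
                - 0.5 * ((x[0] - self._mean) / self._sd) ** 2)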
I think this will probably all be handled once we have proper hierarchical priors in PINTS, but I was wondering what you all thought was a reasonable resolution in the meantime. The options I see are: allow instantiation with a negative sd (as we had before) but return -np.inf if it's then called; do the error handling in the eight schools model and keep GaussianLogPrior as it now is; or raise a warning when instantiated with a negative sd but still allow it...
@MichaelClerx @martinjrobins @DavAug Thoughts?
@ben18785 I don't think we should allow a value that doesn't make sense; it would be convenient in this case but would silently break other cases.
So we need to work it into the 8 schools thing. You'd want to think about this anyway for performance. No sense in running the "schools" that don't accept any proposal.
Going to add a return of minus infinity for the log-density if sd < 0, in the eight schools model only.
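Roughly the kind of guard meant here, written as a hypothetical helper (a sketch only, not the actual code in pints/toy/_eight_schools.py):

import numpy as np
import pints

def hierarchical_log_prior(mu, tau, thetas):
    # Reject non-positive tau instead of letting pints.GaussianLogPrior raise,
    # so the sampler simply discards the proposal.
    if tau <= 0:
        return -np.inf
    prior = pints.GaussianLogPrior(mu, tau)
    # Sum the log-prior contributions of the per-school parameters
    return sum(prior([theta]) for theta in thetas)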
Fixed in #1347