Lukas
consider
```
signal = neural_network(weights)
model = pyhf.Model({"signal": signal, ...})
logpdf = model.logpdf(pars, data)
```
this will allow you to compute the gradient `dlogpdf / dweights`
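A minimal sketch of the idea with a toy stand-in (the `neural_network` and Poisson `logpdf` below are illustrative, not the pyhf API): autodiff flows through the whole chain, so the gradient of the likelihood with respect to the network weights comes for free.

```python
import jax
import jax.numpy as jnp

def neural_network(weights):
    # pretend network output: a 3-bin signal template (toy, not a real network)
    return jnp.exp(weights)

def logpdf(weights, data):
    signal = neural_network(weights)
    background = jnp.array([50.0, 52.0, 48.0])  # made-up fixed background
    expected = signal + background
    # Poisson log-likelihood up to a constant (no log(n!) term)
    return jnp.sum(data * jnp.log(expected) - expected)

data = jnp.array([55.0, 54.0, 50.0])
weights = jnp.array([1.0, 2.0, 0.5])

# dlogpdf / dweights, differentiated through signal construction and logpdf
grad_logpdf = jax.grad(logpdf)
print(grad_logpdf(weights, data))
```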
@kratsg where we save effort is that jsonnet is a Turing-complete templating language we do not need to build ourselves. i.e. the complexity of the templates right now is low, but...
there is also an arrangement like this that has the CLs and the qµ distributions in normal orientation and the Gaussian µ̂ distribution rotated
Added a second notebook on errors (the calc in #764) and we see compatible errors across the various ways of calculating them. maybe a question to @alexander-held: is...
fwiw, this reproduces the errors in the 1Lbb likelihood (though the MINUIT specifics seem not to matter)
the last entry (brown) is using MINUIT but supplying the AD gradient. This uses `strategy` 1, so perhaps it's stopping earlier and therefore the underlying Hessian is different.
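For context, supplying an explicit gradient changes the optimizer's iteration path, and with it the stopping point and the Hessian estimate it reports. A minimal sketch using `scipy.optimize.minimize` as a stand-in for MINUIT (the quadratic "NLL" below is illustrative; the analytic gradient plays the role of the AD one):

```python
import numpy as np
from scipy.optimize import minimize

# toy "NLL" with minimum at x = [1, 2]
def nll(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

def grad_nll(x):
    # analytic gradient, standing in for an AD-provided one
    return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)])

x0 = np.zeros(2)
# finite-difference gradient vs. supplied gradient
res_numeric = minimize(nll, x0, method="BFGS")
res_grad = minimize(nll, x0, method="BFGS", jac=grad_nll)
print(res_numeric.x, res_grad.x)
# both land near [1, 2], but the iterates and the inverse-Hessian
# estimate (res.hess_inv) the optimizer accumulates can differ
```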
just making @kratsg @matthewfeickert aware... not for review yet... the nice thing is we get easy visualization of the computational graph
interestingly some of the basic `tensorlib` tests fail
```
>       assert np.std(values) < 1e-6
E       assert 0.2802208533607308 < 1e-06
E        +  where 0.2802208533607308 = ([-16.948276294321396, -16.948274612426758, -16.94827651977539, -16.948274612426758, -17.648827643136507])
E       ...
```
```
E ---------------------------------------------------------------------------
E Exception encountered at "In [2]":
E   File "", line 9
E     **data, batch_size = 101**2
E     ^
E SyntaxError: invalid syntax
```
do we want to differentiate between `experimental` and `contrib`?