Christopher Suter
As implemented, this routine assumes the input is upper triangular if `upper=True`. The implementation relies on there being zeros in the lower, off-diagonal portion of the...
Maybe this?

```python
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

a = np.arange(16).reshape([4, 4]) + 1
print(a)
print(tf.linalg.band_part(a, -1, 0))
print(tfp.math.fill_triangular_inverse(tf.linalg.band_part(a, -1, 0)))
print(tf.linalg.band_part(a, 0,...
```
I agree the documentation can improve. The key insight, I think, is that `fill_triangular` takes a vector of length `n * (n + 1) / 2` to a square matrix...
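To make the shape relationship concrete, here is a numpy-only sketch. Note this illustrates only the length-`n * (n + 1) / 2` vector to `n x n` matrix correspondence; `tfp.math.fill_triangular` itself uses a different, concatenation-friendly fill order, so the entries below will not land in the same positions as TFP's output:

```python
import numpy as np

def fill_lower_triangular(vec):
    # A vector of length m = n * (n + 1) / 2 determines n; fill the
    # lower triangle of an n x n zero matrix with its entries.
    # (Hypothetical helper for illustration, not TFP's implementation.)
    m = len(vec)
    n = int((np.sqrt(8 * m + 1) - 1) / 2)  # invert m = n * (n + 1) / 2
    out = np.zeros((n, n), dtype=np.asarray(vec).dtype)
    out[np.tril_indices(n)] = vec  # row-by-row fill of the lower triangle
    return out

x = np.arange(1, 7)  # length 6 == 3 * 4 / 2, so n == 3
print(fill_lower_triangular(x))
```

The inverse direction works the same way: reading the `n * (n + 1) / 2` nonzero entries back out of one triangle recovers the vector, which is why zeros in the other triangle matter.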
I think this is working as intended, in that the gradient w.r.t. discrete parameters/values is not defined. Is there a use case you're working with wherein one or both of...
Ah, looks like the issue is just that betainc doesn't have gradients defined w.r.t. its first two arguments: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/math_grad.py#L922 I'm sure TF would welcome this upstream contribution, if you were...
I think we'd normally do this to work around the Python function argument:

```python
fn = tf.function(lambda: tfp.optimizer.lbfgs_minimize(quadratic_wrapper(minimum), ...))
...
fn()  #
```
Can you plumb `minimum` through from the outer lambda?
Sorry I haven't had more time to engage on this. I don't think retracing is necessarily the issue though. If you put a print statement inside your optimize functions you...
Right -- stable (i.e., non-nightly) TFP releases are tied to a particular stable TF release and generally won't work with a subsequent TF release. TFP nightlies are tested against...
Thanks, Matthew. I recognize and appreciate that you're a long-time user and contributor. TF and TFP are maintained by quite separate groups; there is not very much explicit coordination, although...