Pavel Sountsov

20 comments by Pavel Sountsov

How did you get the original error? RunningVariance is expected to have event_ndims=0, because it treats each element as an independent (co)variance estimation problem.
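To illustrate what "each element is an independent (co)variance estimation problem" means in practice, here is a minimal plain-Python sketch (not TFP's implementation; the class name and shapes here are made up for illustration) of a Welford-style running variance that keeps a separate estimate per array element:

```python
import numpy as np

# Hypothetical sketch of the event_ndims=0 semantics: every element of the
# input array gets its own independent running-variance state, so the state
# tensors have the same shape as a single sample.
class ElementwiseRunningVariance:
    """Welford-style running variance, computed independently per element."""

    def __init__(self, shape):
        self.count = 0
        self.mean = np.zeros(shape)
        self.m2 = np.zeros(shape)  # running sum of squared deviations, per element

    def update(self, sample):
        self.count += 1
        delta = sample - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (sample - self.mean)

    def variance(self):
        return self.m2 / self.count  # population variance, per element


rv = ElementwiseRunningVariance(shape=(2,))
for x in [np.array([1.0, 10.0]), np.array([3.0, 30.0])]:
    rv.update(x)
# The two elements never mix: each one has its own mean and variance estimate.
```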

As a workaround, I think you can add this method to your class:

```python
def __new__(cls, *args, **kwargs):
    return tfd.Distribution.__new__(cls)
```

There's no easy way to do it at the moment. The following enhancements are necessary:

- The two `bijector_fn` functions ([1](https://github.com/tensorflow/probability/blob/68f626fde3ee8b58c1aa0113c3dd86ced894e05e/tensorflow_probability/python/bijectors/glow.py#L563) and [2](https://github.com/tensorflow/probability/blob/68f626fde3ee8b58c1aa0113c3dd86ced894e05e/tensorflow_probability/python/bijectors/glow.py#L659)) need to accept and pass along conditioning...

It's probably feasible, but not on our roadmap. You'd essentially replicate how the numpy substrate is implemented: add a new directory in `tfp/python/internal/backends` and then add a bunch of bazel...

This implementation is not correct. The mean of a categorical is defined as `sum(i * prob(x=i) for i in range(num_categories))`. You should be able to modify the implementation of the...
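The formula above can be checked with a small sketch (plain NumPy; this is an illustration of the definition, not the TFP implementation being corrected):

```python
import numpy as np

def categorical_mean(probs):
    """Mean of a categorical over {0, ..., K-1}: sum(i * prob(x=i))."""
    probs = np.asarray(probs)
    # Weight each category index by its probability and sum.
    return np.sum(np.arange(probs.shape[-1]) * probs, axis=-1)

# A uniform 3-category distribution has mean (0 + 1 + 2) / 3 = 1.0.
```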

Which other bijectors did you try? I tried to reproduce the original issue, and at first glance, the issue was that GLOW just has a gigantic graph with its...

Could you comment on how your version differs from TFP's? (On a computational level, it certainly might arrive at the same result in a different way.) In particular, we...

I think you'd need to re-run the Colab and then download and add the altered Colab file to your pull request.

You'd have to write your own wrapper distribution. [`tfd.Masked`](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/python/distributions/masked.py) is similar in principle, but only supports 0-1 weights. It could be an inspiration, although really all you need to do...
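To make the idea concrete, here is a minimal plain-Python sketch (not the TFP API; the class and function names are hypothetical) of what a weighted wrapper's core would do: scale the base distribution's `log_prob` by a per-sample weight, with `tfd.Masked`'s 0-1 masking as a special case:

```python
import math

# Hypothetical sketch: a wrapper that multiplies the base distribution's
# log-density by a weight. With weight in {0, 1} this reduces to the kind of
# masking tfd.Masked provides; fractional weights down-weight a sample.
class WeightedLogProb:
    def __init__(self, base_log_prob, weight):
        self.base_log_prob = base_log_prob  # callable: x -> log p(x)
        self.weight = weight

    def log_prob(self, x):
        return self.weight * self.base_log_prob(x)


# Standard-normal log density as an example base distribution.
def std_normal_log_prob(x):
    return -0.5 * (x * x + math.log(2.0 * math.pi))

# weight=0.0 "masks out" the sample (its log_prob contribution becomes 0),
# while weight=0.5 halves its contribution to a joint log-density.
masked = WeightedLogProb(std_normal_log_prob, weight=0.0)
half = WeightedLogProb(std_normal_log_prob, weight=0.5)
```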

This is going to be tricky to fix since the algorithm uses dynamic shape within a while loop, something that JAX jit doesn't like. We could either fix the algorithm...