Fritz Obermeyer

Results 210 comments of Fritz Obermeyer

See also [this forum discussion](https://forum.pyro.ai/t/how-to-render-a-semi-supervised-model/4751/8)

I believe there has been no progress, but @eb8680 was looking into it around 6 months ago.

@lezcano thanks for doing this. Yes it seems fine to replace `log(1+x)` with `log1p(x)` in distributions.
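
For example, a minimal sketch of the numerical difference in float32: for small `x`, `1 + x` rounds to exactly `1.0`, so `log(1 + x)` loses the value entirely while `log1p(x)` keeps it.

```python
import torch

# For small x, 1 + x rounds to exactly 1.0 in float32, so log(1 + x)
# returns 0.0, while log1p(x) evaluates the same quantity accurately.
x = torch.tensor(1e-10, dtype=torch.float32)
print(torch.log(1 + x))   # tensor(0.)
print(torch.log1p(x))     # ~1e-10, close to the true value
```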

@min-jean-cho > why does `torch.Tensor.exponential_` check `lambda >= 0`? My guess is that somebody thought `lambd` meant the _mean_, which is a common parameterization of exponential distributions, but happens...
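
For example, a quick sanity check of the rate interpretation (a minimal sketch):

```python
import torch

# `lambd` is the *rate* of the exponential distribution, not its mean:
# samples drawn with rate lambd have mean approximately 1 / lambd.
torch.manual_seed(0)
samples = torch.empty(100_000).exponential_(lambd=2.0)
print(samples.mean())  # ~0.5, i.e. 1 / lambd
```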

@min-jean-cho I do think it makes sense to allow `rate = 0` for `torch.distributions.Poisson` and `torch.poisson`. For Poisson random variables the rate parameter is the mean, and the distribution converges...
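
For example, a minimal sketch of that limiting behavior, using small but positive rates since `rate = 0` is currently rejected:

```python
import torch
from torch.distributions import Poisson

# As rate -> 0 the Poisson distribution puts essentially all of its mass
# on zero: P(X = 0) = exp(-rate) -> 1, i.e. it converges to a point mass.
for rate in [1.0, 0.1, 0.01]:
    p_zero = Poisson(torch.tensor(rate)).log_prob(torch.tensor(0.0)).exp()
    print(rate, float(p_zero))  # 0.3679, 0.9048, 0.9900
```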

Another approach might be @martinjankowiak's [GaussianScaleMixture](https://docs.pyro.ai/en/stable/distributions.html#pyro.distributions.GaussianScaleMixture) distribution in Pyro, but it may be more limited than TFP's implicit reparameterization.
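
For reference, the textbook form of a Gaussian scale mixture is a zero-mean mixture whose components share a base covariance up to per-component scale factors (Pyro's exact parameterization may differ):

$$
p(x) \;=\; \sum_{k} w_k \, \mathcal{N}\!\left(x \,\middle|\, 0,\; \sigma_k^2 \Sigma\right),
\qquad \sum_k w_k = 1,\quad w_k \ge 0 .
$$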

It would be great if this were controllable, or if EXISTS_ON_INIT were a separate event type users like @frank-lenormand could filter out. OTOH it's helpful in some applications to get...

Is it feasible to at least raise an error when summing over one axis? I just spent a long time debugging, and eventually found that `my_2d_array.sum(0)` was being silently interpreted...

Re: combinatorial explosion (@pitrou) I often use the `keepdims` argument, as in `my_array.sum(axis=n, keepdims=True)`. I believe the three important cases are `.sum()`, `.sum(axis=n)`, `.sum(axis=n, keepdims=True)`, and that there is virtually...
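
For concreteness, a minimal NumPy-flavored sketch of how those three cases differ only in the shape of the result:

```python
import numpy as np

# The three reduction signatures differ only in the shape they produce.
a = np.ones((3, 4))
print(a.sum().shape)                       # ()     -- reduce over everything
print(a.sum(axis=1).shape)                 # (3,)   -- reduced axis is dropped
print(a.sum(axis=1, keepdims=True).shape)  # (3, 1) -- reduced axis kept as length 1
```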

Hi @yuanqing-wang, I'd recommend taking a look at our [CONTRIBUTING.md](https://github.com/pyro-ppl/pyro/blob/dev/CONTRIBUTING.md), and in particular:

- `make format` and `make lint` to work past linting errors
- add torchcontrib to the EXTRAS_REQUIRE...
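
For the second point, a hypothetical sketch of the kind of change meant (generic setuptools usage, not Pyro's actual `setup.py`):

```python
# Hypothetical sketch of adding an optional dependency to an extras list
# passed to setuptools; the real setup.py may be organized differently.
from setuptools import setup

EXTRAS_REQUIRE = [
    "torchcontrib",  # new optional dependency
]

setup(
    name="example-package",
    version="0.0.1",
    extras_require={"extras": EXTRAS_REQUIRE},
)
```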