Dustin Tran
thanks for catching that. my guess is that this fix in 1.1.6 (https://github.com/blei-lab/edward/pull/373) is what made it work.
KLqp uses a form of black box variational inference, leveraging the score function gradient. This is known to work poorly with Gamma distributions. The wild variations in the...
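for concreteness, here's a minimal sketch of that estimator (not from the thread; plain NumPy/SciPy, with an illustrative target `f(z) = z**2` standing in for the ELBO integrand) showing how noisy a score-function gradient through a Gamma can be:

```python
import numpy as np
from scipy.special import digamma

def score_function_grad(shape, rate, f, n_samples=100):
    """Monte Carlo estimate of d/d(shape) E_q[f(z)] where q = Gamma(shape, rate)."""
    z = np.random.gamma(shape, 1.0 / rate, size=n_samples)
    # d/d(shape) log Gamma(z; shape, rate) = log(rate) - digamma(shape) + log(z)
    score = np.log(rate) - digamma(shape) + np.log(z)
    return np.mean(f(z) * score)

# Repeat the estimate many times; the spread across runs is the gradient
# noise that KLqp has to optimize through.
estimates = [score_function_grad(2.0, 1.0, lambda z: z ** 2) for _ in range(50)]
print(np.mean(estimates), np.std(estimates))
```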
only through research unfortunately. this is what motivated the blei lab's work on [rejection sampling variational inference](https://arxiv.org/abs/1610.05683). it tries to better handle black box gradients for Dirichlets and Gammas. in...
that's an excellent suggestion. we should definitely add those warnings (keeping this issue open for that reason). more generally, it's important to describe warnings on research limitations/soft constraints for each...
Thanks for asking, Hiroki. Can you raise an issue on Github and provide a snippet detailing some difficulties you're having? In general, I would recommend a fully factorized log-Normal distribution—at...
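a minimal sketch of that recommendation (assuming Edward 1.x on a TF 1.x release where `Gamma` takes `concentration`/`rate` and `tf.contrib.distributions.bijectors.Exp` is available; model and sizes are illustrative):

```python
import tensorflow as tf
import edward as ed
from edward.models import Gamma, Normal, TransformedDistribution

D = 10  # number of latent dimensions (illustrative)

# Model: a vector of Gamma latents.
z = Gamma(concentration=tf.ones(D), rate=tf.ones(D))

# Variational family: fully factorized log-Normal, i.e. exp of a factorized Normal.
qz = TransformedDistribution(
    distribution=Normal(loc=tf.Variable(tf.zeros(D)),
                        scale=tf.nn.softplus(tf.Variable(tf.zeros(D)))),
    bijector=tf.contrib.distributions.bijectors.Exp())

# Prior-only demo; in a real model you would also pass data={x: x_train}.
inference = ed.KLqp({z: qz})
inference.run(n_iter=1000)
```

because the log-Normal is reparameterizable, KLqp can use reparameterization gradients here instead of the noisier score-function gradients you'd get with a Gamma approximation.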
I'm not sure how Keras' noise works. Does `noised_x = inp + tf.random_normal(tf.shape(inp), stddev=self.x_noise_std)` work?
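e.g., a minimal standalone sketch (TF 1.x; names are illustrative, mirroring `self.x_noise_std` from the snippet above):

```python
import tensorflow as tf

def add_gaussian_noise(inp, x_noise_std=0.1):
    """Return inp plus element-wise N(0, x_noise_std^2) noise."""
    return inp + tf.random_normal(tf.shape(inp), stddev=x_noise_std)

# Usage: corrupt a batch of flattened inputs before feeding them to the model.
inp = tf.placeholder(tf.float32, [None, 784])
noised_x = add_gaussian_noise(inp, x_noise_std=0.1)
```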
Thanks for looking into this. It does seem to require a non-trivial solution.
Yeah, sorry for the delays. These past weeks have been especially busy due to NIPS. The TF best practice is to not rely on internal functions; unfortunately we were forced...
I agree. I'm looking into it now and that seems to be the problem with duplicating code. It looks like the `tf.contrib.graph_editor` ran into the same problems due to the...
@AustinRochford: yeah that would be fantastic if you could.