semi-supervised-pytorch
Implementations of various VAE-based semi-supervised and generative models in PyTorch
> \begin{align} \log p(x) &= \log \sum_{y} \int q(z,y|x) \frac{p(x, y, z)}{q(z,y|x)} \, dz \geq \sum_{y} q(y|x) \int q(z|x, y) \log \frac{p(x, y, z)}{q(z,y|x)} \, dz \\ &= \sum_{y}...
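For context, the quoted derivation appears to be the standard lower bound on $\log p(x)$ for unlabelled data in a semi-supervised VAE, with $q(z,y|x) = q(y|x)\,q(z|x,y)$. A sketch of how the truncated line usually continues (written from the standard form of this bound, not from the hidden part of the quote):

$$
\begin{align}
\log p(x) &= \log \sum_{y} \int q(z,y|x) \frac{p(x, y, z)}{q(z,y|x)} \, dz \\
&\geq \sum_{y} q(y|x) \int q(z|x, y) \log \frac{p(x, y, z)}{q(z,y|x)} \, dz \\
&= \sum_{y} q(y|x) \left( \int q(z|x, y) \log \frac{p(x, y, z)}{q(z|x,y)} \, dz - \log q(y|x) \right) \\
&= \sum_{y} q(y|x) \bigl( -\mathcal{L}(x, y) \bigr) + \mathcal{H}\bigl(q(y|x)\bigr),
\end{align}
$$

where $-\mathcal{L}(x,y)$ is the labelled-data ELBO and $\mathcal{H}$ is the entropy of the classifier distribution $q(y|x)$.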
I don't quite understand the line `sample = torch.mean(sample, dim=1)` ([link](https://github.com/wohlert/semi-supervised-pytorch/blob/7cde4959468d271552febdcfed5e1cfae9857613/semi-supervised/layers/stochastic.py#L97)) in the `forward` of the `GumbelSoftmax(Stochastic)` class; it takes the mean over `dim=1`, which is `self.n_distributions`, leaving `self.out_features`, which...
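To make the shape question concrete, here is a minimal NumPy sketch of the operation in question. The layer name and the `(batch, n_distributions, out_features)` layout are assumptions based on the linked code, not a copy of it:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax_sample(logits, tau=1.0):
    """Draw a relaxed one-hot sample from each categorical distribution.
    logits: (batch, n_distributions, out_features)"""
    # Gumbel(0, 1) noise via -log(-log(U)), U ~ Uniform(0, 1)
    u = rng.uniform(1e-9, 1.0, size=logits.shape)
    g = -np.log(-np.log(u))
    y = (logits + g) / tau
    # softmax over the category axis (out_features)
    e = np.exp(y - y.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

batch, n_distributions, out_features = 4, 8, 10
logits = rng.normal(size=(batch, n_distributions, out_features))
sample = gumbel_softmax_sample(logits)

# The line being asked about: average the n_distributions relaxed
# one-hot samples into a single (batch, out_features) vector.
sample = sample.mean(axis=1)
print(sample.shape)  # (4, 10)
```

Each relaxed sample sums to 1 over the category axis, so the mean over `dim=1` is still a valid probability vector of size `out_features`; what is lost is the identity of the individual distributions.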
Bugs: [one](https://github.com/wohlert/semi-supervised-pytorch/blob/master/semi-supervised/layers/stochastic.py#L44), [two](https://github.com/wohlert/semi-supervised-pytorch/blob/master/semi-supervised/layers/stochastic.py#L64). As it stands, this constrains `log_var` to be positive only, but for the standard case of a standard normal distribution almost all q-distribution `log_var`s are less than zero....
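A small sketch of the issue being reported, assuming (based on the links) that the encoder head passes its log-variance output through a positivity constraint such as softplus:

```python
import numpy as np

def softplus(x):
    # log(1 + exp(x)) > 0 for all x
    return np.log1p(np.exp(x))

raw = np.linspace(-5.0, 5.0, 11)

# If log_var = softplus(raw), then log_var > 0 always,
# i.e. the variance exp(log_var) is forced to be > 1.
constrained = softplus(raw)
print(constrained.min() > 0)  # True: log_var < 0 is unreachable

# But a q(z|x) near a standard normal needs variance <= 1,
# i.e. log_var <= 0 -- so leave the head unconstrained:
log_var = raw                    # identity; may be negative
std = np.exp(0.5 * log_var)      # reparameterisation keeps std > 0 anyway
```

The usual fix is exactly this: output `log_var` with no activation, since `exp(0.5 * log_var)` is already guaranteed positive.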
On [this line](https://github.com/wohlert/semi-supervised-pytorch/blob/master/semi-supervised/models/vae.py#L135) the code reads `qz = log_gaussian(z, mu, log_var) - sum(log_det_z)`. The first term and `log_det_z` have a batch-size dimension, but `sum(log_det_z)` is a scalar. Actually the...
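A NumPy sketch of the shape mismatch, with hypothetical per-flow log-determinant values (the real ones come from the flow layers):

```python
import numpy as np

batch = 5
# Per-flow log-determinants should stay per sample: each of shape (batch,)
log_det_z = [np.full(batch, 0.1), np.full(batch, -0.2)]  # two flow steps

# Summing the list elementwise keeps the batch dimension...
per_sample = np.sum(log_det_z, axis=0)   # shape (batch,), value -0.1 each

# ...whereas collapsing everything to one scalar sums over the batch
# as well, then broadcasts that batch-wide total to every sample:
scalar = float(np.sum(log_det_z))        # -0.5, a single number

log_qz = np.zeros(batch)
correct = log_qz - per_sample            # 0.1 per sample
wrong = log_qz - scalar                  # 0.5 per sample, batch-size inflated
```

Broadcasting hides the bug: both results have shape `(batch,)`, but the scalar version charges every sample the total log-determinant of the whole batch.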