gae-pytorch
Graph Auto-Encoder in PyTorch
Bumps [scipy](https://github.com/scipy/scipy) from 1.0.0 to 1.10.0. Release notes Sourced from scipy's releases. SciPy 1.10.0 Release Notes SciPy 1.10.0 is the culmination of 6 months of hard work. It contains many...
May I know why a self-loop is added to adj_train to get adj_label? `adj_label = adj_train + sp.eye(adj_train.shape[0])`
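A minimal sketch of what that line does: adding the identity puts 1s on the diagonal, so the reconstruction target treats every node as connected to itself (the inner-product decoder scores `z_i · z_i` for the diagonal, which should be "positive"). The toy adjacency below is illustrative, not from the repo.

```python
import numpy as np
import scipy.sparse as sp

# Toy 3-node training adjacency without self-loops (illustrative).
adj_train = sp.csr_matrix(np.array([[0, 1, 0],
                                    [1, 0, 1],
                                    [0, 1, 0]], dtype=np.float32))

# Adding the sparse identity puts 1s on the diagonal: the label matrix
# now says every node reconstructs a link to itself.
adj_label = adj_train + sp.eye(adj_train.shape[0])
print(adj_label.toarray())
```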
Bumps [numpy](https://github.com/numpy/numpy) from 1.14.0 to 1.22.0. Release notes Sourced from numpy's releases. v1.22.0 NumPy 1.22.0 Release Notes NumPy 1.22.0 is a big release featuring the work of 153 contributors spread...
In a VAE, sampling is `z_mean + torch.exp(0.5 * z_log_var) * epsilon`, but in this VGAE it is `z_mean + torch.exp(z_log_var) * epsilon`. Does that make any difference?
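A short sketch of the two parameterisations. The difference is only a naming convention: the standard VAE treats `logvar` as log(sigma^2), so sigma = exp(0.5 * logvar); this repo effectively treats it as log(sigma), so sigma = exp(logvar). Both are valid as long as the KL term uses the matching convention (the repo's KL uses `2 * logvar` and `logvar.exp().pow(2)`, which is consistent with log(sigma)). Values below are illustrative.

```python
import torch

torch.manual_seed(0)
mu = torch.zeros(4, 2)
logvar = torch.full((4, 2), -1.0)
eps = torch.randn_like(mu)

# Standard VAE: logvar = log(sigma^2), so sigma = exp(0.5 * logvar).
z_std = mu + torch.exp(0.5 * logvar) * eps

# This repo's VGAE: logvar treated as log(sigma), so sigma = exp(logvar).
z_repo = mu + torch.exp(logvar) * eps

# Same logvar value implies a different sigma under each convention.
print(float(torch.exp(0.5 * logvar)[0, 0]),  # ~0.607
      float(torch.exp(logvar)[0, 0]))        # ~0.368
```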
I found that in train.py `mu.data.numpy()` is used to get hidden_emb, but it is None when using GCNModelAE as the model; hidden_emb should be obtained from `model.encode()` instead.
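A hypothetical helper along the lines the issue suggests: fetch the embedding from `encode()` so the same code path works whether the encoder returns a single tensor (AE) or a `(mu, logvar)` pair (VAE). The `DummyAE` stub and all names here are illustrative, not the repo's API.

```python
import torch
import torch.nn as nn

def get_hidden_emb(model, features, adj_norm):
    """Return node embeddings as a NumPy array from either model variant."""
    model.eval()
    with torch.no_grad():
        out = model.encode(features, adj_norm)
        # A VAE-style encode may return (mu, logvar); an AE returns z directly.
        z = out[0] if isinstance(out, tuple) else out
    return z.cpu().numpy()

# Tiny stand-in encoder just to show the helper handles the AE case.
class DummyAE(nn.Module):
    def encode(self, x, adj):
        return adj @ x

model = DummyAE()
x = torch.randn(5, 3)
adj = torch.eye(5)
emb = get_hidden_emb(model, x, adj)
print(emb.shape)  # (5, 3)
```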
How can I implement a mini-batch version of GVAE?
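One common approach, sketched under stated assumptions: sample a node subset each step and train on the induced subgraph, so the decoder only reconstructs a `batch_size x batch_size` block. Everything below (function names, sizes, the random graph) is illustrative; the repo does not ship mini-batching.

```python
import numpy as np
import scipy.sparse as sp
import torch

def sample_subgraph(adj, features, batch_size, rng):
    """Sample nodes without replacement and return the induced subgraph."""
    idx = rng.choice(adj.shape[0], size=batch_size, replace=False)
    sub_adj = adj[idx][:, idx]                 # induced adjacency block
    sub_x = features[torch.as_tensor(idx)]     # matching feature rows
    return sub_adj, sub_x, idx

rng = np.random.default_rng(0)
n = 100
adj = sp.random(n, n, density=0.05, format="csr", random_state=0)
adj = ((adj + adj.T) > 0).astype(np.float32)   # symmetrise
x = torch.randn(n, 16)

sub_adj, sub_x, idx = sample_subgraph(adj, x, batch_size=32, rng=rng)
print(sub_adj.shape, sub_x.shape)  # (32, 32) torch.Size([32, 16])
```

The trade-off: edges between sampled and non-sampled nodes are dropped from each step's loss, so gradients are noisier; neighbour-sampling schemes (as in GraphSAGE-style training) mitigate this at the cost of more bookkeeping.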
`KLD = -0.5 / n_nodes * torch.mean(torch.sum(1 + 2 * logvar - mu.pow(2) - logvar.exp().pow(2), 1))` normalises by the number of nodes twice: either `/ n_nodes` should be removed, or `torch.mean` changed to `torch.sum`.
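A small numeric check of that claim, with random illustrative values. `torch.mean` over the node dimension already divides by `n_nodes`, so the extra `/ n_nodes` shrinks the KL term by a factor of `n_nodes`; the two proposed fixes are equivalent.

```python
import torch

torch.manual_seed(0)
n_nodes, dim = 5, 3
mu = torch.randn(n_nodes, dim)
logvar = torch.randn(n_nodes, dim)

per_node = torch.sum(1 + 2 * logvar - mu.pow(2) - logvar.exp().pow(2), 1)

# As written in the repo: mean over nodes AND division by n_nodes.
kld_repo = -0.5 / n_nodes * torch.mean(per_node)

# Fix (a): drop the 1/n_nodes and keep the mean ...
kld_fix_a = -0.5 * torch.mean(per_node)

# Fix (b): ... or keep 1/n_nodes and sum instead of averaging.
kld_fix_b = -0.5 / n_nodes * torch.sum(per_node)

print(torch.allclose(kld_fix_a, kld_fix_b))          # True
print(torch.allclose(kld_repo * n_nodes, kld_fix_a)) # repo is n_nodes x too small
```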
When I train on the Cora dataset, I get the following error in `binary_cross_entropy_with_logits`. Shouldn't `pos_weight` be a Tensor? Thanks! ``` Traceback (most recent call last): File "train.py", line 83, in...
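For context: recent PyTorch versions require `pos_weight` to be a Tensor, so passing a plain Python float raises an error like the one in the truncated traceback. A hedged sketch of the one-line fix (the edge counts here are illustrative, not Cora's):

```python
import torch
import torch.nn.functional as F

# Illustrative counts: wrap the computed scalar in a tensor before use.
n, pos = 100.0, 10.0
pos_weight_float = (n * n - pos) / pos      # a plain float would fail
pos_weight = torch.tensor(pos_weight_float) # Tensor form is accepted

logits = torch.randn(4, 4)
labels = torch.randint(0, 2, (4, 4)).float()
loss = F.binary_cross_entropy_with_logits(logits, labels, pos_weight=pos_weight)
print(loss.item())
```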
`self.dc = InnerProductDecoder(dropout, act=lambda x: x)` Why not use `act=torch.sigmoid` here?
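A likely reason, shown with a small illustrative check: the training loss is `binary_cross_entropy_with_logits`, which applies the sigmoid internally (in a numerically stable way), so the decoder must return raw logits, i.e. an identity activation. Applying sigmoid in the decoder and again inside the loss would be wrong.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
z = torch.randn(4, 8)
logits = z @ z.t()          # inner-product decoder with identity activation
target = torch.eye(4)       # toy reconstruction target

# Loss on raw logits (sigmoid applied inside the loss) ...
loss_logits = F.binary_cross_entropy_with_logits(logits, target)
# ... matches applying sigmoid explicitly and using plain BCE.
loss_probs = F.binary_cross_entropy(torch.sigmoid(logits), target)

print(torch.allclose(loss_logits, loss_probs, atol=1e-5))  # True
```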