
Subgraph Neural Networks (NeurIPS 2020)

14 SubGNN issues

1. `- python-graphviz==0.14` should be `- graphviz==0.14`. 2. `- torch==1.0.0` should be `- torch==1.4.0`. 3. The CUDA version of PyG is wrong.

The package ["python-graphviz"](https://github.com/mims-harvard/SubGNN/blob/eae0848b823cade135428b0ac264b7caa31862a1/SubGNN.yml#L174) does not exist on PyPI. I assume the correct package is "[graphviz](https://pypi.org/project/graphviz/)" instead.
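Putting the fixes reported in this issue together, the pip section of `SubGNN.yml` would presumably look like the sketch below. The surrounding structure is an assumption (only the two changed lines come from the issue):

```yaml
# Hypothetical excerpt of SubGNN.yml with the two reported fixes applied.
dependencies:
  - pip:
      - graphviz==0.14   # was: python-graphviz==0.14, which is not on PyPI
      - torch==1.4.0     # was: torch==1.0.0
```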

First of all, I would like to thank you for your great research; it has enabled me to analyze more data. I'm facing another problem now. When I try to...

Hi everyone, does anyone have the same issue? I create the environment with `conda env create --file SubGNN.yml`, but when I test the code using one of the real-world...

When I run `python prepare_dataset.py` once with **CONV = GIN and MINIBATCH = NeighborSampler**, regardless of the real-world dataset used, I get the following error message: Using device: cuda GeForce GTX 1080 Graph density...

When CONV = GIN and MINIBATCH = NeighborSampler, I get the following error message (using torch 1.4.0, CUDA 10.1, and torch-sparse==0.6.1). Can you please suggest if there is anything I am missing here?...
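Several of these issues come down to mismatched package versions (torch vs. torch-sparse vs. CUDA build). A minimal, stdlib-only sketch for reporting what is actually installed, so a mismatch is easy to spot (the package list is just the ones this issue mentions and can be changed):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(packages):
    """Return {package: version string or None} for each requested package."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None  # package not installed in this environment
    return result

# Packages whose versions this issue reports:
print(installed_versions(["torch", "torch-sparse"]))
```

Comparing this output against the versions pinned in `SubGNN.yml` is a quick first check before digging into the traceback itself.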

In SubGNN/best_model_hyperparameters/em_user/hyperparams.json for the em_user dataset I find

```
"use_neighborhood": true,
"use_structure": false,
"use_position": false,
```

https://github.com/mims-harvard/SubGNN/blob/eae0848b823cade135428b0ac264b7caa31862a1/best_model_hyperparameters/em_user/hyperparams.json#L3 https://github.com/mims-harvard/SubGNN/blob/eae0848b823cade135428b0ac264b7caa31862a1/best_model_hyperparameters/em_user/hyperparams.json#L4 https://github.com/mims-harvard/SubGNN/blob/eae0848b823cade135428b0ac264b7caa31862a1/best_model_hyperparameters/em_user/hyperparams.json#L5

However, in the other three datasets it looks like ``` "use_neighborhood":...

I want to reproduce the experimental results, and when I try to use the ppi_bp dataset to train the model with `python train_config.py -config_path config_files/ppi_bp/ppi_bp_config.json`, I get the following message: ``` Running...

Hi, I'm wondering if there is a config file for training the synthetic dataset? Thank you.

Nice work! From the paper, I see there are only four real-world datasets. Are there any suggestions for more datasets that could be used with SubGNN? Thanks a lot!