Error on tutorial for graph generation
Description
Hello, I'm trying to run the Tutorial for Graph Generation example available in the documentation. However, I get an error when running the code as described.
I'm not sure whether it is related to the versions of the libraries I'm using or to something else.
Here is code to reproduce the error:
```python
import json

from rdkit import RDLogger
from rdkit.Chem import Draw

from dig.ggraph.method import GraphDF
from dig.ggraph.dataset import ZINC250k
from dig.ggraph.evaluation import RandGenEvaluator
from torch_geometric.loader import DenseDataLoader

RDLogger.DisableLog("rdApp.*")

# Configuration file used in the tutorial
conf = json.load(open("rand_gen_zinc250k_config_dict.json"))

dataset = ZINC250k(one_shot=False, use_aug=True)
loader = DenseDataLoader(dataset, batch_size=conf["batch_size"], shuffle=True)
runner = GraphDF()

# Training hyperparameters from the tutorial
lr = 0.001
wd = 0
max_epochs = 10
save_interval = 1
save_dir = "rand_gen_zinc250k"

runner.train_rand_gen(loader=loader, lr=lr, wd=wd, max_epochs=max_epochs,
                      model_conf_dict=conf["model"],
                      save_interval=save_interval, save_dir=save_dir)
```
This is the traceback I received:
```
/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/torch_geometric/data/in_memory_dataset.py:284: UserWarning: It is not recommended to directly access the internal storage format `data` of an 'InMemoryDataset'. If you are absolutely certain what you are doing, access the internal storage via `InMemoryDataset._data` instead to suppress this warning. Alternatively, you can access stacked individual attributes of every graph via `dataset.{attr_name}`.
  warnings.warn(msg)
Traceback (most recent call last):
  File "/media/takaogahara/storage1/MolGen/teste.py", line 23, in <module>
    runner.train_rand_gen(loader=loader, lr=lr, wd=wd, max_epochs=max_epochs,
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/dig/ggraph/method/GraphDF/graphdf.py", line 67, in train_rand_gen
    for batch, data_batch in enumerate(loader):
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 630, in __next__
    data = self._next_data()
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 674, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/torch_geometric/data/dataset.py", line 263, in __getitem__
    data = self.get(self.indices()[idx])
  File "/home/takaogahara/virtualenvs/dig/lib/python3.10/site-packages/dig/ggraph/dataset/PygDataset.py", line 171, in get
    for key in self.data.keys:
TypeError: 'method' object is not iterable
```
Environment
- PyG version: 2.4.0
- PyTorch version: 2.1.0
- DIG version: 1.1.0
- OS: Ubuntu 22.04
- Python version: 3.10
- CUDA/cuDNN version: cu121
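For completeness, the failure does not seem to depend on the training loop: judging by the traceback, indexing the dataset directly goes through the same `PygDataset.get` path. A minimal check (same setup as above, untested beyond what the traceback implies):

```python
from dig.ggraph.dataset import ZINC250k

dataset = ZINC250k(one_shot=False, use_aug=True)

# __getitem__ delegates to PygDataset.get, which iterates `self.data.keys`
# and raises "TypeError: 'method' object is not iterable" with PyG 2.4.0.
sample = dataset[0]
```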
Changing `for key in self.data.keys:` to `for key in self.data.keys():` works for me.
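A toy example of the underlying behaviour change (assuming PyG 2.4.0, as in the environment above): on a plain `torch_geometric.data.Data` object, `keys` is a bound method rather than a list-like attribute, so only the called form is iterable.

```python
import torch
from torch_geometric.data import Data

data = Data(x=torch.randn(3, 4),
            edge_index=torch.tensor([[0, 1], [1, 2]]))

# With PyG 2.4.0, `keys` is a bound method, so iterating over it fails:
#   for key in data.keys:   # TypeError: 'method' object is not iterable

# Calling it returns the attribute names, which is what the loop in
# PygDataset.get needs:
for key in data.keys():
    print(key)  # e.g. 'x', 'edge_index'
```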
@Takaogahara Thank you for pointing out this bug! Actually, `for key in self.data.keys` follows the implementation of the `InMemoryDataset` class in PyG 1.x, but PyG 2.x updates that implementation, and I guess `keys` is no longer a plain attribute of `self.data` in PyG 2.x. @irumeria Thank you very much for providing a fix suggestion! I have changed `self.data.keys` to `self.data.keys()`.
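For anyone who needs to run the released DIG 1.1.0 against both older and newer PyG locally, a version-agnostic guard is one possible workaround (a sketch only; `data_keys` is a hypothetical helper, and the committed fix above simply calls `keys()`):

```python
def data_keys(data):
    """Return the attribute names of a PyG `Data` object across PyG versions.

    On older PyG releases, `data.keys` behaves like a list-valued attribute,
    while on the PyG 2.4.0 from the traceback it is a method that must be called.
    """
    keys = data.keys
    return keys() if callable(keys) else keys
```

With that helper, the loop in `get` could read `for key in data_keys(self.data):`.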