spektral
How to deal with large datasets in custom_dataset.py?
Hi,
Can you give more details about the issue you are having?
Yes. It's a binary graph classification problem, and I have a lot of adjacency matrices and node features (the files total about 50 GB). How can I use a generator to load them into memory?
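One common workaround when a dataset doesn't fit in RAM is to store each graph in its own file on disk and read graphs back lazily with a Python generator, so only one graph is in memory at a time. Below is a minimal sketch of that idea using plain NumPy `.npz` files; the file layout and helper names (`save_graphs`, `graph_generator`) are hypothetical illustrations, not part of Spektral's API.

```python
import os
import tempfile

import numpy as np


def save_graphs(out_dir, n_graphs=3):
    # Hypothetical helper: write each graph (adjacency, node features,
    # label) to its own .npz file, so the full dataset never has to
    # fit in memory at once.
    for i in range(n_graphs):
        n = 4 + i  # number of nodes in this graph
        np.savez(
            os.path.join(out_dir, f"graph_{i}.npz"),
            a=np.random.randint(0, 2, (n, n)),  # adjacency matrix
            x=np.random.rand(n, 8),             # node features
            y=np.array([i % 2]),                # binary label
        )


def graph_generator(data_dir):
    # Lazily yield one (adjacency, features, label) triple at a time;
    # only a single graph is ever held in memory.
    for fname in sorted(os.listdir(data_dir)):
        if fname.endswith(".npz"):
            with np.load(os.path.join(data_dir, fname)) as g:
                yield g["a"], g["x"], g["y"]


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        save_graphs(d)
        for a, x, y in graph_generator(d):
            print(a.shape, x.shape, y.shape)
```

A generator like this can also be wrapped with `tf.data.Dataset.from_generator` to feed a Keras model, though batching graphs of different sizes still needs padding or a disjoint-union scheme like Spektral's loaders use.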
Hey @zhangweizhenGitHub, I was facing a similar issue a few days back. In my domain I have around 1.5 million images, and using loaders crashes the memory. Did you find any solution?
Any solution on this? @danielegrattarola Thank you
PR #292 should show how to load data from disk, but I have not had the time to review it properly (also, it's for image data).
I'll get around to it eventually, I'm really swamped with PhD duties lately :D