superpoint_transformer
Training custom dataset with all points
Hi, how do I train the model with all the points in a point cloud and not just partial points?
Please elaborate. What do you mean by "partial points" ?
I don't want to make partitions for my training data, neither xy_tiling nor pc_tiling; I have kept these two as null. But when I use my train data in the demo notebook, the full point cloud is not processed. I need to train the model on all the points present in the point cloud at each step of an epoch, not on a partition of the point cloud.
In this project "partition" means something very specific (see our paper).
xy_tiling and pc_tiling do not control the partition but, as their names indicate, the tiling. Tiling is useful when the dataset is so large that we cannot fit all of it into memory at once; in other words, most of the time for any realistically large-scale 3D dataset. As you mentioned, setting xy_tiling and pc_tiling to null disables this (but if you can afford to do so, it suggests your dataset is small, and I doubt training SPT from scratch on it would make sense).
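For concreteness, here is roughly what this looks like in a datamodule config. This is an illustrative sketch, not a copy of an actual config file: the key names come from this thread, but the example values and comments are my own reading of their behavior.

```yaml
# Illustrative fragment of a datamodule config (values are examples, not defaults).
# Leaving both tilings null processes each cloud whole, which only makes sense
# if every cloud fits in memory.
xy_tiling: null   # or e.g. [3, 3] to cut each cloud into a 3x3 grid in the XY plane
pc_tiling: null   # or e.g. 2 to recursively split each cloud into 2^2 = 4 tiles
```

Either way, tiling only changes how the data is chunked for memory reasons; every point of the cloud still belongs to exactly one tile, so no points are discarded.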
when I use the train data in the demo notebook, the full point cloud is not processed
It is, though, unless you did not code your Dataset class correctly.
I need to train the model on all the points present in the point cloud at each step of an epoch, not a partition of the point cloud.
That is what the project does, at least at inference time. At training time, we use SampleSubNodes, SampleRadiusSubgraphs, and SampleSegments to introduce diversity in the batches; think of those as augmentations. If these are not clear, I invite you to look at the provided dataset configurations in configs/datamodule/semantic/ and get familiar with the transforms applied at train and val/test time. The docstrings of each of these should be fairly explicit.
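To illustrate why these sampling transforms do not "lose" data, here is a minimal, self-contained sketch of the idea behind a radius-subgraph sampler. This is not the actual SPT code (the real SampleRadiusSubgraphs operates on superpoint graphs, not raw 2D points); it only shows that a random crop per step is an augmentation, since the crop center changes every call and the whole cloud is covered over many steps.

```python
import random

def sample_radius_subgraph(points, radius, rng=random):
    """Return the subset of `points` within `radius` of a random center.

    Hypothetical toy version of radius-based subgraph sampling:
    `points` is a list of (x, y) tuples; real point clouds are 3D
    tensors, but 2D keeps the sketch short.
    """
    center = rng.choice(points)
    return [
        p for p in points
        if (p[0] - center[0]) ** 2 + (p[1] - center[1]) ** 2 <= radius ** 2
    ]

# A toy "cloud": 10 points on a line, one unit apart.
points = [(float(i), 0.0) for i in range(10)]

# Each training step draws a different crop; across an epoch the
# model ends up seeing the full cloud, just never all of it at once.
crop = sample_radius_subgraph(points, radius=2.5)
```

The real transforms add diversity the same way: each batch is a randomized view of the cloud, while evaluation runs on the full data.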