learned-triangulation
DataSet
Could you provide the dataset you used when training the model, and the test.ply?
Thanks!
For the training dataset in the paper, we used ShapeNetV2. You can download it directly from the dataset website. We used the train/test splits that come with the dataset.
Once you have ShapeNet, the README gives commands that generate patch-sampled datasets like the ones we trained on, such as:
python src/generate_local_points_dataset.py --input_dir=/path/to/train_meshes/ --output_dir=data/train/ --n_samples=20000
This will output the training data into data/train/ (or whatever path you are using). Note that you may also need to set this path in main_train_model.py.
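For a quick sanity check that the generation step worked, a trivial sketch like the following (my own, not part of the repo) just lists what landed in the output directory:

```python
# Trivial sanity check (my own sketch, not part of the repo): list what the
# generation script wrote into the output directory.
import os

train_dir = "data/train/"  # or whatever --output_dir you passed
files = sorted(os.listdir(train_dir))
print(f"found {len(files)} files in {train_dir}")
print("first few:", files[:5])
```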
(additionally, a set of pre-trained model weights are included in the repo!)
I'm not sure what you mean by test.ply? Can you clarify?
Thanks for the response! By test.ply I mean models like the airplane, chair, lamp, and so on that were used in this paper.
No problem :)
The airplanes/chairs/lamps are all from the ShapeNetV2 test dataset. If you want the specific model number for the ones that appeared in our figures, I'd have to see if I have it written down anywhere, though I'm not sure I do since we prepared the paper a while ago.
Thanks! I also want to know whether the "fill small holes" step is used by default when I run python src/main_generate_mesh.py path_to_your_cloud_or_mesh.ply --output result.
Also, would it be feasible to run point cloud classification first (like PointNet) to create more reference features and improve the candidate triangle results? I mean, to reduce the triangles missed by the triangulation model.
"fill small holes" if this step is default
The script produces multiple outputs; the one with "_closed" in the name has had the small-hole-filling procedure run on it.
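As an illustration (assuming the trimesh library, and that the script was run with --output result; adjust the filenames to whatever your run actually produces), you could compare the raw and hole-filled outputs like this:

```python
# Sketch comparing the raw reconstruction with the "_closed" (hole-filled)
# variant. The filenames below are assumptions based on --output result;
# adjust them to what the script actually wrote.
import trimesh

raw = trimesh.load("result.ply")
closed = trimesh.load("result_closed.ply")

print(f"raw:    {len(raw.faces)} faces, watertight={raw.is_watertight}")
print(f"closed: {len(closed.faces)} faces, watertight={closed.is_watertight}")
```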
point cloud classification first, like PointNet
The networks in our method are structured like PointNets, but they are applied only at triangles, for classification. It could indeed be interesting to perform additional point cloud learning and annotate the cloud with features, discard outliers, etc. I have not tried that.
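For readers unfamiliar with the pattern, here is a generic sketch of a PointNet-style per-triangle classifier. This is only meant to illustrate the idea; it is not the actual architecture from the paper or repo:

```python
# Generic sketch of a PointNet-style classifier applied per candidate
# triangle (NOT the paper's actual networks): a shared point-wise MLP over
# the triangle's local neighborhood, a symmetric max pool, and a small head.
import torch
import torch.nn as nn

class TriangleClassifier(nn.Module):
    def __init__(self, in_dim=3, feat_dim=64):
        super().__init__()
        # Shared MLP, applied independently to each neighborhood point
        self.point_mlp = nn.Sequential(
            nn.Linear(in_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
        )
        # Classifier on the pooled neighborhood feature
        self.head = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, 1),
        )

    def forward(self, neighborhood_pts):
        # neighborhood_pts: (batch, n_neighbors, 3), e.g. points expressed in
        # a coordinate frame local to each candidate triangle
        feats = self.point_mlp(neighborhood_pts)   # (B, N, feat_dim)
        pooled, _ = feats.max(dim=1)               # symmetric max pool
        return torch.sigmoid(self.head(pooled)).squeeze(-1)  # per-triangle prob

model = TriangleClassifier()
probs = model(torch.randn(8, 32, 3))  # 8 candidate triangles, 32 neighbors each
print(probs.shape)  # torch.Size([8])
```

The max pool is the key PointNet property here: it makes the prediction invariant to the ordering of the neighborhood points.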
Does this paper have a presentation (PPT)?
Project materials (including a talk video) are available here: https://nmwsharp.com/research/learned-triangulation/ . Raw slide files are not currently released.
Thanks for sharing your great work!
There is a question I am confused about: is the dataset used in your paper the core version, the segmentation version, or some other version of ShapeNet? I used the core version to prepare the data, but I find the file structure is not consistent with the code (generate_local_points_dataset.py).
Looking forward to your reply.
Hi!
The dataset used is the core version. The data preparation code (generate_local_points_dataset.py) expects a simple flat directory of files as input; you may need to move some files around to set this up from ShapeNet. For instance, I believe we did a train/val/test split and separated those meshes into separate folders.
Thanks for your quick reply. To be consistent with your paper, can you provide the train/val/test list? I can't find the split file (I can only find a JSON file named taxonomy.json).
Best, Zhu
No problem!
The file with train/test/val splits for ShapeNetCore is hosted separately here: http://shapenet.cs.stanford.edu/shapenet/obj-zip/SHREC16/all.csv (I have no idea why they don't include it in the zip; perhaps because these splits were released as part of a separate effort).
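For illustration, here is a rough sketch (not from the repo) of using that CSV to sort the core meshes into flat train/val/test folders. The column names (synsetId, modelId, split) and the on-disk mesh layout below are assumptions, so check them against the actual file before running:

```python
# Sketch: flatten ShapeNetCore meshes into per-split folders using all.csv.
# Assumes the CSV has synsetId, modelId, and split columns, and that meshes
# live at <root>/<synsetId>/<modelId>/models/model_normalized.obj -- verify
# both against your copy of the dataset.
import csv
import os
import shutil

shapenet_root = "/path/to/ShapeNetCore.v2"
out_root = "data"

with open("all.csv") as f:
    for row in csv.DictReader(f):
        split = row["split"]  # "train", "val", or "test"
        src = os.path.join(shapenet_root, row["synsetId"], row["modelId"],
                           "models", "model_normalized.obj")
        if not os.path.isfile(src):
            continue
        dst_dir = os.path.join(out_root, split)
        os.makedirs(dst_dir, exist_ok=True)
        shutil.copy(src, os.path.join(dst_dir, row["modelId"] + ".obj"))
```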
-Nick
I see. Thanks! Zhu