missing data for part segmentation test.py
Hello. I have been trying the part segmentation code on the shapenetcore_partanno_segmentation_benchmark_v0_normal dataset, and both training and evaluation work fine. However, when I try to test the trained network using the test.py script from the part_seg folder, it fails to execute because the folders "points" and "points_label", which are required by part_dataset.py, are missing. Here is the error I am getting:
Traceback (most recent call last):
File "test.py", line 36, in
I also faced the very same error. Where are the points and points_label folders in the dataset "shapenetcore_partanno_segmentation_benchmark_v0_normal"? I downloaded the dataset from "https://1drv.ms/u/s!ApbTjxa06z9CgQnl-Qm6KI3Ywbe1" as provided in the readme file. Waiting for your kind response.
Same here, missing points and points_label for test.py. If you could update the dataset, that would be great!
Hi!
Same error here. What's the solution? Waiting for help!!! Thanks!
Traceback (most recent call last):
  File "test.py", line 38, in <module>
    TEST_DATASET = part_dataset.PartDataset(root=DATA_PATH, npoints=NUM_POINT, classification=False, class_choice=FLAGS.category, split='test')
  File "/home/kx/project/3D/pointnet2-master/part_seg/part_dataset.py", line 50, in __init__
    fns = sorted(os.listdir(dir_point))
FileNotFoundError: [Errno 2] No such file or directory: '/home/kx/project/3D/pointnet2-master/data/shapenetcore_partanno_segmentation_benchmark_v0_normal/02691156/points'
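For anyone debugging this: the traceback comes from part_dataset.py listing a per-category "points" folder that the *_v0_normal release simply does not contain; that release ships one .txt file per shape instead. Below is a minimal sketch (not code from the repo) that reports which of the two layouts a dataset root follows. The helper name describe_layout and the assumption that the .txt files hold "x y z nx ny nz seg_label" per line are mine, not the repo's.

```python
import os

# Hypothetical helper: report which ShapeNet part-annotation layout a dataset root uses.
# Old v0 layout: <category>/points/*.pts and <category>/points_label/*.seg
# *_v0_normal layout (assumed): <category>/<shape_id>.txt with "x y z nx ny nz seg_label" lines
def describe_layout(root):
    for category in sorted(os.listdir(root)):
        cat_dir = os.path.join(root, category)
        if not os.path.isdir(cat_dir):
            continue  # skip metadata files such as synsetoffset2category.txt
        has_old = (os.path.isdir(os.path.join(cat_dir, 'points')) and
                   os.path.isdir(os.path.join(cat_dir, 'points_label')))
        has_txt = any(f.endswith('.txt') for f in os.listdir(cat_dir))
        print('%s: old v0 layout=%s, normal .txt layout=%s' % (category, has_old, has_txt))

describe_layout('data/shapenetcore_partanno_segmentation_benchmark_v0_normal')
```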
+1
Same problem here, waiting for help...
I also face the same problem and don't know how to solve it. @ilya0ics, did you solve it?
I made the test dataset myself from samples of the training dataset, but it shows the point cloud with a wrong prediction. Also, in test.py I noticed that the point normals are not used for the test dataset, is that right?
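If you want to build such a test set yourself, here is a rough sketch (not from the repo) of converting the *_v0_normal .txt files into the points/points_label layout that part_dataset.py reads. It assumes each .txt line is "x y z nx ny nz seg_label"; note that the two releases may also use different label conventions (per-category 1-based labels vs. global part ids), which could explain wrong-looking predictions if the labels are not remapped.

```python
import os
import numpy as np

def convert_category(cat_dir):
    # Target layout expected by part_dataset.py: points/<id>.pts (xyz only)
    # and points_label/<id>.seg (one integer label per point).
    pts_dir = os.path.join(cat_dir, 'points')
    lbl_dir = os.path.join(cat_dir, 'points_label')
    for d in (pts_dir, lbl_dir):
        if not os.path.isdir(d):
            os.makedirs(d)
    for fn in os.listdir(cat_dir):
        if not fn.endswith('.txt'):
            continue
        data = np.loadtxt(os.path.join(cat_dir, fn))  # columns: x y z nx ny nz seg_label (assumed)
        shape_id = os.path.splitext(fn)[0]
        np.savetxt(os.path.join(pts_dir, shape_id + '.pts'), data[:, 0:3], fmt='%.6f')
        # normals (columns 3:6) are dropped; labels may need remapping, see note above
        np.savetxt(os.path.join(lbl_dir, shape_id + '.seg'), data[:, 6], fmt='%d')

convert_category('data/shapenetcore_partanno_segmentation_benchmark_v0_normal/02691156')
```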
Hi, I had the same problem. If you have solved it, I am looking forward to your answer.
Traceback (most recent call last):
File "/home/sheng/PycharmProjects/pointnet2-master/part_seg/test.py", line 41, in
You can read the readme of PointNet (v1): https://github.com/charlesq34/pointnet. There is the correct download link: http://web.stanford.edu/~ericyi/project_page/part_annotation/index.html
@dogod621 First of all, thank you very much for your kind help and reply! As you suggested, I tried to download the dataset the way pointnet/part_seg/download_data.sh does: wget https://shapenet.cs.stanford.edu/ericyi/shapenetcore_partanno_v0.zip. So I downloaded the dataset shapenetcore_partanno_segmentation_benchmark_v0 and used it to replace shapenetcore_partanno_segmentation_benchmark_v0_normal. However, I don't know whether they are different, because one has _normal at the end of the name and one does not. After all, I had trained pointnet2/part_seg/train.py with the dataset hard-coded as DATA_PATH = os.path.join(ROOT_DIR, 'data', 'shapenetcore_partanno_segmentation_benchmark_v0_normal'), so I could only assume they are the same and used it as a replacement. However, when I executed test.py again, it produced the following error: Traceback (most recent call last):
File "test.py", line 95, in
@dogod621 To solve the above error, I changed test.py to use loss = MODEL.get_loss(pred, labels_pl) instead of loss = MODEL.get_loss(pred, labels_pl, end_points), and amazingly that error was solved. I executed test.py again; however, the following error occurred: Traceback (most recent call last):
File "test.py", line 97, in
Hi, I guess test.py is an abandoned script.
@shengrongjin @leejiajun, I met the same problem! Would you mind sharing how you solved it? Thanks a lot!
> I made the test dataset myself from samples of the training dataset, but it shows the point cloud with a wrong prediction.

Did you use your own data to test? If so, could you share how you made the dataset for the network? Thanks.
This is because the default train.py trains the model on the ShapeNet dataset with normals, meaning 6-channel point inputs (xyz plus normal) instead of 3. I have a working test.py for the dataset with normals on my fork. It will create obj files just like pointnet and print the final accuracy.
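To make the 6-channel point explicit, here is a minimal loading sketch under the assumption (stated earlier in this thread) that each *_v0_normal .txt line is "x y z nx ny nz seg_label". The path reuses the 02691156 category from the traceback above; everything else is illustrative, not code from the repo or the fork.

```python
import glob
import numpy as np

# Illustration of the 6-channel input the trained model expects (xyz + normal),
# versus the 3-channel xyz-only input the old test path would feed it.
fn = sorted(glob.glob(
    'data/shapenetcore_partanno_segmentation_benchmark_v0_normal/02691156/*.txt'))[0]
data = np.loadtxt(fn)
point_set = data[:, 0:6]                  # xyz + normal -> 6 channels per point
seg = data[:, 6].astype(np.int32)         # per-point part label
batch = point_set[np.newaxis, :2048, :]   # (1, 2048, 6), the shape the trained net expects
print(batch.shape, seg.shape)
```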