Pixel2Mesh

How does converting the .xyz file to a .dat file work?

Open Michael-H1302 opened this issue 2 years ago • 5 comments

If I have understood everything correctly, the generate_data.py file generates the .xyz files in the rendering folder. But these files can't be used for training yet; they need to be converted to .dat files. In another issue I read that the .dat files are just a binary wrapper for the .xyz files and that you should use pickle to create them. But if I use pickle as follows:

import pickle
import numpy as np

content = []

with open('00.xyz', 'r') as f:
    for line in f.readlines():
        content.append(line.split(' '))

with open('00.dat', 'wb') as f:
    pickle.dump(content, f)

the created .dat file looks different. If you open a .dat file from the ShapeNet training data provided by the developers via Google Drive, you see that every .dat file starts with

cnumpy.core.multiarray
_reconstruct

but my generated .dat file doesn't.
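Here is how I compared the two files (just printing the first bytes of each; the path to the provided file is a placeholder):

with open('path_to_provided/00.dat', 'rb') as f:   # a .dat from the Google Drive data
    print(f.read(40))    # b'\x80\x02cnumpy.core.multiarray\n_reconstruct...'

with open('00.dat', 'rb') as f:                    # my own pickled list of strings
    print(f.read(40))    # no such header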

A snippet of code that converts the .xyz files into .dat files usable for training would be great. Does anyone know what I am doing wrong?

Thanks in advance.

Michael-H1302 avatar Oct 19 '22 12:10 Michael-H1302

Have you solved the problem of dataset generation? Can I ask you for help?

jndmw-111 avatar Nov 03 '22 06:11 jndmw-111

I have the same problem and want to generate .dat files for the dataset. Have you solved it? Can you share the solution?

ShusenWang123456 avatar Nov 27 '22 08:11 ShusenWang123456

Yes, I think I solved it. I wrote a short Python program that converts the .xyz files into .dat files. You have to pass the path to your .xyz file as a command-line argument. The program uses pickle to convert the .xyz file into a .dat file and drops that .dat file into the same directory where your .xyz file is located.

import os
import pickle
import sys

import numpy as np


def xyz_to_dat():
    # Read the space-separated values from the .xyz file passed on the
    # command line, skipping blank lines.
    rows = []
    with open(sys.argv[1], 'r') as f:
        for line in f:
            line = line.strip()
            if line:
                rows.append(line.split(' '))

    # Convert the string tokens to floats.
    data = [[float(value) for value in row] for row in rows]

    # Write the .dat file next to the .xyz file, swapping the extension.
    output_path = os.path.splitext(sys.argv[1])[0] + '.dat'
    print(output_path)

    with open(output_path, 'wb') as f:
        try:
            data = np.array(data)
            print(data.shape)
            # Protocol 2, to match the .dat files provided by the developers.
            pickle.dump(data, f, 2)
        except pickle.PicklingError:
            print('Error: object is not picklable.')


xyz_to_dat()
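
For example, if you save the script as xyz_to_dat.py (the file name is arbitrary) and your .xyz files sit in the rendering folder, you can run it and then check the result like this (paths are just examples):

# Run the conversion from the command line:
#   python xyz_to_dat.py rendering/00.xyz
#
# Then check that the new file starts with the same pickle header as the
# provided .dat files:
with open('rendering/00.dat', 'rb') as f:
    print(f.read(40))   # should contain b'cnumpy.core.multiarray\n_reconstruct'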

Michael-H1302 avatar Dec 10 '22 11:12 Michael-H1302

Hi, I found that generate_data.py can also sample points from the ground-truth surface. Do you know the difference between the sampling in generate_data.py and in 1_sample_points.txt? I guess the sampling in generate_data.py is uniform. If I want to generate the ground-truth data in point form to compare with a prediction, do I just need to run generate_data.py, or do I need to follow the steps 1_sample_points.txt, 2_generate_normal.py, 3_camera_transform.py?
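For reference, this is roughly how I load one of the provided .dat files to look at the sampled points (the path is a placeholder; encoding='latin1' is needed because the files appear to have been pickled from Python 2):

import pickle

with open('00.dat', 'rb') as f:   # placeholder path to a provided .dat file
    points = pickle.load(f, encoding='latin1')
print(type(points), getattr(points, 'shape', None))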

WayneCV avatar May 16 '23 08:05 WayneCV

Hi, have you figured out the difference between the sampling in generate_data.py and in (1_sample_points.txt, 2_generate_normal.py, 3_camera_transform.py)? They seem to achieve the same function.

Michaelwjh avatar Aug 30 '23 08:08 Michaelwjh