
Is there a good method for improving mesh quality?

Open hanjoonwon opened this issue 2 years ago • 5 comments

Is there a good way to improve the mesh quality? I'm currently running:

`ns-train nerfacto --pipeline.model.predict-normals True --data output/scan --pipeline.datamanager.train-num-rays-per-batch 2000`

but I don't know whether the resulting Poisson mesh quality is much better than instant-ngp on Windows.
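As a side note, nerfstudio can export a Poisson-reconstructed mesh from a trained run with `ns-export`; a minimal sketch, assuming a nerfacto checkpoint trained with `--pipeline.model.predict-normals True` as above (the `--load-config` path is a placeholder for your own output directory):

```shell
# Export a Poisson-reconstructed mesh from a trained nerfacto run.
# The --load-config path is a placeholder; point it at your run's config.yml.
ns-export poisson \
    --load-config outputs/scan/nerfacto/<timestamp>/config.yml \
    --output-dir exports/mesh
```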

hanjoonwon avatar Jan 04 '24 10:01 hanjoonwon

Meshes generated by any NeRF will be quite rough. There isn't much you can do for smooth meshes. Use NeuS and other methods supported by SDFStudio.

Ar2000S avatar Jan 11 '24 16:01 Ar2000S

> Meshes generated by any NeRF will be quite rough. There isn't much you can do for smooth meshes. Use NeuS and other methods supported by SDFStudio.

Thanks :) I tried running SDFStudio on NeRF-processed data with depth, normal, RGB, and foreground mask, but a completely different mesh came out. I was hoping I could get a good mesh from nerfstudio, which works normally for me, but it seems I have no choice.

hanjoonwon avatar Jan 12 '24 03:01 hanjoonwon

Can you show me the meshes generated by sdfstudio and nerfstudio? The initialization parameters must be set properly when using sdfstudio, or the results will look ugly. Here is an example. The first picture shows the mesh generated from depth-nerfacto followed by Poisson reconstruction. The second picture shows the mesh generated from neus-facto.
[screenshot 1: depth-nerfacto + Poisson mesh] [screenshot 2: neus-facto mesh]

Ar2000S avatar Jan 12 '24 04:01 Ar2000S

> Can you show me the meshes generated by sdfstudio and nerfstudio? The initialization parameters must be set properly when using sdfstudio, or the results will look ugly. [...]

Here is my problem: https://github.com/autonomousvision/sdfstudio/issues/277. I set the foreground mask to true and added the foreground path. My dataset looks like the attached JSON file, with the input data in Docker.

json.txt

[image]
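For context, SDFStudio's dataparser reads per-frame file paths from a `meta_data.json`. A sketch of what one frame entry might look like with a foreground mask attached; the key names here, especially `foreground_mask`, are assumptions on my part, so verify them against the dataparser you are actually using:

```json
{
  "frames": [
    {
      "rgb_path": "000000_rgb.png",
      "mono_depth_path": "000000_depth.npy",
      "mono_normal_path": "000000_normal.npy",
      "foreground_mask": "000000_foreground_mask.png"
    }
  ]
}
```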

hanjoonwon avatar Jan 13 '24 03:01 hanjoonwon

> Can you show me the meshes generated by sdfstudio and nerfstudio? The initialization parameters must be set properly when using sdfstudio, or the results will look ugly. [...]

@Ar2000S How can I set the initialization parameters properly?

hanjoonwon avatar Jan 27 '24 14:01 hanjoonwon

> Can you show me the meshes generated by sdfstudio and nerfstudio? The initialization parameters must be set properly when using sdfstudio, or the results will look ugly. [...]

My NeuS result is much worse.

hanjoonwon avatar Feb 13 '24 05:02 hanjoonwon

Sorry for the (very) late reply. I tried to run mono-neus and also neus-facto on your dataset. The result, in short, came out to be an egg. [image] The parameters I used usually give a rough shape in around 5k iterations, but your dataset didn't. Maybe it is because of the extremely small image size? Can you provide the original RGB images? I'll see if that does better. As for the foreground, you will have to add the file paths in the meta_data as well. These are the parameters I used:

--pipeline.model.sdf-field.geometric-init True
--pipeline.model.background-model mlp
--pipeline.model.sdf-field.use-grid-feature True
--pipeline.model.sdf-field.use-appearance-embedding True
--pipeline.model.sdf-field.inside-outside False
--pipeline.model.mono-normal-loss-mult 0.01
--pipeline.model.mono-depth-loss-mult 0.01
--pipeline.model.near-plane 0.05
--pipeline.model.far-plane 100
--pipeline.model.overwrite-near-far-plane True
--pipeline.model.sdf-field.bias 0.3
--pipeline.model.eikonal-loss-mult 0.01
--pipeline.model.sdf-field.hash-features-per-level 2
--pipeline.model.sdf-field.num-layers 1
--pipeline.model.sdf-field.position-encoding-max-degree 8
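Assembled into a single command, those flags would look like this; the method name `neus-facto` and the `--data` path are placeholders, so adjust both to your own setup:

```shell
# The flags from the comment above, assembled into one ns-train call.
# "neus-facto" and the --data path are placeholders for your setup.
ns-train neus-facto \
    --data /path/to/your/sdfstudio-data \
    --pipeline.model.sdf-field.geometric-init True \
    --pipeline.model.background-model mlp \
    --pipeline.model.sdf-field.use-grid-feature True \
    --pipeline.model.sdf-field.use-appearance-embedding True \
    --pipeline.model.sdf-field.inside-outside False \
    --pipeline.model.mono-normal-loss-mult 0.01 \
    --pipeline.model.mono-depth-loss-mult 0.01 \
    --pipeline.model.near-plane 0.05 \
    --pipeline.model.far-plane 100 \
    --pipeline.model.overwrite-near-far-plane True \
    --pipeline.model.sdf-field.bias 0.3 \
    --pipeline.model.eikonal-loss-mult 0.01 \
    --pipeline.model.sdf-field.hash-features-per-level 2 \
    --pipeline.model.sdf-field.num-layers 1 \
    --pipeline.model.sdf-field.position-encoding-max-degree 8
```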

Ar2000S avatar Feb 14 '24 15:02 Ar2000S

> Sorry for the (very) late reply. I tried to run mono-neus and also neus-facto on your dataset. The result, in short, came out to be an egg. [...]

@Ar2000S https://drive.google.com/drive/folders/1Q9zSM8sCsQ4n-6QKl4GEujuznx-abxvh Here are my battery images.

hanjoonwon avatar Feb 15 '24 13:02 hanjoonwon