Mesh export
The CLI interface looks like the following:
For TSDF
ns-export tsdf --load-config config.yml --output-dir exports
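For readers unfamiliar with TSDF fusion: the idea is to project every voxel center of a grid into each rendered depth map, compute a truncated signed distance to the observed surface, and keep a running average per voxel; a mesh is then extracted from the zero level set (e.g. with marching cubes). Below is a minimal NumPy sketch of one fusion step under simplified assumptions (single depth map, nearest-pixel lookup); it is illustrative only, not nerfstudio's implementation, and all names are made up:

```python
import numpy as np

def fuse_depth(tsdf, weight, depth, K, w2c, origin, voxel_size, trunc):
    """Integrate one depth map into a TSDF voxel grid (simplified sketch).

    tsdf / weight: (X, Y, Z) arrays, tsdf initialized to 1.0, weight to 0.
    depth: (H, W) depth image; K: 3x3 intrinsics; w2c: 4x4 world-to-camera.
    """
    X, Y, Z = tsdf.shape
    ii, jj, kk = np.meshgrid(np.arange(X), np.arange(Y), np.arange(Z), indexing="ij")
    # World-space center of every voxel.
    centers = origin + (np.stack([ii, jj, kk], axis=-1) + 0.5) * voxel_size
    pts = centers.reshape(-1, 3)
    cam = (w2c[:3, :3] @ pts.T + w2c[:3, 3:4]).T  # points in the camera frame
    z = cam[:, 2]
    z_safe = np.where(z > 0, z, 1.0)              # avoid divide-by-zero behind camera
    pix = (K @ cam.T).T
    u = np.round(pix[:, 0] / z_safe).astype(int)
    v = np.round(pix[:, 1] / z_safe).astype(int)
    H, W = depth.shape
    valid = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]          # nearest-pixel depth lookup
    sdf = d - z                                   # positive in front of the surface
    keep = valid & (d > 0) & (sdf > -trunc)       # drop voxels far behind the surface
    new = np.clip(sdf / trunc, -1.0, 1.0)
    t, w = tsdf.reshape(-1).copy(), weight.reshape(-1).copy()
    t[keep] = (t[keep] * w[keep] + new[keep]) / (w[keep] + 1)  # running average
    w[keep] += 1
    return t.reshape(tsdf.shape), w.reshape(tsdf.shape)
```

Calling this once per training view accumulates a fused grid whose sign changes cross the reconstructed surface.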
Texturing from a mesh file
python scripts/texture.py --load-config config.yml --input-ply-filename filename.ply --output-dir exports
This is what the UV texture image looks like. Each mesh face has a unique location.

How do we do it? We first construct the UV texture image. At the center of every pixel in the texture image, we find the corresponding XYZ position and normal vector with barycentric interpolation. Then we render a short ray from the NeRF to set the RGB value.
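The barycentric step above can be sketched as follows: given a pixel center inside a UV triangle, solve for barycentric weights and use them to blend the vertex positions and normals. This is a minimal NumPy illustration (not the actual nerfstudio code; the function name and argument layout are made up):

```python
import numpy as np

def barycentric_sample(p, tri_uv, tri_xyz, tri_normals):
    """Interpolate XYZ position and normal at 2D point p inside a UV triangle.

    tri_uv:      (3, 2) UV coordinates of the triangle's vertices
    tri_xyz:     (3, 3) world-space positions of the vertices
    tri_normals: (3, 3) per-vertex normals
    """
    a, b, c = tri_uv
    # Solve p = a + u*(b - a) + v*(c - a) for barycentric weights (1-u-v, u, v).
    # (Assumes a non-degenerate triangle, so the 2x2 system is solvable.)
    m = np.column_stack([b - a, c - a])
    u, v = np.linalg.solve(m, p - a)
    w = np.array([1.0 - u - v, u, v])
    xyz = w @ tri_xyz                     # blended world position
    normal = w @ tri_normals
    normal /= np.linalg.norm(normal)      # renormalize the blended normal
    return xyz, normal
```

Each returned (xyz, normal) pair then seeds a short ray query into the NeRF to produce the RGB value stored at that texture pixel.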
Example with the plane scene, with a specified bounding box
ns-export tsdf --load-config outputs/data-nerfstudio-plane/nerfacto/2022-11-18_124018/config.yml --output-dir exports/plane --downscale_factor 8 --resolution 128 --use-bounding-box True --bounding-box-min -0.5 -0.55 -0.25 --bounding-box-max 0.5 0.55 0.15 --px_per_uv_triangle 8
Oh, is the aim of this PR to create the mesh from the input data? Not from the NeRF density field itself, as done here https://github.com/ashawkey/torch-ngp/blob/4ae16168f69b005bf4c375ff424ef9d81bc397d7/nerf/utils.py#L571?
The goal is to extract a mesh from an existing NeRF model. We will probably support different methods of extraction (such as from density fields). However, our first method will extract a point cloud and convert it to a mesh; we have found that this often leads to better results.
I see. I am very interested in mesh from NeRF model. I think mesh from density field is kinda hard, and doesn't look that great most of the time.
I've seen other papers try other representations which seem to be better for surfaces (e.g. SDF? output alphas instead of density?) and I'd love to try those out in nerfstudio to see how they are.
Maybe even textured mesh? 👀
Nice work!
Are you aware that the NVDiffRast license would taint the nerfstudio license, and make it usable commercially only by NVidia (at least that's what I understand from 3.3, since nerfstudio becomes a derivative work). How easy would it be to switch to PyTorch3D (BSD), or even neural_renderer (MIT)?
@devernay thanks for the comments! We agree with you. I've removed nvdiffrast from the repo and am using PyTorch instead without additional libraries. It's a bit slower but I had to do this to resolve some edge artifacts that nvdiffrast had when sampling texture coordinates not contained within triangle faces.
FYI - with https://github.com/nerfstudio-project/nerfstudio/pull/809/commits/eca7027daf0cab8fc4bd14ab37b018169835d3a1 I am seeing errors trying to load prior checkpoints trained with 0.1.11 into viewer with ns-train... but ns-export commands do work.
RuntimeError: Error(s) in loading state_dict for VanillaPipeline:
Missing key(s) in state_dict: "_model.field.mlp_pred_normals.params", "_model.field.field_head_pred_normals.net.weight", "_model.field.field_head_pred_normals.net.bias".
Looks like turning on predict-normals by default caused this;
older checkpoints can be loaded by specifying --pipeline.model.predict-normals False
(I also found https://github.com/nerfstudio-project/nerfstudio/pull/1057 necessary, but I think that is an attribute of latest main, and not the changes in this PR.)
Not sure what expectations are generally for backward compatibility in this project... if expected, probably want to fix; but if not, then messaging to add that parameter should suffice.
@machenmusik Good catch. We need to predict normals for poisson meshing to work well. We're still debating whether to default to predicting normals or not going forward. TSDF meshing will work without predicting normals.
@ethanweber thanks. I would note that I tried with a nerfacto trained checkpoint, so it seems that poisson will only work for nerfacto when predict-normals was true during training.
So if the idea is to default to tsdf, the issue will only occur when explicitly trying the combination that won't work; in that case it should be fine to just have a slightly more helpful error message (e.g. "this checkpoint wasn't trained with predict-normals, so use something else like tsdf instead").
(It appears that changing the predict-normals default to true also means that older nerfacto output checkpoints need to be loaded with --pipeline.model.predict-normals False -- even if one isn't going to export with poisson -- so it might be better to revert that change for backward compatibility IMO.)
I agree with your comments @machenmusik and will make the changes to have better defaults and documentation. Thanks for the thoughts!
Thanks @ethanweber! Upon further reflection, it seems needlessly cumbersome to force the user to specify the correct predict-normals setting when there's only one way it will work for ns-train checkpoint loading (and it's not obvious which it should be), so I created issue #1063 in case that behavior is easy to improve.