nerfstudio
[WIP] dreamfusion implementation
@terrancewang I'd try to structure this implementation as a Pipeline since the Model (with Fields) should be largely agnostic to DreamFusion. We should be able to swap out Models that have normals implemented.
@ethanweber I think the larger issue is the loss function: the implementation needs more than the standard .backward() call, since DreamFusion computes the gradients directly instead of defining a real "loss". The stable-dreamfusion code's https://github.com/ashawkey/stable-dreamfusion/blob/15c2f105a6fd659cc1bc72c13e2168d74678753c/nerf/sd.py#L116 has a custom backward function because the DreamFusion loss basically ignores the gradient through the U-Net. We can hack around this by doing the backward call / DreamFusion gradient computation in the pipeline's get_train_loss_dict() and having the trainer class still step the optimizer for the radiance field. Does doing it that way sit ok with you, or do you have any ideas on how this could fit more cleanly into the current system?
Maybe we could specify generative vs. non-generative pipelines and generative pipelines could do custom backward steps?
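One way to keep the trainer loop unchanged is the surrogate-loss trick: instead of calling backward with an explicit gradient tensor, get_train_loss_dict() returns a scalar whose autograd gradient equals the hand-computed SDS gradient. A minimal PyTorch sketch (the function name is illustrative, not nerfstudio API):

```python
import torch

def sds_surrogate_loss(latents: torch.Tensor, grad: torch.Tensor) -> torch.Tensor:
    """Scalar whose autograd gradient w.r.t. `latents` is exactly `grad`.

    d/d(latents) of sum(grad * latents) == grad, so a plain .backward() on the
    returned value injects the hand-computed SDS gradient. `grad.detach()`
    keeps the U-Net out of the autograd graph, matching DreamFusion's
    "ignore the gradient through the U-Net" behavior.
    """
    return (grad.detach() * latents).sum()

# Toy check: pretend `grad` is (noise_pred - noise) from the diffusion model.
latents = torch.randn(2, 4, requires_grad=True)
grad = torch.full_like(latents, 0.5)
sds_surrogate_loss(latents, grad).backward()
assert torch.allclose(latents.grad, grad)
```

Because the returned value is an ordinary scalar, it could live in a standard loss dict and the trainer would step the radiance-field optimizer without any custom backward hook.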
Not sure if expected, but I got a warning about UNet telling me to run scripts\trace_stablediff.py
-- eventually got that to work and then saw "Loading traced UNet".
-- but I wonder if that should become part of the install process.
That's expected. It will eventually be part of the install instructions.
ok np.
Suggestion: given that the iteration times are 1 to 2 orders of magnitude longer, it may be worth changing the default checkpoint interval for dreamfusion from 2000 to something much smaller.
(note: with 16GB VRAM, still getting occasional CUDA OOM, even after training is complete and only viewing.)
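In the meantime, assuming the trainer's steps_per_save option is exposed on the CLI as --steps-per-save (an assumption about the current flag name, not confirmed here), the interval can be overridden per run without changing the default:

```shell
# Hypothetical override: checkpoint every 200 steps instead of the default 2000
ns-train dreamfusion --steps-per-save 200
```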
Hi @tancik,
How is the project going? Will I be able to use it out of the box?
I get the following warning when running other models like nerfacto (not observed in main)
ns-train nerfacto --data data/nerfstudio/egg
/home/tancik/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_fields.py:794: UserWarning: Mutable type <class 'nerfstudio.engine.optimizers.AdamOptimizerConfig'> is used as a default value for `optimizer`. This is dangerous! Consider using `dataclasses.field(default_factory=...)` or marking <class 'nerfstudio.engine.optimizers.AdamOptimizerConfig'> as frozen.
warnings.warn(
/home/tancik/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_fields.py:794: UserWarning: Mutable type <class 'nerfstudio.engine.schedulers.ExponentialDecaySchedulerConfig'> is used as a default value for `scheduler`. This is dangerous! Consider using `dataclasses.field(default_factory=...)` or marking <class 'nerfstudio.engine.schedulers.ExponentialDecaySchedulerConfig'> as frozen.
warnings.warn(
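The tyro warning is about Python's shared-default pitfall: a config instance used directly as a dataclass field default is one object shared by every config that is constructed. The fix it suggests, default_factory, builds a fresh instance per object. A toy illustration (class names here are made up, not the actual nerfstudio configs):

```python
from dataclasses import dataclass, field

@dataclass
class OptimizerConfig:            # hypothetical stand-in for AdamOptimizerConfig
    lr: float = 1e-3

@dataclass
class ModelConfig:
    # The warned-about pattern would be `optimizer: OptimizerConfig = OptimizerConfig()`,
    # where a single shared instance becomes the default for every ModelConfig.
    # default_factory constructs a fresh OptimizerConfig for each instance:
    optimizer: OptimizerConfig = field(default_factory=OptimizerConfig)

a, b = ModelConfig(), ModelConfig()
a.optimizer.lr = 1e-2
assert b.optimizer.lr == 1e-3          # b has its own config, unaffected by a
assert a.optimizer is not b.optimizer  # distinct objects, no shared state
```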
example results from generfacto:
a high quality zoomed out photo of a palm tree
https://github.com/nerfstudio-project/nerfstudio/assets/19509183/05ffebce-a3d6-43af-9f11-e04ce2ce3237
a high quality photo of a ripe pineapple
https://github.com/nerfstudio-project/nerfstudio/assets/19509183/407ff7c8-7106-4835-acf3-c2f8188bbd1d
a high quality zoomed out photo of a light grey baby shark
https://github.com/nerfstudio-project/nerfstudio/assets/19509183/b1f5b7c5-dd96-48b4-8db0-960632e7798b
@terrancewang could we have the examples above in the nerfstudio documentation?
yup, I'll add these vids to the docs soon!