
Generation without texture

Open Justinfungi opened this issue 1 year ago • 10 comments

Description

https://user-images.githubusercontent.com/79019929/232335401-4b67e31e-7d7b-4423-abba-85c55be55246.mp4

https://user-images.githubusercontent.com/79019929/232335427-41c5d2b2-1bf3-4f70-85c4-a65a9a9b8297.mp4

Steps to Reproduce

Here are some of my generated results. The steps were:

  1. !python main.py --text "a hamburger" --workspace trial -O --vram_O
  2. !python main.py --workspace trial -O --test
  3. !python main.py --workspace trial -O --test --save_mesh

Expected Behavior

But there is not much texture or detail in the result. How can I fix this and generate a higher-quality result?

Environment

Ubuntu 22.04, NVIDIA GPU

Justinfungi avatar Apr 16 '23 18:04 Justinfungi

I would like to ask: what is the difference between fine-tuning with DMTet and just simply training and testing?

Justinfungi avatar Apr 16 '23 19:04 Justinfungi

It's quite normal for the result to look like this, because the SDS training loss is computed on a low-resolution NeRF output (usually 64×64). By contrast, DMTet is a more memory-efficient way to represent a 3D scene than NeRF, so it allows training at a much higher resolution. A detailed explanation can be found in the Magic3D paper.
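To make the SDS part concrete, here is a minimal NumPy sketch of the score-distillation gradient. This is illustrative only: the real implementation works on Stable Diffusion latents with a timestep-dependent weight, and the shapes and values here are made up.

```python
import numpy as np

def sds_gradient(noise_pred, noise, weight):
    """Score Distillation Sampling skips the diffusion model's Jacobian:
    the gradient pushed back to the renderer is just w(t) * (eps_hat - eps)."""
    return weight * (noise_pred - noise)

# A 64x64 array stands in for the low-resolution NeRF render.
rng = np.random.default_rng(0)
noise = rng.standard_normal((64, 64))
noise_pred = noise + 0.1  # pretend the model nudges the render toward the prompt
grad = sds_gradient(noise_pred, noise, weight=0.5)

# The gradient lives at render resolution (64x64 here), which is why
# fine texture detail is hard to get in the coarse NeRF stage.
print(grad.shape)
```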

zeng-yifei avatar Apr 17 '23 01:04 zeng-yifei

So does that mean we need to use DMTet to fine-tune the result after the SDS training? Or are they parallel approaches?

Justinfungi avatar Apr 17 '23 10:04 Justinfungi

Yep, it's a two-stage (coarse-to-fine) procedure rather than a parallel one. If you want to train a DMTet from scratch, I recommend checking out the Fantasia3D paper. However, its code has not been released yet, so the paper can only serve as a theoretical reference.
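The two-stage workflow then looks roughly like the following sketch, built from the commands used earlier in this thread (workspace names and the iteration count are illustrative and may need tuning):

```shell
# Stage 1 (coarse): optimize a NeRF with the SDS loss at low resolution
python main.py -O --text "a hamburger" --workspace trial

# Stage 2 (fine): switch to DMTet and fine-tune from the stage-1 checkpoint
python main.py -O --text "a hamburger" --workspace trial_dmtet \
    --dmtet --iters 500 \
    --init_ckpt trial/checkpoints/df.pth

# Export the final textured mesh
python main.py -O --workspace trial_dmtet --test --save_mesh
```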

zeng-yifei avatar Apr 17 '23 11:04 zeng-yifei

It seems DMTet fails when I feed the training result to it. Code:

  !python main.py -O --text "a rex rabbit with cute tail" \
      --workspace results/rexrabbit_HQ_1_DMTet --dmtet --iters 500 \
      --init_ckpt results/rexrabbit_HQ_1/checkpoints/df_ep0100.pth

(two screenshots of the generated result attached)

I don't quite understand why a ball is generated instead of the rabbit.

Justinfungi avatar Apr 17 '23 17:04 Justinfungi

Good question. I also often fail when using DMTet lol; the fine-tuning doesn't seem stable enough. I'm also wondering why this happens. In my case, the result eventually loses its color. I found that the coloring part still uses the NeRF result; maybe changing the color part to DMTet by utilizing vertex colors would help. But I seldom lose the shape when using DMTet. That's really strange. Maybe we're using it in the wrong way lmao

zeng-yifei avatar Apr 18 '23 04:04 zeng-yifei

LMAO, what is your command line for calling DMTet? I wonder if my epoch count is too small, but I did follow the guideline. Are you using the latest epoch of the training as the init ckpt for DMTet?

Maybe we need to see if the author has any idea @ashawkey

Justinfungi avatar Apr 18 '23 09:04 Justinfungi

Maybe it's related to randomness or the prompt. You can try different seeds, or the examples under scripts (like https://github.com/ashawkey/stable-dreamfusion/blob/main/scripts/run2.sh) to check whether these prompts can generate good shapes.
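For example, sweeping a few seeds might look like the sketch below. This assumes the repo exposes a `--seed` flag (check `python main.py --help`); the workspace naming is just for illustration.

```shell
# Try the same prompt with a few different seeds and keep the best run
for seed in 0 1 2; do
    python main.py -O --text "a rex rabbit with cute tail" \
        --workspace results/rexrabbit_seed${seed} --seed ${seed}
done
```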

ashawkey avatar Apr 18 '23 13:04 ashawkey

OK, I will try it. Are you using the initial ckpt instead of the epoch-100 ckpt? I saw that your script uses the init ckpt.

Justinfungi avatar Apr 19 '23 01:04 Justinfungi

df.pth is always the last checkpoint we saved, so I guess it's the same.
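So, taking the paths from the command earlier in this thread, these two `--init_ckpt` choices should load the same weights when epoch 100 was the last one trained (a sketch, not a verified equivalence):

```shell
# df.pth is a copy of the latest epoch checkpoint, so with 100 trained
# epochs this should match initializing from df_ep0100.pth:
python main.py -O --text "a rex rabbit with cute tail" \
    --workspace results/rexrabbit_HQ_1_DMTet --dmtet --iters 500 \
    --init_ckpt results/rexrabbit_HQ_1/checkpoints/df.pth
```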

ashawkey avatar Apr 19 '23 13:04 ashawkey