
sparsity_loss = entropy

Open wtj-zhong opened this issue 1 year ago • 4 comments

When I run this code with the synthetic Lego dataset, it works fine. But when I run it with the LLFF dataset, I encounter the following issue:

```
python run_nerf.py --config configs/fren.txt --finest_res 512 --log2_hashmap_size 19 --lrate 0.01 --lrate_decay 10
0 0.0010004043579101562 [00:00<?, ?it/s] [1/1] [99%]
c:\users\nezo\desktop\3d\hashnerf-pytorch\run_nerf.py(379)raw2outputs()
-> sparsity_loss = entropy
```

Could you please tell me the reason for this issue?

wtj-zhong avatar Aug 27 '23 11:08 wtj-zhong

I had the same problem. I'm not sure why, but I retrained without changing anything and it worked. It may be a problem with the random initialization and optimization of the neural network parameters.

THUROI0787 avatar Sep 19 '23 11:09 THUROI0787

I also encountered the same problem on a self-made dataset. Could something be wrong with the data?

JishuaiZhang avatar Nov 17 '23 14:11 JishuaiZhang

I think the problem is caused by `weights = (x - voxel_min_vertex)/(voxel_max_vertex - voxel_min_vertex)` in the hash encoding. You can change it to `weights = (x - voxel_min_vertex)/(voxel_max_vertex - voxel_min_vertex + 1e-6)` and try it again!
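A minimal sketch of why the epsilon helps (variable names are illustrative, not the exact ones in `hash_encoding.py`): if a voxel ever has zero extent along some axis, the plain division produces `0/0 = NaN`, which then propagates through the rendering weights and surfaces later at the `sparsity_loss = entropy` line.

```python
import torch

def interp_weights(x, voxel_min_vertex, voxel_max_vertex, eps=1e-6):
    # Adding a small epsilon to the denominator keeps the trilinear
    # interpolation weights finite even when max == min along an axis.
    return (x - voxel_min_vertex) / (voxel_max_vertex - voxel_min_vertex + eps)

x = torch.tensor([0.5, 0.5, 0.5])
vmin = torch.tensor([0.0, 0.5, 0.0])
vmax = torch.tensor([1.0, 0.5, 1.0])  # degenerate: zero extent along axis 1

bad = (x - vmin) / (vmax - vmin)      # 0/0 along axis 1 -> NaN
good = interp_weights(x, vmin, vmax)  # finite everywhere
```

You can also run the training with `torch.autograd.set_detect_anomaly(True)` to find the first operation that produces a NaN gradient.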

Fjzd avatar Nov 17 '23 16:11 Fjzd

May I ask everyone: why do we add this loss function, and what is its purpose?

wuzuyin avatar Nov 23 '23 02:11 wuzuyin