Are there any suggestions for training a large scene-scale dataset with MCMC?
gsplat 1.4.0
python3 examples/simple_trainer.py mcmc --use_bilateral_grid --data_factor 1 --data_dir data/test123/ --result_dir exports/test123/
Part of the logs:
...
[Parser] 120 images, taken by 120 cameras.
Scene scale: 2009.6237303032315
Model initialized. Number of GS: 159431
...
loss=0.475| sh degree=0| : 100%|██▋| 299/30000 [00:25< ......
......
loss=0.185| sh degree=3| : 25%|██████████████████████▌ | 7600/30000 [13:15<39:05, 9.55it/s]
Traceback (most recent call last):
File "/home/ubuntu/gsplat-1.4.0/examples/simple_trainer.py", line 1120, in
It seems gsplat will crash, or produce a PLY file that is nothing but noise, if I change opacity_reg to 0.001 when the dataset's scene_scale is larger than 2000 or 10000. In all of these cases training runs with a high loss. I have three datasets from an AI tool that show this behavior; all of them go through COLMAP without any errors or warnings, and I have never hit this with a traditional COLMAP dataset. Thanks.
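For reference, the regularizer change was made on the command line like this (I'm assuming opacity_reg is exposed as a top-level flag of simple_trainer.py's mcmc config, as it was in the version I ran; adjust the flag spelling if your tyro CLI differs):

python3 examples/simple_trainer.py mcmc --opacity_reg 0.001 --use_bilateral_grid --data_factor 1 --data_dir data/test123/ --result_dir exports/test123/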
Currently the problem is that the loss doesn't converge; it plateaus at around 0.1 or 0.2. I have tried tuning noise_lr, opacity_reg, and scale_reg, and filtering out the zero-opacity ("dead") GS cases, etc.
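The dead-GS filtering was done roughly like this on a loaded checkpoint (a minimal sketch, assuming splats is the per-Gaussian parameter dict from simple_trainer.py with opacities stored as logits; the 0.005 threshold is just the value I happened to pick):

import torch

def drop_dead_gaussians(splats, min_opacity=0.005):
    # splats: dict of per-Gaussian tensors ("means", "scales", "quats",
    # "opacities", "sh0", "shN"), all indexed along dim 0.
    # "opacities" holds logits, so map through sigmoid before thresholding.
    alive = torch.sigmoid(splats["opacities"]) > min_opacity
    print(f"keeping {int(alive.sum())} / {alive.numel()} Gaussians")
    return {k: v[alive] for k, v in splats.items()}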
Here is one dataset that reproduces the situation above:
120 images, 120 cameras, 159431 points; source: colmap image_undistorter
Download (183 MB, Google Drive): https://drive.google.com/file/d/1pn8b74AAGobQI-Bxv6arLA-B6-l8IIWQ/view?usp=sharing