Training Error in Dynamic Splitting When split_mask Has Zero Available Data
During training on custom data with the default onthego parameters, split_mask.sum().item() gradually decreases to zero. Once no splittable Gaussians remain, split_gaussians_dyn raises a size-mismatch error. How should the zero-mask case be handled so that training does not crash?
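One common workaround for this class of bug is an early-return guard before the split is attempted. The sketch below is not DeSplat's actual implementation: the function name `split_gaussians_dyn` comes from the question, but the body is a minimal NumPy stand-in (real Gaussian splitting samples new positions from each Gaussian's covariance) showing only the guard for an all-false `split_mask`.

```python
import numpy as np

def split_gaussians_dyn(means: np.ndarray, split_mask: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for DeSplat's split_gaussians_dyn.

    Guard: if split_mask selects zero Gaussians, return an empty
    array with the right trailing dimension instead of proceeding,
    which avoids the downstream size-mismatch error described above.
    """
    n_split = int(split_mask.sum())
    if n_split == 0:
        # Nothing to split this step; skip the split entirely.
        return np.empty((0, means.shape[1]), dtype=means.dtype)
    # Toy "split": duplicate each selected Gaussian. The real method
    # would also perturb positions and shrink the scales.
    selected = means[split_mask]
    return np.repeat(selected, 2, axis=0)

means = np.zeros((4, 3), dtype=np.float32)
empty_mask = np.zeros(4, dtype=bool)
print(split_gaussians_dyn(means, empty_mask).shape)  # (0, 3)
```

The caller can then concatenate the (possibly empty) split output with the surviving Gaussians without any special casing, since an empty array concatenates cleanly.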
As a workaround, I set split_screen_size: float = 0.04 (default: 0.05) in DeSplatModelConfig to avoid the error. However, memory usage then unexpectedly exceeded 80 GB. This is perplexing because I'm training on a modified DTU dataset (distractors added, images downsampled to 1/2 resolution), which should not come close to causing an Out-Of-Memory (OOM) condition.