4d-gaussian-splatting
It seems that CUDA is not enabled
The program is strangely slow, and the computation seems to be done on the CPU rather than CUDA (as you can see in the image, CPU utilization is 2108%).
It's quite strange, because I checked that the parameters in GaussianModel are loaded on CUDA.
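For anyone else wanting to audit device placement, here is a minimal sketch. The parameter names and the `(name, device_string)` snapshot are assumptions for illustration; in PyTorch you would build such a list from `[(n, str(p.device)) for n, p in model.named_parameters()]`.

```python
def find_non_cuda_params(named_devices):
    """Return the names of parameters whose device is not CUDA.

    named_devices: iterable of (name, device_string) pairs, e.g. a
    snapshot taken from a PyTorch model's named_parameters().
    """
    return [name for name, dev in named_devices if not dev.startswith("cuda")]

# Hypothetical snapshot of a GaussianModel's parameters:
params = [("_xyz", "cuda:0"), ("_rotation", "cuda:0"), ("_opacity", "cpu")]
print(find_non_cuda_params(params))  # → ['_opacity']
```

Note that all parameters being on CUDA only rules out one failure mode; data tensors produced by the dataloader can still land on the CPU each iteration.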
The modified diff-gaussian-rasterization package is highly suspicious.
I'm 100% sure the modified diff-gaussian-rasterization is missing CUDA optimisations or has been overwritten; nothing here runs on CUDA other than the loading.
@jamesjjk Maybe I should compare the code with the original before the authors reply to the issue.
Thanks for your interest. It's impractical to run a CUDA kernel on the CPU. I suspect an I/O bottleneck is the culprit for your slow training. If you have sufficient memory, you can try disabling the dataloader.
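One quick way to test the I/O-bottleneck hypothesis is to preload the dataset into RAM once, so each training iteration does a cheap list index instead of a disk read. A sketch only, not the repo's actual code: `load_frame` and `frame_paths` are hypothetical stand-ins.

```python
import time

def load_frame(path):
    # Hypothetical stand-in for an expensive disk read / image decode.
    time.sleep(0.001)
    return path.upper()

def preload(frame_paths):
    """Read every frame once up front; afterwards each access is an
    in-memory lookup instead of a per-iteration disk round-trip."""
    return [load_frame(p) for p in frame_paths]

paths = [f"frame_{i:04d}.png" for i in range(5)]
cache = preload(paths)  # one-time loading cost
print(cache[3])         # → 'FRAME_0003.PNG'
```

If iteration speed jumps after preloading, the dataloader (disk I/O plus per-batch decoding) was the bottleneck rather than the CUDA rasterizer.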
Same here... training on the DyNeRF dataset is extremely slow (~1.12 it/s). Also, the loss backward being done per instance within a batch was confusing and slow...
Same problem: 1.3 it/s on an RTX 3090.
It seems to be a problem with KNN. If you disable the rigid loss, training will be faster.
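For context on why KNN inside a rigidity term can dominate runtime: a naive nearest-neighbour search compares every query against every point, so recomputing it each iteration over hundreds of thousands of Gaussians costs O(N²) distance evaluations. A pure-Python sketch of the brute-force version (illustrative only, not the repo's actual KNN):

```python
import math

def knn(points, query, k):
    """Brute-force k-nearest-neighbours: one distance per point, so a
    full self-KNN over N points needs O(N^2) distance computations."""
    dists = [(math.dist(p, query), i) for i, p in enumerate(points)]
    dists.sort()
    return [i for _, i in dists[:k]]

pts = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0), (0.1, 0.1)]
print(knn(pts, (0.0, 0.0), 2))  # → [0, 3]
```

Caching the neighbour indices (recomputing them only every few hundred iterations, since local neighbourhoods change slowly) is a common way to keep a rigid loss while avoiding the per-iteration KNN cost.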