SPIRAL
[NeurIPS 2025] Spiral: Semantic-Aware Progressive LiDAR Scene Generation and Understanding
Dekai Zhu*
Yixuan Hu*
Youquan Liu
Dongyue Lu
Lingdong Kong
Slobodan Ilic
(* Equal Contribution)
NeurIPS 2025
Existing LiDAR generative models produce only unlabeled LiDAR scenes without semantic annotations. Labeling these generated scenes post hoc requires additional pretrained segmentation models, which introduces extra computational overhead; moreover, such after-the-fact annotation yields suboptimal segmentation quality.
To address this issue, we make the following contributions:
- We propose a novel state-of-the-art semantic-aware range-view LiDAR diffusion model, Spiral, which jointly produces depth and reflectance images along with semantic labels.
- We introduce unified evaluation metrics that comprehensively evaluate the geometric, physical, and semantic quality of generated labeled LiDAR scenes.
- We demonstrate the effectiveness of the generated LiDAR scenes for training segmentation models, highlighting Spiral's potential for generative data augmentation (see the back-projection sketch below).
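To make the data-augmentation idea concrete: a generated scene is a range image whose pixels carry depth, reflectance, and a semantic class, and it can be lifted back into a labeled point cloud to train a segmentation model. Below is a minimal back-projection sketch; the function name, the pixel-grid convention, and the +3°/−25° vertical field of view follow the usual SemanticKITTI (HDL-64E) setup and are illustrative assumptions, not Spiral's actual API.

```python
import numpy as np

def range_image_to_labeled_points(depth, labels, fov_up=3.0, fov_down=-25.0):
    """Back-project an (H, W) depth image and its per-pixel semantic
    labels into a labeled point cloud: (N, 3) xyz plus (N,) labels.
    FOV defaults (degrees) are illustrative HDL-64E-style assumptions."""
    H, W = depth.shape
    fov_up, fov_down = np.radians(fov_up), np.radians(fov_down)

    # Pixel grid -> spherical angles (inverse of the range projection).
    v, u = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    yaw = np.pi * (1.0 - 2.0 * u / W)             # azimuth in (-pi, pi]
    pitch = fov_up - (fov_up - fov_down) * v / H  # elevation, top row = fov_up

    valid = depth > 0                             # skip empty pixels
    d = depth[valid]
    x = d * np.cos(pitch[valid]) * np.cos(yaw[valid])
    y = d * np.cos(pitch[valid]) * np.sin(yaw[valid])
    z = d * np.sin(pitch[valid])
    return np.stack([x, y, z], axis=1), labels[valid]
```

The resulting (points, labels) pairs can then be mixed with real scans as additional training data for a point-based segmentation model.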
:books: Citation
If you find this work helpful for your research, please kindly consider citing our paper:
```bibtex
@inproceedings{zhu2025spiral,
    title     = {Spiral: Semantic-Aware Progressive LiDAR Scene Generation and Understanding},
    author    = {Zhu, Dekai and Hu, Yixuan and Liu, Youquan and Lu, Dongyue and Kong, Lingdong and Ilic, Slobodan},
    booktitle = {The Thirty-ninth Annual Conference on Neural Information Processing Systems},
    year      = {2025}
}
```
Updates
- [11/2025] - The code for Spiral is released. :rocket:
- [10/2025] - The project page is online. :rocket:
- [09/2025] - This work has been accepted to NeurIPS 2025.
:gear: Installation
To install the dependencies and set up the environment, run:
```shell
conda env create -f environment.yaml
conda activate spiral
```
If the conda solver hangs during installation, try mamba instead:
```shell
mamba env create -f environment.yaml
conda activate spiral
```
:hotsprings: Data Preparation
We use the official SemanticKITTI API to preprocess the data, projecting each LiDAR scan from Cartesian coordinates into a range image. You can download the preprocessed data here. :hugs:
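For reference, the projection works roughly as follows. This is a minimal sketch of the standard spherical range-view projection, not the repo's exact preprocessing code; the 64×1024 resolution and the +3°/−25° vertical field of view are common SemanticKITTI (HDL-64E) assumptions, not values confirmed by this repo.

```python
import numpy as np

def lidar_to_range_image(points, H=64, W=1024, fov_up=3.0, fov_down=-25.0):
    """Project an (N, 4) scan (x, y, z, reflectance) into an (H, W)
    range image; resolution and FOV defaults are illustrative."""
    fov_up, fov_down = np.radians(fov_up), np.radians(fov_down)
    fov = fov_up - fov_down

    depth = np.linalg.norm(points[:, :3], axis=1)
    yaw = np.arctan2(points[:, 1], points[:, 0])                  # azimuth
    pitch = np.arcsin(points[:, 2] / np.clip(depth, 1e-8, None))  # elevation

    # Angles -> pixel coordinates (column u, row v).
    u = np.clip((0.5 * (1.0 - yaw / np.pi) * W).astype(int), 0, W - 1)
    v = np.clip(((1.0 - (pitch - fov_down) / fov) * H).astype(int), 0, H - 1)

    # Write far-to-near so nearer returns win pixel collisions;
    # reflectance could be splatted into a second channel the same way.
    order = np.argsort(depth)[::-1]
    image = np.full((H, W), -1.0, dtype=np.float32)
    image[v[order], u[order]] = depth[order]
    return image
```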
:rocket: Getting Started
First, set `data_path` in `utils/option.py` to the directory of the preprocessed data (see the illustrative snippet below). Then run:
```shell
python train.py
```
to start training.
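For reference, the edit might look like the line below; the `data_path` name comes from the step above, but the surrounding layout of `utils/option.py` is an assumption.

```python
# utils/option.py -- illustrative snippet; the file's actual layout may differ
data_path = "/path/to/preprocessed_range_images"  # directory from Data Preparation
```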
Acknowledgements
This project is built upon the R2DM codebase.
