Laplacian2Mesh: Laplacian-Based Mesh Understanding
Project | Paper
This repository is the official PyTorch implementation of our paper, Laplacian2Mesh: Laplacian-Based Mesh Understanding.

News
Requirements
- Python 3.7
- CUDA 11.3
- PyTorch 1.10.0
To install other python requirements:
pip install -r requirements.txt
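As an optional sanity check of the environment (this helper is not part of the repo), the snippet below verifies that the expected PyTorch and CUDA versions are visible:

```python
# check_env.py -- optional environment sanity check, not part of this repo
import torch

print("PyTorch:", torch.__version__)          # expected: 1.10.0
print("CUDA available:", torch.cuda.is_available())
print("CUDA version:", torch.version.cuda)    # expected: 11.3
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```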
Installation
Clone this repo:
git clone https://github.com/QiujieDong/Laplacian2Mesh.git
cd Laplacian2Mesh
Fetch Data
This repo provides training scripts for classification and segmentation on the following datasets:
- SHREC-11
- manifold40
- humanbody
- coseg_aliens
- coseg_chairs
- coseg_vases
To download the preprocessed data, run:
sh ./scripts/<DATASET_NAME>/get_data.sh
The coseg_aliens, coseg_chairs, and coseg_vases datasets are all downloaded via the coseg_aliens script. This repo uses the original Manifold40 dataset without re-meshing via Loop Subdivision.
Preprocessing
To generate the input features by preprocessing, run:
sh ./scripts/<DATASET_NAME>/prepaer_data.sh
Preprocessing only needs to be run once.
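For intuition, Laplacian2Mesh maps per-vertex features into the Laplacian eigenspace of the mesh. The sketch below illustrates that kind of spectral preprocessing in a minimal form; it is not the repo's preprocessing script, it uses a uniform (combinatorial) graph Laplacian instead of the cotangent Laplacian for brevity, and it assumes numpy and scipy are available.

```python
# Minimal sketch of spectral preprocessing (illustration only, not the repo's script).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def laplacian_eigenbasis(faces, num_vertices, k=64):
    """Return the k smallest eigenpairs of the combinatorial graph Laplacian L = D - A."""
    # Build the symmetric vertex adjacency from triangle edges.
    e = np.vstack([faces[:, [0, 1]], faces[:, [1, 2]], faces[:, [2, 0]]])
    rows = np.concatenate([e[:, 0], e[:, 1]])
    cols = np.concatenate([e[:, 1], e[:, 0]])
    A = sp.coo_matrix((np.ones(len(rows)), (rows, cols)),
                      shape=(num_vertices, num_vertices)).tocsr()
    A.data[:] = 1.0  # clamp duplicate edge entries to 1
    D = sp.diags(np.asarray(A.sum(axis=1)).ravel())
    L = (D - A).tocsc()
    # Shift-invert around a small negative value to get the low-frequency eigenbasis robustly.
    evals, evecs = spla.eigsh(L, k=k, sigma=-1e-6, which="LM")
    return evals, evecs

def to_spectral(features, evecs):
    """Project per-vertex features (V x C) onto the Laplacian eigenbasis (V x k) -> (k x C)."""
    return evecs.T @ features
```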
Training
To train the model on the datasets used in the paper, run:
sh ./scripts/<DATASET_NAME>/train.sh
The training process is time-consuming; you can refer to DiffusionNet for ideas on optimizing the code to speed up training.
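One concrete way to follow that suggestion, borrowed from DiffusionNet's operator-caching idea rather than something train.sh does out of the box, is to cache the expensive per-mesh eigendecomposition to disk so it is computed only once across epochs and runs. A hedged sketch, reusing the hypothetical laplacian_eigenbasis helper from the preprocessing example above:

```python
# Hypothetical on-disk cache for precomputed spectral operators (not part of this repo).
import os
import hashlib
import numpy as np

def cached_eigenbasis(faces, num_vertices, k=64, cache_dir="op_cache"):
    """Load the eigenbasis from disk if present; otherwise compute and store it."""
    os.makedirs(cache_dir, exist_ok=True)
    key = hashlib.sha1(faces.tobytes() + str(k).encode()).hexdigest()[:16]
    path = os.path.join(cache_dir, f"{key}_k{k}.npz")
    if os.path.exists(path):
        data = np.load(path)
        return data["evals"], data["evecs"]
    evals, evecs = laplacian_eigenbasis(faces, num_vertices, k=k)  # from the sketch above
    np.savez_compressed(path, evals=evals, evecs=evecs)
    return evals, evecs
```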
Evaluation
To evaluate the model on a dataset, run:
sh ./scripts/<DATASET_NAME>/test.sh
Visualize
After testing the segmentation network, the colored shapes are saved to the visualization_result directory.
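If you want to produce a similar colored mesh yourself, the sketch below shows one way to bake per-face segmentation labels into a colored PLY file. It assumes the trimesh package and a hypothetical face_labels array; it is not the repo's own visualization code.

```python
# Hedged sketch: export a segmentation result as a per-face colored mesh (assumes trimesh).
import os
import numpy as np
import trimesh

def export_colored_mesh(vertices, faces, face_labels, out_path="visualization_result/seg.ply"):
    """Color each face by its predicted label and write a PLY file."""
    palette = np.array([
        [228,  26,  28, 255], [ 55, 126, 184, 255], [ 77, 175,  74, 255],
        [152,  78, 163, 255], [255, 127,   0, 255], [255, 255,  51, 255],
    ], dtype=np.uint8)
    os.makedirs(os.path.dirname(out_path) or ".", exist_ok=True)
    mesh = trimesh.Trimesh(vertices=vertices, faces=faces, process=False)
    mesh.visual.face_colors = palette[face_labels % len(palette)]
    mesh.export(out_path)
```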
Cite
If you find our work useful for your research, please consider citing our paper :)
@article{dong2023laplacian2mesh,
  title={Laplacian2mesh: Laplacian-based mesh understanding},
  author={Dong, Qiujie and Wang, Zixiong and Li, Manyi and Gao, Junjie and Chen, Shuangmin and Shu, Zhenyu and Xin, Shiqing and Tu, Changhe and Wang, Wenping},
  journal={IEEE Transactions on Visualization and Computer Graphics},
  year={2023},
  publisher={IEEE}
}