Angular-Skeleton-Encoding
Source code for the TNNLS 2022 paper: Fusing Higher-Order Features in Graph Neural Networks for Skeleton-Based Action Recognition
Angular Encoding for Skeleton-Based Action Recognition
Overview
PyTorch implementation of the TNNLS 2022 paper "Fusing Higher-Order Features in Graph Neural Networks for Skeleton-Based Action Recognition" (https://arxiv.org/pdf/2105.01563.pdf).
Angular Features
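The core idea is to augment each joint's coordinates with higher-order angular features, for example the cosine of the angle formed at a joint by its two adjacent bones. Below is a minimal, illustrative sketch of one such feature; the tensor layout (N, C, T, V, M) and the joint indices are assumptions for the example, not the repository's exact definitions.

```python
# Sketch of an angular feature: the cosine of the angle formed at a joint by
# its two adjacent bones. Layout (N, C, T, V, M) = (batch, xyz, frames,
# joints, persons) follows the common NTU convention and is an assumption,
# as are the example joint indices.
import torch

def joint_angle_cosine(x: torch.Tensor, center: int, a: int, b: int,
                       eps: float = 1e-6) -> torch.Tensor:
    """Return cos(angle) at joint `center` between bones center->a and center->b.

    x: skeleton tensor of shape (N, C=3, T, V, M).
    Output shape: (N, T, M), one scalar feature per frame and person.
    """
    v1 = x[:, :, :, a, :] - x[:, :, :, center, :]   # bone vector center -> a
    v2 = x[:, :, :, b, :] - x[:, :, :, center, :]   # bone vector center -> b
    dot = (v1 * v2).sum(dim=1)                      # dot product over xyz
    norm = v1.norm(dim=1) * v2.norm(dim=1)
    return dot / norm.clamp(min=eps)                # cosine of the joint angle

# Example: angle at an elbow joint (indices are placeholders, not the
# repository's joint numbering).
x = torch.randn(2, 3, 64, 25, 2)                    # (N, C, T, V, M)
elbow_angle = joint_angle_cosine(x, center=10, a=9, b=11)
print(elbow_angle.shape)                            # torch.Size([2, 64, 2])
```

Stacking several such cosines alongside the raw joint coordinates is one way features like these can be fused as extra input channels.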

Network Architecture

Dependencies
- Python >= 3.6
- PyTorch >= 1.2.0
- NVIDIA Apex (automatic mixed precision training; see the usage sketch after this list)
- PyYAML, tqdm, tensorboardX, matplotlib, seaborn
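Since Apex is listed above for automatic mixed precision, here is a minimal sketch of the standard apex.amp pattern; the model, optimizer, and opt level are placeholders, and this is not necessarily how the repository's training script wires Apex in.

```python
# Sketch of NVIDIA Apex automatic mixed precision in a generic training step.
import torch
from apex import amp

model = torch.nn.Linear(256, 60).cuda()             # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Opt level "O1" casts common ops to fp16 while keeping fp32 master weights.
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

x = torch.randn(8, 256).cuda()
y = torch.randint(0, 60, (8,)).cuda()
loss = torch.nn.functional.cross_entropy(model(x), y)

# Scale the loss so fp16 gradients do not underflow, then step as usual.
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()
```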
Data Preparation
Download Datasets
There are 2 datasets to download:
- NTU RGB+D 60 Skeleton
- NTU RGB+D 120 Skeleton
Request the datasets here: http://rose1.ntu.edu.sg/Datasets/actionRecognition.asp
Data Preprocessing
Directory Structure
Put downloaded data into the following directory structure:
- data/
  - nturgbd_raw/
    - nturgb+d_skeletons/ # from `nturgbd_skeletons_s001_to_s017.zip`
    - ...
    - nturgb+d_skeletons120/ # from `nturgbd_skeletons_s018_to_s032.zip`
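To confirm the layout above is in place before generating data, a quick sanity check like the following can help; the paths mirror the directory listing above.

```python
# Sketch of a sanity check that the raw skeleton folders sit where the
# generation scripts expect them; paths mirror the layout above.
from pathlib import Path

root = Path("data/nturgbd_raw")
for folder in ["nturgb+d_skeletons", "nturgb+d_skeletons120"]:
    path = root / folder
    n = len(list(path.glob("*.skeleton"))) if path.is_dir() else 0
    print(f"{folder}: {'found' if n else 'MISSING'} ({n} .skeleton files)")
```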
Generating Data
- `cd data_gen`
- `python3 ntu_gendata.py`
- `python3 ntu120_gendata.py`
- This can take hours; better CPUs make it much faster. A sketch for inspecting the generated output follows below.
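After generation, the processed arrays can be inspected roughly as follows; the split folder and file names ("xsub", "train_data_joint.npy", "train_label.pkl") follow common NTU preprocessing layouts and may differ from what the scripts here actually write.

```python
# Sketch of loading the generated data; file names are assumptions based on
# common NTU preprocessing layouts, not guaranteed outputs of this repo.
import pickle
import numpy as np

data = np.load("data/ntu/xsub/train_data_joint.npy", mmap_mode="r")
with open("data/ntu/xsub/train_label.pkl", "rb") as f:
    sample_names, labels = pickle.load(f)

# Skeleton tensors are typically (N, C=3, T, V=25, M=2):
# samples, xyz channels, padded frames, joints, bodies.
print(data.shape, len(labels))
```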
Training
`bash train.sh`
Testing
`bash test.sh`
Acknowledgements
This repo is based on
Thanks to the original authors for their work!
The flat icon is from Freepik.
Citation
Please cite this work if you find it useful:
@article{DBLP:journals/corr/abs-2105-01563,
  author  = {Zhenyue Qin and Yang Liu and Pan Ji and Dongwoo Kim and Lei Wang and Bob McKay and Saeed Anwar and Tom Gedeon},
  title   = {Fusing Higher-Order Features in Graph Neural Networks for Skeleton-based Action Recognition},
  journal = {IEEE Transactions on Neural Networks and Learning Systems (TNNLS)},
  year    = {2022}
}
Contact
If you have further questions, please email [email protected] or [email protected].