Zhuang Intelligent Processing Lab

12 repositories owned by Zhuang Intelligent Processing Lab

LITv2
215 stars · 12 forks

[NeurIPS 2022 Spotlight] This is the official PyTorch implementation of "Fast Vision Transformers with HiLo Attention"

LIT
86 stars · 10 forks

[AAAI 2022] This is the official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"

EcoFormer
66 stars · 1 fork

[NeurIPS 2022 Spotlight] This is the official PyTorch implementation of "EcoFormer: Energy-Saving Attention with Linear Complexity"

FASeg
54 stars · 2 forks

[CVPR 2023] This is the official PyTorch implementation for "Dynamic Focus-aware Positional Queries for Semantic Segmentation".

HVT
30 stars · 5 forks

[ICCV 2021] Official implementation of "Scalable Vision Transformers with Hierarchical Pooling"

Mesa
103 stars · 7 forks

This is the official PyTorch implementation for "Mesa: A Memory-saving Training Framework for Transformers".

QTool
51 stars · 14 forks

A collection of model quantization algorithms. For any issues, please contact Peng Chen ([email protected]).

SAQ
31 stars · 4 forks

This is the official PyTorch implementation for "Sharpness-aware Quantization for Deep Neural Networks".

SPViT
101 stars · 14 forks

[TPAMI 2024] This is the official repository for our paper "Pruning Self-attentions into Convolutional Layers in Single Path".

SPT
61 stars · 2 forks

[ICCV 2023 Oral] This is the official repository for our paper "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning".