PLSC
Paddle Large Scale Classification Tools, supporting ArcFace, CosFace, PartialFC, and data parallel + model parallel training. Models include ResNet, ViT, Swin, DeiT, CaiT, FaceViT, MoCo, MAE, ConvMAE, and CAE.
Introduction
PLSC is an open source repo for a collection of Paddle Large Scale Classification Tools, which supports large-scale classification model pre-training as well as finetune for downstream tasks.
Top News 🔥
Update (2022-07-18): PLSC v2.3 is released: a major upgrade that is more modular and highly extensible, and supports more tasks such as ViT and DeiT. Static graph mode is no longer maintained as of this release.
Update (2022-01-11): Supported the NHWC data format for FP16 training, improving throughput by 10% and reducing GPU memory usage by 30%. PLSC now supports 92 million classes on a single node with 8 NVIDIA V100 (32G) GPUs at high training throughput, and supports saving the best checkpoint. We also released 18 pretrained models and PLSC v2.2.
Update (2021-12-11): Released a Zhihu technical article and a Bilibili open class.
Update (2021-10-10): Added FP16 training, improved throughput, and optimized GPU memory usage. PLSC supports 60 million classes on a single node with 8 NVIDIA V100 (32G) GPUs at high training throughput.
Update (2021-09-10): This repository supports both static mode and dynamic mode with PaddlePaddle v2.2, supporting 48 million classes on a single node with 8 NVIDIA V100 (32G) GPUs. Added PartialFC, SparseMomentum, and ArcFace and CosFace (which we refer to as MarginLoss). Backbones include IResNet and MobileNet.
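The ArcFace and CosFace losses mentioned above share a common "margin on the target-class logit" formulation. The sketch below illustrates that idea in NumPy; the function name, signature, and default hyperparameters are illustrative assumptions for this README, not PLSC's actual API (in the combined form, `m1` scales the angle, `m2` is ArcFace's additive angular margin, and `m3` is CosFace's additive cosine margin).

```python
import numpy as np

def margin_logits(cos_theta, label, s=64.0, m1=1.0, m2=0.5, m3=0.0):
    """Illustrative combined-margin logits (ArcFace/CosFace style).

    cos_theta: (batch, num_classes) cosine similarities between L2-normalized
               features and L2-normalized class weights.
    label:     (batch,) integer ground-truth class indices.
    Returns scaled logits where only the target-class entry is penalized:
        s * (cos(m1 * theta + m2) - m3)
    ArcFace uses (m1=1, m2>0, m3=0); CosFace uses (m1=1, m2=0, m3>0).
    """
    logits = cos_theta.copy()
    rows = np.arange(cos_theta.shape[0])
    # Recover the angle for the target class, clipping for numerical safety.
    theta = np.arccos(np.clip(cos_theta[rows, label], -1.0, 1.0))
    # Apply the margin to the target-class logit only, then scale everything.
    logits[rows, label] = np.cos(m1 * theta + m2) - m3
    return s * logits
```

Because the margin shrinks only the target-class logit, the softmax cross-entropy that follows forces a larger angular gap between classes than plain softmax would.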
Installation
See Installation instructions.
Getting Started
See Quick Run Recognition for the basic usage of PLSC.
Tutorials
See more tutorials.
Documentation
See documentation for the usage of more APIs or modules.
Model Zoo
To download pre-trained models, see the model zoo.
License
This project is released under the Apache 2.0 license.
Citation
@misc{plsc,
  title={PLSC: An Easy-to-use and High-Performance Large Scale Classification Tool},
  author={PLSC Contributors},
  howpublished={\url{https://github.com/PaddlePaddle/PLSC}},
  year={2022}
}