Liyuan Liu
LM-LSTM-CRF
Empower Sequence Labeling with Task-Aware Language Model
RAdam
On the Variance of the Adaptive Learning Rate and Beyond
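A minimal sketch of how RAdam can be dropped in as a replacement for Adam in a standard PyTorch training step. It assumes the torch.optim.RAdam optimizer shipped with recent PyTorch releases; the RAdam repository provides its own optimizer class with a similar interface. The toy model and dummy batch are illustrative only.

    # Minimal usage sketch (assumption: torch.optim.RAdam from recent PyTorch;
    # the RAdam repo ships its own class with a comparable interface).
    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)                                  # toy model
    optimizer = torch.optim.RAdam(model.parameters(), lr=1e-3,
                                  betas=(0.9, 0.999), weight_decay=0.0)

    x = torch.randn(32, 10)                                   # dummy batch
    y = torch.randint(0, 2, (32,))
    loss = nn.functional.cross_entropy(model(x), y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                                          # rectified Adam update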
Transformer-Clinic
Understanding the Difficulty of Training Transformers
LD-Net
Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling
LightNER
Inference with state-of-the-art models (pre-trained by LD-Net / AutoNER / VanillaNER / ...)
Torch-Scope
A Toolkit for Training, Tracking, Saving Models and Syncing Results
ArabicNER
An Arabic NER system with strong performance
ReHession
Heterogeneous Supervision for Relation Extraction: A Representation Learning Approach