language-modeling topic
comparatively-finetuning-bert
Comparative fine-tuning of pretrained BERT models on downstream text classification tasks with different architectural configurations in PyTorch.
KenLM-training
Training an n-gram-based language model using the KenLM toolkit for Deep Speech 2
unif
A deep learning NLP framework built on TensorFlow and modeled after the Scikit-Learn API. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more.
pretraining-for-language-understanding
Pre-training of Language Models for Language Understanding
fed-att
Attentive federated learning for private neural language modeling
rome
Locating and editing factual associations in GPT (NeurIPS 2022)
referit3d
Code accompanying our ECCV-2020 paper on 3D Neural Listeners.
UDSMProt
Protein sequence classification with self-supervised pretraining
gated-state-spaces-pytorch
Implementation of Gated State Spaces, from the paper "Long Range Language Modeling via Gated State Spaces", in PyTorch
incontext-learning
Experiments and code to generate the GINC small-scale in-context learning dataset from "An Explanation for In-context Learning as Implicit Bayesian Inference"