HKU NLP Group
HumanPrompt
A framework for human-readable prompt-based methods with large language models, specially designed for researchers. (Deprecated; check out LangChain instead!)
reparam-discrete-diffusion
Reparameterized Discrete Diffusion Models for Text Generation
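The paper's reparameterization itself is beyond a one-liner, but as a rough illustration of the discrete diffusion setting it builds on, here is a minimal sketch of an absorbing-state forward process that progressively masks tokens. This is generic, not the paper's method; `MASK_ID` and the uniform masking schedule are assumptions.

```python
import numpy as np

MASK_ID = 0  # hypothetical id for the absorbing [MASK] token

def forward_corrupt(tokens: np.ndarray, t: float, rng=np.random) -> np.ndarray:
    """Absorbing-state forward process: each token is independently
    replaced by [MASK] with probability t (t=0 is clean, t=1 fully masked)."""
    mask = rng.random(tokens.shape) < t
    return np.where(mask, MASK_ID, tokens)

# Example: corrupt a toy sequence halfway through the diffusion schedule.
seq = np.array([5, 17, 9, 42, 3])
print(forward_corrupt(seq, t=0.5))
```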
efficient-attention
[EVA ICLR'23; LARA ICML'22] Efficient attention mechanisms via control variates, random features, and importance sampling
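EVA and LARA build control variates and importance sampling on top of random-feature approximations of softmax attention. As a baseline illustration only, here is a minimal Performer-style random-feature attention sketch; it is not this repo's implementation, and the feature count is an arbitrary choice.

```python
import numpy as np

def random_feature_map(x, W):
    """Positive random features for the softmax kernel:
    phi(x) = exp(W x - ||x||^2 / 2) / sqrt(m), so that
    E[phi(q) . phi(k)] = exp(q . k)."""
    m = W.shape[0]
    proj = x @ W.T                                   # (n, m)
    sq = 0.5 * np.sum(x**2, axis=-1, keepdims=True)  # (n, 1)
    return np.exp(proj - sq) / np.sqrt(m)

def rfa_attention(Q, K, V, num_features=256, seed=0):
    """Linear-time approximation of softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    Q, K = Q * d ** -0.25, K * d ** -0.25  # fold in the 1/sqrt(d) scaling
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((num_features, d))
    Qf, Kf = random_feature_map(Q, W), random_feature_map(K, W)
    KV = Kf.T @ V                          # aggregate keys/values once: (m, d_v)
    Z = Qf @ Kf.sum(axis=0)                # per-query normalizer: (n,)
    return (Qf @ KV) / Z[:, None]

def exact_attention(Q, K, V):
    S = Q @ K.T / np.sqrt(Q.shape[-1])
    A = np.exp(S - S.max(axis=-1, keepdims=True))
    return (A / A.sum(axis=-1, keepdims=True)) @ V

# Sanity check on small random inputs: the two outputs should be close.
rng = np.random.default_rng(1)
Q, K, V = (rng.standard_normal((4, 16)) for _ in range(3))
print(np.abs(rfa_attention(Q, K, V) - exact_attention(Q, K, V)).max())
```

The key point this sketch shows: once queries and keys are lifted through the feature map, keys and values can be aggregated once in `KV`, so cost is linear in sequence length rather than quadratic.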
icl-ceil
[ICML 2023] Code for our paper "Compositional Exemplars for In-context Learning".
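The paper casts exemplar choice for in-context learning as subset selection; a determinantal point process is one standard way to trade off relevance against diversity in such a setting. Below is a minimal greedy MAP sketch over a generic embedding kernel, for illustration only, not the repo's learned retriever.

```python
import numpy as np

def greedy_dpp_select(embeddings: np.ndarray, k: int) -> list[int]:
    """Greedy MAP inference for a DPP: repeatedly add the candidate that
    maximizes the log-determinant of the kernel submatrix, favoring items
    that are individually strong but mutually dissimilar."""
    L = embeddings @ embeddings.T      # PSD similarity kernel
    L = L + 1e-6 * np.eye(len(L))      # jitter for numerical stability
    selected: list[int] = []
    for _ in range(k):
        best, best_logdet = None, -np.inf
        for i in range(len(L)):
            if i in selected:
                continue
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if sign > 0 and logdet > best_logdet:
                best, best_logdet = i, logdet
        selected.append(best)
    return selected

# Example: pick 3 diverse exemplars from 10 candidate embeddings.
rng = np.random.default_rng(0)
print(greedy_dpp_select(rng.standard_normal((10, 8)), k=3))
```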
ProGen
[EMNLP 2022 Findings] Code for our paper "ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback".
diffusion-of-thoughts
Code for the paper "Diffusion of Thoughts: Chain-of-Thought Reasoning in Diffusion Language Models"
ChunkLlama
Data and code for our paper "Training-Free Long-Context Scaling of Large Language Models"
subgoal-theorem-prover
Code for the paper "Decomposing the Enigma: Subgoal-based Demonstration Learning for Formal Theorem Proving"
SymGen
[EMNLP'23] Code for the paper "Generating Data for Symbolic Language with Large Language Models"