HKU NLP Group

10 repositories owned by HKU NLP Group

HumanPrompt

125 stars · 8 forks

A framework for human-readable, prompt-based methods with large language models, designed especially for researchers. (Deprecated; check out LangChain instead!)

reparam-discrete-diffusion

87 stars · 2 forks

Reparameterized Discrete Diffusion Models for Text Generation

efficient-attention

76 stars · 3 forks

[EVA ICLR'23; LARA ICML'22] Efficient attention mechanisms via control variates, random features, and importance sampling

icl-ceil

89 stars · 9 forks

[ICML 2023] Code for our paper "Compositional Exemplars for In-context Learning"

ProGen

20 stars · 0 forks

[EMNLP 2022 Findings] Code for our paper "ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback"

diffusion-of-thoughts

48 stars · 1 fork

Code for the paper "Diffusion of Thoughts: Chain-of-Thought Reasoning in Diffusion Language Models"

ChunkLlama

175 stars · 4 forks

Data and code for our paper "Training-Free Long-Context Scaling of Large Language Models"

RSA

39 stars · 2 forks

Retrieved Sequence Augmentation for Protein Representation Learning

subgoal-theorem-prover

17 stars · 0 forks

Code for the paper "Decomposing the Enigma: Subgoal-based Demonstration Learning for Formal Theorem Proving"

SymGen

16 stars · 1 fork

[EMNLP 2023] Code for our paper "Generating Data for Symbolic Language with Large Language Models"