Phil Wang

225 repositories owned by Phil Wang

memory-compressed-attention
72 Stars · 12 Forks

Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences"
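The core idea in memory-compressed attention is to shorten the key/value sequence, e.g. with a strided convolution, before ordinary scaled dot-product attention. A minimal single-head sketch of that idea, assuming a strided Conv1d for the compression; the class and argument names are illustrative, not this repo's API:

```python
import torch
import torch.nn.functional as F
from torch import nn

class CompressedAttention(nn.Module):
    def __init__(self, dim, compress_factor=4):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_kv = nn.Linear(dim, dim * 2, bias=False)
        # strided convolution shortens the key/value sequence by `compress_factor`
        self.compress = nn.Conv1d(dim, dim, compress_factor, stride=compress_factor)
        self.to_out = nn.Linear(dim, dim)

    def forward(self, x):                                   # x: (batch, seq, dim)
        q = self.to_q(x) * self.scale
        k, v = self.to_kv(x).chunk(2, dim=-1)
        # compress keys and values along the sequence dimension
        k = self.compress(k.transpose(1, 2)).transpose(1, 2)
        v = self.compress(v.transpose(1, 2)).transpose(1, 2)
        attn = (q @ k.transpose(-2, -1)).softmax(dim=-1)    # (batch, seq, seq / compress_factor)
        return self.to_out(attn @ v)
```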

omninet-pytorch
52 Stars · 5 Forks

Implementation of OmniNet, Omnidirectional Representations from Transformers, in Pytorch

memory-transformer-xl
47 Stars · 9 Forks

A variant of Transformer-XL where the memory is updated not with a queue, but with attention
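A minimal sketch of the contrast that description draws, under one reading of it: classic Transformer-XL pushes new hidden states into a fixed-length queue, whereas here the memory slots attend over the new hidden states to refresh themselves. The names and exact update rule below are illustrative, not this repo's code:

```python
import torch
from torch import nn

def queue_update(mem, hiddens, mem_len):
    # classic Transformer-XL: memory is a FIFO queue of past hidden states
    return torch.cat((mem, hiddens), dim=1)[:, -mem_len:].detach()

class AttentionMemoryUpdate(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim, bias=False)        # queries come from the memory slots
        self.to_kv = nn.Linear(dim, dim * 2, bias=False)   # keys / values from the new hidden states

    def forward(self, mem, hiddens):                       # (batch, mem_len, dim), (batch, seq, dim)
        q = self.to_q(mem) * self.scale
        k, v = self.to_kv(hiddens).chunk(2, dim=-1)
        attn = (q @ k.transpose(-2, -1)).softmax(dim=-1)
        # memory is refreshed by attending to the new states, not by enqueueing them
        return (mem + attn @ v).detach()
```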

panoptic-transformer
38 Stars · 2 Forks

Another attempt at a long-context / efficient transformer by me

memory-editable-transformer
35 Stars · 1 Fork

My explorations into editing the knowledge and memories of an attention network

tranception-pytorch
31 Stars · 1 Fork

Implementation of Tranception, an attention network paired with retrieval, that is SOTA for protein fitness prediction

genetic-algorithm-pytorch
29 Stars · 4 Forks

Toy genetic algorithm in Pytorch
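A toy genetic algorithm in PyTorch along the same lines: evolve a population of byte strings toward a target phrase with selection, single-point crossover and mutation. The target, rates and population size are arbitrary illustrative choices, not taken from the repo:

```python
import torch

target = torch.tensor(list(b"attention is all you need"), dtype=torch.long)
pop_size, length, mut_rate = 100, len(target), 0.05

pop = torch.randint(32, 127, (pop_size, length))          # random printable ASCII

for generation in range(1000):
    fitness = (pop == target).sum(dim=-1)                  # fitness = number of matching characters
    best = pop[fitness.argmax()]
    if fitness.max() == length:
        break
    # selection: keep the fitter half of the population as parents
    parents = pop[fitness.argsort(descending=True)[: pop_size // 2]]
    # crossover: splice random pairs of parents at a random point
    idx = torch.randint(0, parents.shape[0], (pop_size, 2))
    split = torch.randint(1, length, (1,)).item()
    children = torch.cat((parents[idx[:, 0], :split], parents[idx[:, 1], split:]), dim=-1)
    # mutation: randomly replace a small fraction of characters
    mask = torch.rand(pop_size, length) < mut_rate
    children[mask] = torch.randint(32, 127, (int(mask.sum()),))
    pop = children

print(generation, bytes(best.tolist()).decode())
```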

speculative-decoding
160 Stars · 14 Forks

Explorations into some recent techniques surrounding speculative decoding
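A simplified sketch of the basic speculative-decoding loop in its greedy form: a small draft model proposes a few tokens, the large model scores them all in a single forward pass, and the longest agreeing prefix is kept plus one corrected token. The `large` and `draft` callables (token ids in, logits out) are assumptions for the sketch, and the rejection-sampling variant from the papers (and this repo) is more involved:

```python
import torch

@torch.no_grad()
def speculative_decode(large, draft, seq, num_tokens, gamma=4):
    # `large` / `draft`: (1, seq_len) LongTensor of ids -> (1, seq_len, vocab) logits
    while seq.shape[-1] < num_tokens:
        # 1. draft model proposes gamma tokens autoregressively (cheap)
        draft_seq = seq
        for _ in range(gamma):
            next_id = draft(draft_seq)[:, -1].argmax(dim=-1, keepdim=True)
            draft_seq = torch.cat((draft_seq, next_id), dim=-1)
        # 2. large model scores the whole drafted continuation in one pass
        logits = large(draft_seq)
        verified = logits[:, seq.shape[-1] - 1 : -1].argmax(dim=-1)   # large model's pick at each drafted position
        proposed = draft_seq[:, seq.shape[-1]:]
        # 3. accept the longest agreeing prefix, then one corrected token from the large model
        agree = (verified == proposed).long().cumprod(dim=-1).sum().item()
        accepted = proposed[:, :agree]
        correction = logits[:, seq.shape[-1] - 1 + agree].argmax(dim=-1, keepdim=True)
        seq = torch.cat((seq, accepted, correction), dim=-1)
    return seq[:, :num_tokens]
```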

zorro-pytorch
92 Stars · 6 Forks

Implementation of Zorro, Masked Multimodal Transformer, in Pytorch
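The masking idea behind Zorro, as I read it, is that unimodal tokens may only attend within their own modality while a small set of fusion tokens attends to everything, so each unimodal stream stays uncontaminated. A hedged sketch of such a mask; the token counts and helper name are illustrative, not the repo's implementation:

```python
import torch

def zorro_attention_mask(n_audio, n_video, n_fusion):
    # modality id per token: 0 = audio, 1 = video, 2 = fusion
    ids = torch.tensor([0] * n_audio + [1] * n_video + [2] * n_fusion)
    same_modality = ids.unsqueeze(0) == ids.unsqueeze(1)   # (n, n): query and key share a modality
    query_is_fusion = (ids == 2).unsqueeze(1)              # fusion queries may see everything
    return same_modality | query_is_fusion                 # True = attention allowed

mask = zorro_attention_mask(n_audio=4, n_video=4, n_fusion=2)
# apply as e.g. attn_scores.masked_fill(~mask, float('-inf')) before the softmax
```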