Phil Wang

229 repositories owned by Phil Wang

compressive-transformer-pytorch

154 Stars · 20 Forks

Pytorch implementation of Compressive Transformers, from DeepMind
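
A minimal sketch of the core idea, in plain PyTorch rather than this repo's API: memories evicted from the FIFO memory are not discarded but compressed into a smaller "compressed memory" (here with a strided Conv1d, one of the compression functions discussed in the paper). The class and names are illustrative only.

```python
import torch
from torch import nn

class MemoryCompressor(nn.Module):
    """Illustrative compression step: shrink evicted memories by a fixed ratio."""
    def __init__(self, dim, compression_ratio=4):
        super().__init__()
        self.compress = nn.Conv1d(dim, dim, kernel_size=compression_ratio, stride=compression_ratio)

    def forward(self, old_mems):                      # (batch, seq, dim)
        x = old_mems.transpose(1, 2)                  # Conv1d expects (batch, dim, seq)
        return self.compress(x).transpose(1, 2)       # (batch, seq // ratio, dim)

compressor = MemoryCompressor(dim=512, compression_ratio=4)
evicted = torch.randn(1, 64, 512)                     # 64 evicted memory slots
compressed = compressor(evicted)                      # -> (1, 16, 512)
```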

long-short-transformer

118 Stars · 12 Forks

Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch

linear-attention-transformer

614 Stars · 62 Forks

Transformer based on an attention variant with linear complexity with respect to sequence length
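
An illustrative sketch of the linear-attention trick this family of models builds on (not the library's API): with a positive feature map phi, softmax(QKᵀ)V is replaced by phi(Q)(phi(K)ᵀV), so the key–value summary is computed once and cost grows linearly in sequence length instead of quadratically.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    # Simple positive feature map (elu + 1), as in kernel-based linear attention.
    q, k = F.elu(q) + 1, F.elu(k) + 1
    # (dim_k, dim_v) summary of all keys/values: O(n * d^2) instead of O(n^2 * d).
    context = torch.einsum('bnd,bne->bde', k, v)
    # Per-query normalizer: phi(q_i) . sum_j phi(k_j)
    normalizer = torch.einsum('bnd,bd->bn', q, k.sum(dim=1)) + eps
    out = torch.einsum('bnd,bde->bne', q, context)
    return out / normalizer.unsqueeze(-1)

q = k = v = torch.randn(2, 1024, 64)
out = linear_attention(q, k, v)   # (2, 1024, 64)
```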

tab-transformer-pytorch

713 Stars · 91 Forks

Implementation of TabTransformer, an attention network for tabular data, in Pytorch
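
A hedged usage sketch (import path and argument names follow the repo's README as I recall them; verify against the repo before relying on it). Categorical columns go through the transformer layers, continuous columns are normalized and concatenated for the final MLP.

```python
import torch
from tab_transformer_pytorch import TabTransformer  # assumed import path

model = TabTransformer(
    categories = (10, 5, 6, 5, 8),   # cardinality of each categorical column
    num_continuous = 10,             # number of continuous columns
    dim = 32,
    depth = 6,
    heads = 8,
    dim_out = 1                      # e.g. a single logit for binary classification
)

x_categ = torch.randint(0, 5, (1, 5))   # one row, 5 categorical features
x_cont = torch.randn(1, 10)             # one row, 10 continuous features
pred = model(x_categ, x_cont)           # (1, 1)
```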

routing-transformer

271 Stars · 29 Forks

Fully featured implementation of Routing Transformer

sinkhorn-transformer

247 Stars · 21 Forks

Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
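
An illustrative sketch of the Sinkhorn normalization at the heart of Sparse Sinkhorn Attention (not the library's API): repeated row and column normalization in log space turns a score matrix over key buckets into a near doubly-stochastic soft permutation, which determines which buckets attend to which.

```python
import torch

def sinkhorn(log_scores, iters=8):
    # Sinkhorn-Knopp iterations in log space for numerical stability.
    for _ in range(iters):
        log_scores = log_scores - torch.logsumexp(log_scores, dim=-1, keepdim=True)  # normalize rows
        log_scores = log_scores - torch.logsumexp(log_scores, dim=-2, keepdim=True)  # normalize columns
    return log_scores.exp()

scores = torch.randn(2, 16, 16)   # pairwise scores between 16 key buckets
perm = sinkhorn(scores)           # rows and columns each sum to ~1 (soft permutation)
```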

conformer

338 Stars · 54 Forks

Implementation of the convolutional module from the Conformer paper, for use in Transformers
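
A hedged usage sketch (import path and argument names as I recall them from the README; verify against the repo). The module is intended to be dropped into a transformer block with a residual connection, as in the Conformer paper.

```python
import torch
from conformer import ConformerConvModule  # assumed import path

conv = ConformerConvModule(
    dim = 512,
    causal = False,          # set True for autoregressive use
    expansion_factor = 2,
    kernel_size = 31,
    dropout = 0.1
)

x = torch.randn(1, 1024, 512)
x = conv(x) + x              # residual around the convolutional module
```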

x-unet

252 Stars · 17 Forks

Implementation of a U-Net complete with efficient attention as well as the latest research findings

glom-pytorch

189 Stars · 27 Forks

An attempt at an implementation of GLOM, Geoffrey Hinton's new idea that integrates concepts from neural fields, top-down and bottom-up processing, and attention (consensus between columns), for emergent part-whole hierarchies from data

Adan-pytorch

245 Stars · 9 Forks

Implementation of the Adan (ADAptive Nesterov momentum algorithm) optimizer in Pytorch
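
A hedged usage sketch (import path and constructor arguments as I recall them from the README; the three betas correspond to Adan's three moment estimates, so check the repo before relying on these values).

```python
import torch
from adan_pytorch import Adan  # assumed import path

model = torch.nn.Linear(128, 10)
optim = Adan(
    model.parameters(),
    lr = 1e-3,
    betas = (0.02, 0.08, 0.01),  # beta1, beta2, beta3 as in the paper
    weight_decay = 0.02
)

loss = model(torch.randn(4, 128)).sum()
loss.backward()
optim.step()
optim.zero_grad()
```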