Phil Wang

225 repositories owned by Phil Wang

egnn-pytorch

387 Stars · 65 Forks

Implementation of E(n)-Equivariant Graph Neural Networks, in Pytorch
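
The layer alternates invariant message passing on node features with an equivariant update of the coordinates. A minimal, fully-connected sketch of that idea in plain PyTorch (class and argument names are illustrative, not the repo's API):

import torch
import torch.nn as nn

class EGNNLayer(nn.Module):
    def __init__(self, dim, m_dim=64):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(dim * 2 + 1, m_dim), nn.SiLU(), nn.Linear(m_dim, m_dim), nn.SiLU())
        self.coor_mlp = nn.Sequential(nn.Linear(m_dim, m_dim), nn.SiLU(), nn.Linear(m_dim, 1))
        self.node_mlp = nn.Sequential(nn.Linear(dim + m_dim, m_dim), nn.SiLU(), nn.Linear(m_dim, dim))

    def forward(self, feats, coors):
        # feats: (B, N, dim) node features, coors: (B, N, 3) coordinates
        rel = coors.unsqueeze(2) - coors.unsqueeze(1)           # (B, N, N, 3) relative vectors
        dist2 = (rel ** 2).sum(dim=-1, keepdim=True)            # squared distances, E(n)-invariant
        n = feats.size(1)
        h_i = feats.unsqueeze(2).expand(-1, -1, n, -1)
        h_j = feats.unsqueeze(1).expand(-1, n, -1, -1)
        m_ij = self.edge_mlp(torch.cat([h_i, h_j, dist2], dim=-1))   # pairwise messages
        # coordinate update: weighted mean of relative vectors (equivariant)
        coors = coors + (rel * self.coor_mlp(m_ij)).mean(dim=2)
        # feature update: aggregate messages, then MLP with a residual connection
        feats = feats + self.node_mlp(torch.cat([feats, m_ij.sum(dim=2)], dim=-1))
        return feats, coors

layer = EGNNLayer(dim=32)
feats, coors = layer(torch.randn(1, 16, 32), torch.randn(1, 16, 3))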

electra-pytorch

218 Stars · 43 Forks

A simple and working implementation of Electra, the fastest way to pretrain language models from scratch, in Pytorch
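
ELECTRA trains a small generator with masked language modeling and a discriminator that predicts, per token, whether the generator replaced it. A toy sketch of that replaced-token-detection loop, assuming throwaway embedding-plus-linear models (names and model sizes are illustrative, not the repo's API):

import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, dim, seq_len = 1000, 64, 32
mask_id, mask_prob = 0, 0.15

generator = nn.Sequential(nn.Embedding(vocab_size, dim), nn.Linear(dim, vocab_size))
discriminator = nn.Sequential(nn.Embedding(vocab_size, dim), nn.Linear(dim, 1))

tokens = torch.randint(1, vocab_size, (4, seq_len))

# 1. mask a subset of positions and let the generator fill them in (standard MLM)
mask = torch.rand(tokens.shape) < mask_prob
gen_logits = generator(tokens.masked_fill(mask, mask_id))
mlm_loss = F.cross_entropy(gen_logits[mask], tokens[mask])

# 2. take the generator's predictions to build a corrupted input sequence
sampled = gen_logits.argmax(dim=-1)              # greedy for brevity; the paper samples
corrupted = torch.where(mask, sampled, tokens)

# 3. the discriminator predicts, per token, whether it was replaced
replaced = (corrupted != tokens).float()
disc_logits = discriminator(corrupted).squeeze(-1)
disc_loss = F.binary_cross_entropy_with_logits(disc_logits, replaced)

loss = mlm_loss + 50.0 * disc_loss               # discriminator weighting as in the ELECTRA paper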

siren-pytorch

452 Stars · 49 Forks

Pytorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions
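
A SIREN is an MLP whose activations are sines with a frequency scale w0, combined with the paper's uniform weight initialization. A minimal sketch in plain PyTorch, fitting RGB values from 2D coordinates (class and argument names are illustrative, not the repo's API):

import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    def __init__(self, dim_in, dim_out, w0=30.0, is_first=False):
        super().__init__()
        self.w0 = w0
        self.linear = nn.Linear(dim_in, dim_out)
        # init per the SIREN paper: U(-1/n, 1/n) for the first layer,
        # U(-sqrt(6/n)/w0, sqrt(6/n)/w0) for the remaining layers
        bound = 1 / dim_in if is_first else math.sqrt(6 / dim_in) / w0
        nn.init.uniform_(self.linear.weight, -bound, bound)

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

# map 2D pixel coordinates to RGB, the typical image-fitting setup
net = nn.Sequential(
    SineLayer(2, 256, is_first=True),
    SineLayer(256, 256),
    SineLayer(256, 256),
    nn.Linear(256, 3),
)
coords = torch.rand(1024, 2) * 2 - 1   # coordinates normalized to [-1, 1]
rgb = net(coords)                      # (1024, 3)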

se3-transformer-pytorch

243 Stars · 23 Forks

Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.

En-transformer

205 Stars · 28 Forks

Implementation of E(n)-Transformer, which incorporates attention mechanisms into Welling's E(n)-Equivariant Graph Neural Network

FLASH-pytorch

334 Stars · 23 Forks

Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
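
The core block in that paper is the Gated Attention Unit: a shared low-dimensional projection yields queries and keys through per-dimension scales and offsets, attention uses squared ReLU instead of softmax, and the attention output is gated. A simplified quadratic-attention sketch of that block (the linear-time chunking and relative position bias are omitted; names are illustrative, not the repo's API):

import torch
import torch.nn as nn
import torch.nn.functional as F

class GAU(nn.Module):
    def __init__(self, dim, expansion=2, qk_dim=128):
        super().__init__()
        hidden = dim * expansion
        self.to_gate = nn.Linear(dim, hidden)    # gating branch U
        self.to_value = nn.Linear(dim, hidden)   # value branch V
        self.to_qk = nn.Linear(dim, qk_dim)      # shared low-dim projection Z
        # per-dimension scales and offsets turn the shared Z into distinct Q and K
        self.q_scale = nn.Parameter(torch.ones(qk_dim))
        self.q_offset = nn.Parameter(torch.zeros(qk_dim))
        self.k_scale = nn.Parameter(torch.ones(qk_dim))
        self.k_offset = nn.Parameter(torch.zeros(qk_dim))
        self.to_out = nn.Linear(hidden, dim)

    def forward(self, x):
        n = x.shape[1]
        u = F.silu(self.to_gate(x))
        v = F.silu(self.to_value(x))
        z = F.silu(self.to_qk(x))
        q = z * self.q_scale + self.q_offset
        k = z * self.k_scale + self.k_offset
        # squared-ReLU attention in place of softmax
        attn = F.relu(q @ k.transpose(-1, -2) / n) ** 2
        return self.to_out(u * (attn @ v))

gau = GAU(dim=64)
out = gau(torch.randn(2, 128, 64))   # (2, 128, 64)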

memory-efficient-attention-pytorch

342 Stars · 32 Forks

Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"
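
The trick in that paper is to process key/value chunks sequentially and keep running numerator, denominator, and max statistics, so the full attention matrix is never materialized. A single-head sketch of that accumulation, assuming a hypothetical chunked_attention helper (not the repo's API):

import torch

def chunked_attention(q, k, v, chunk_size=64):
    # q, k, v: (B, N, d); scale queries once up front
    q = q * (q.shape[-1] ** -0.5)
    out_num = torch.zeros_like(q)                          # running numerator
    out_den = torch.zeros(*q.shape[:-1], 1)                # running denominator
    running_max = torch.full((*q.shape[:-1], 1), float('-inf'))

    for k_chunk, v_chunk in zip(k.split(chunk_size, dim=1), v.split(chunk_size, dim=1)):
        scores = q @ k_chunk.transpose(-1, -2)             # (B, N, chunk)
        new_max = torch.maximum(running_max, scores.amax(dim=-1, keepdim=True))
        # rescale previously accumulated sums to the new running max
        correction = torch.exp(running_max - new_max)
        out_num = out_num * correction
        out_den = out_den * correction
        exp_scores = torch.exp(scores - new_max)
        out_num = out_num + exp_scores @ v_chunk
        out_den = out_den + exp_scores.sum(dim=-1, keepdim=True)
        running_max = new_max
    return out_num / out_den

q, k, v = (torch.randn(2, 1024, 64) for _ in range(3))
out = chunked_attention(q, k, v)   # numerically matches softmax(q kᵀ / sqrt(d)) v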

enformer-pytorch

399 Stars · 69 Forks

Implementation of Enformer, DeepMind's attention network for predicting gene expression, in Pytorch

lightweight-gan

1.6k Stars · 219 Forks

Implementation of 'lightweight' GAN, proposed in ICLR 2021, in Pytorch. High-resolution image generation that can be trained within a day or two

halonet-pytorch

200 Stars · 22 Forks

Implementation of the Halo Attention layer (😇) from the paper, Scaling Local Self-Attention for Parameter Efficient Visual Backbones
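
Halo attention restricts each query block to a local neighborhood: queries come from non-overlapping blocks of the feature map, while keys and values come from the same block widened by a halo of surrounding pixels. A single-head toy sketch using unfold/fold (function and argument names are illustrative, not the repo's API):

import torch
import torch.nn.functional as F

def halo_attention(x, block=4, halo=1):
    # x: (B, C, H, W) with H and W divisible by `block`
    b, c, h, w = x.shape
    win = block + 2 * halo

    # queries: non-overlapping block x block windows
    q = F.unfold(x, kernel_size=block, stride=block)               # (B, C*block², nb)
    q = q.view(b, c, block * block, -1).permute(0, 3, 2, 1)        # (B, nb, block², C)

    # keys/values: overlapping (block + 2*halo)² windows around each block
    kv = F.unfold(x, kernel_size=win, stride=block, padding=halo)  # (B, C*win², nb)
    kv = kv.view(b, c, win * win, -1).permute(0, 3, 2, 1)          # (B, nb, win², C)

    # attention within each block, over the haloed neighborhood
    attn = torch.softmax(q @ kv.transpose(-1, -2) * c ** -0.5, dim=-1)
    out = attn @ kv                                                # (B, nb, block², C)

    # stitch the blocks back into a (B, C, H, W) feature map
    out = out.permute(0, 3, 2, 1).reshape(b, c * block * block, -1)
    return F.fold(out, output_size=(h, w), kernel_size=block, stride=block)

out = halo_attention(torch.randn(1, 32, 16, 16))   # (1, 32, 16, 16)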