Frank Odom

9 repositories owned by Frank Odom

fft-conv-pytorch

449 stars · 53 forks

Implementation of 1D, 2D, and 3D FFT convolutions in PyTorch. Much faster than direct convolutions for large kernel sizes.
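The speedup comes from the convolution theorem: convolution in the signal domain is pointwise multiplication in the frequency domain, so an FFT round-trip costs O(n log n) instead of O(n·k) for a direct sliding-window convolution, which wins once the kernel size k is large. A minimal NumPy sketch of the idea (an illustration of the general technique, not the repo's actual PyTorch code):

```python
import numpy as np

def fft_conv1d(signal: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Full linear convolution of two 1D arrays via the FFT."""
    # Zero-pad both inputs to the full output length so the circular
    # convolution computed by the FFT matches the linear convolution.
    n = len(signal) + len(kernel) - 1
    return np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)

signal = np.random.randn(1024)
kernel = np.random.randn(256)
# Matches the direct convolution up to floating-point error.
assert np.allclose(fft_conv1d(signal, kernel), np.convolve(signal, kernel))
```

The same idea extends to 2D and 3D by using the multidimensional FFT and padding each spatial axis.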

clip-text-decoder

87 stars · 3 forks

Generate text captions for images from their embeddings.

transformer-from-scratch

89 stars · 19 forks

Code accompanying my blog post: https://fkodom.substack.com/p/transformers-from-scratch-in-pytorch

yet-another-retnet

101 stars · 15 forks

A simple but robust PyTorch implementation of RetNet from "Retentive Network: A Successor to Transformer for Large Language Models" (https://arxiv.org/pdf/2307.08621.pdf)

soft-mixture-of-experts

60 stars · 4 forks

PyTorch implementation of Soft MoE by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)

dilated-attention-pytorch

46 stars · 10 forks

(Unofficial) Implementation of dilated attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens" (https://arxiv.org/abs/2307.02486)

grouped-query-attention-pytorch

77 stars · 4 forks

(Unofficial) PyTorch implementation of grouped-query attention (GQA) from "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints" (https://arxiv.org/pdf/2305.13245.pdf)

semantle

15 stars · 1 fork

Fastest 'Semantle' solver this side of the Mississippi.