mixture-of-experts topic

A list of mixture-of-experts repositories

MoEL

71 Stars · 14 Forks

MoEL: Mixture of Empathetic Listeners

MIX-GAN

19 Stars · 1 Fork

Some recent state-of-the-art generative models in ONE notebook: (MIX-)?(GAN|WGAN|BigGAN|MHingeGAN|AMGAN|StyleGAN|StyleGAN2)(\+ADA|\+CR|\+EMA|\+GP|\+R1|\+SA|\+SN)*

MMoEEx-MTL

30 Stars · 3 Forks

PyTorch implementation of the Multi-gate Mixture-of-Experts with Exclusivity (MMoEEx)
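MMoEEx builds on the multi-gate mixture-of-experts (MMoE) pattern for multi-task learning: a shared pool of experts with one softmax gate per task. As a rough illustration of that base pattern only (a minimal sketch with hypothetical names, not the repository's code):

```python
import torch
import torch.nn as nn

class MultiGateMoE(nn.Module):
    """Shared experts with one softmax gate (and one output head) per task."""
    def __init__(self, in_dim, expert_dim, num_experts, num_tasks):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim, expert_dim), nn.ReLU()) for _ in range(num_experts)]
        )
        self.gates = nn.ModuleList([nn.Linear(in_dim, num_experts) for _ in range(num_tasks)])
        self.heads = nn.ModuleList([nn.Linear(expert_dim, 1) for _ in range(num_tasks)])

    def forward(self, x):
        # expert_outs: (batch, num_experts, expert_dim)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=1)
        outputs = []
        for gate, head in zip(self.gates, self.heads):
            weights = torch.softmax(gate(x), dim=-1)                   # (batch, num_experts)
            mixed = (weights.unsqueeze(-1) * expert_outs).sum(dim=1)   # per-task expert mixture
            outputs.append(head(mixed))
        return outputs  # one prediction per task

x = torch.randn(8, 32)
preds = MultiGateMoE(in_dim=32, expert_dim=64, num_experts=4, num_tasks=2)(x)
```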

egobox

48 Stars · 1 Fork

Efficient global optimization toolbox in Rust: Bayesian optimization, mixture of Gaussian processes, sampling methods

LLaMA-Factory

61.5k Stars · 7.4k Forks

Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)

soft-moe-pytorch

211 Stars · 8 Forks

Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch
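Soft MoE replaces hard token-to-expert routing with soft averaging: each expert slot receives a weighted mix of all tokens, and each token's output is a weighted mix of all slot outputs. A minimal sketch of that dispatch/combine step, with one slot per expert for brevity (illustrative only, not the repository's API):

```python
import torch
import torch.nn as nn

class SoftMoE(nn.Module):
    """Illustrative soft mixture-of-experts layer (one slot per expert)."""
    def __init__(self, dim, num_experts):
        super().__init__()
        self.slot_embed = nn.Parameter(torch.randn(dim, num_experts))
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))
             for _ in range(num_experts)]
        )

    def forward(self, x):                      # x: (batch, tokens, dim)
        logits = x @ self.slot_embed           # token-slot affinities: (batch, tokens, experts)
        dispatch = logits.softmax(dim=1)       # normalize over tokens: each slot mixes all tokens
        combine = logits.softmax(dim=2)        # normalize over slots: each token mixes all slots
        slots = torch.einsum('btd,bts->bsd', x, dispatch)
        slot_out = torch.stack([e(slots[:, i]) for i, e in enumerate(self.experts)], dim=1)
        return torch.einsum('bsd,bts->btd', slot_out, combine)

x = torch.randn(2, 16, 64)
out = SoftMoE(dim=64, num_experts=4)(x)        # (2, 16, 64)
```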

st-moe-pytorch

237 Stars · 21 Forks

Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch
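ST-MoE builds on sparse top-k routing: each token is dispatched to its highest-scoring experts, and an auxiliary loss keeps expert load balanced. A rough sketch of top-2 routing with a Switch-style load-balancing term (an illustration under those assumptions, not the repository's implementation):

```python
import torch
import torch.nn as nn

def top2_route(x, router, experts):
    """x: (tokens, dim). Returns mixed expert outputs and a load-balancing loss."""
    logits = router(x)                               # (tokens, num_experts)
    probs = logits.softmax(dim=-1)
    top_p, top_idx = probs.topk(2, dim=-1)           # each token picks its 2 best experts
    top_p = top_p / top_p.sum(dim=-1, keepdim=True)  # renormalize the two gate weights

    out = torch.zeros_like(x)
    for k in range(2):
        for e, expert in enumerate(experts):
            mask = top_idx[:, k] == e
            if mask.any():
                out[mask] += top_p[mask, k : k + 1] * expert(x[mask])

    # Load-balancing auxiliary loss: fraction of tokens whose top choice is expert e,
    # times the mean router probability assigned to e.
    num_experts = len(experts)
    frac_tokens = torch.zeros(num_experts, device=x.device)
    frac_tokens.scatter_add_(0, top_idx[:, 0], torch.ones(x.size(0), device=x.device))
    frac_tokens = frac_tokens / x.size(0)
    aux_loss = num_experts * (frac_tokens * probs.mean(dim=0)).sum()
    return out, aux_loss

dim, num_experts = 64, 4
experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_experts)])
router = nn.Linear(dim, num_experts)
y, aux = top2_route(torch.randn(128, dim), router, experts)
```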

mimo

16 Stars · 4 Forks

A toolbox for inference of mixture models

mixtral-offloading

2.3k Stars · 228 Forks

Run Mixtral-8x7B models in Colab or on consumer desktops
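The general idea behind fitting a large sparse model into limited VRAM is to keep expert weights on the CPU and copy onto the GPU only the experts a token actually routes to, evicting stale ones. A minimal sketch of such an on-demand expert cache (illustrative only; the repository combines this with quantization and smarter caching, and all names below are hypothetical):

```python
import torch
import torch.nn as nn
from collections import OrderedDict

class ExpertOffloader:
    """Hold all experts on CPU; keep at most `capacity` on the GPU, evicting least-recently-used."""
    def __init__(self, experts, capacity, device="cuda"):
        self.experts = experts              # list of nn.Module, resident on CPU by default
        self.capacity = capacity
        self.device = device
        self.gpu_cache = OrderedDict()      # expert index -> module currently on the device

    def get(self, idx):
        if idx in self.gpu_cache:
            self.gpu_cache.move_to_end(idx)            # mark as recently used
            return self.gpu_cache[idx]
        if len(self.gpu_cache) >= self.capacity:
            _, evicted = self.gpu_cache.popitem(last=False)
            evicted.to("cpu")                          # free device memory for the new expert
        expert = self.experts[idx].to(self.device)
        self.gpu_cache[idx] = expert
        return expert

device = "cuda" if torch.cuda.is_available() else "cpu"
offloader = ExpertOffloader([nn.Linear(64, 64) for _ in range(8)], capacity=2, device=device)
x = torch.randn(4, 64, device=device)
y = offloader.get(3)(x)   # expert 3 is moved to the device on first use
```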

awesome-adaptive-computation

119 Stars · 7 Forks

A curated reading list of research in Adaptive Computation, Dynamic Compute & Mixture of Experts (MoE).