mixture-of-experts topic
MoEL
MoEL: Mixture of Empathetic Listeners
MIX-GAN
Some recent state-of-the-art generative models in ONE notebook: (MIX-)?(GAN|WGAN|BigGAN|MHingeGAN|AMGAN|StyleGAN|StyleGAN2)(\+ADA|\+CR|\+EMA|\+GP|\+R1|\+SA|\+SN)*
MMoEEx-MTL
PyTorch Implementation of the Multi-gate Mixture-of-Experts with Exclusivity (MMoEEx)
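For context, a minimal PyTorch sketch of the plain multi-gate mixture-of-experts (MMoE) pattern that MMoEEx extends: a pool of shared experts with one softmax gate per task. The exclusivity mechanism of MMoEEx is omitted, and all names below are illustrative rather than taken from the repository.

```python
import torch
import torch.nn as nn

class MultiGateMoE(nn.Module):
    """Multi-gate mixture-of-experts: shared experts, one softmax gate per task."""
    def __init__(self, in_dim, expert_dim, n_experts, n_tasks):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim, expert_dim), nn.ReLU()) for _ in range(n_experts)]
        )
        self.gates = nn.ModuleList([nn.Linear(in_dim, n_experts) for _ in range(n_tasks)])
        self.heads = nn.ModuleList([nn.Linear(expert_dim, 1) for _ in range(n_tasks)])

    def forward(self, x):
        # (batch, n_experts, expert_dim)
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)
        outputs = []
        for gate, head in zip(self.gates, self.heads):
            w = torch.softmax(gate(x), dim=-1)             # (batch, n_experts)
            mixed = (w.unsqueeze(-1) * expert_out).sum(1)  # (batch, expert_dim)
            outputs.append(head(mixed))
        return outputs  # one prediction per task

model = MultiGateMoE(in_dim=32, expert_dim=64, n_experts=4, n_tasks=2)
task_outputs = model(torch.randn(8, 32))
```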
egobox
Efficient global optimization toolbox in Rust: Bayesian optimization, mixture of Gaussian processes, sampling methods
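As a rough illustration of the efficient-global-optimization loop (Gaussian process surrogate plus expected improvement), here is a generic scikit-learn sketch; it does not use egobox's Rust or Python API.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    return np.sin(3 * x) + 0.1 * x**2  # toy 1-D function to minimize

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5, 1))    # initial design
y = objective(X).ravel()

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = np.linspace(-3, 3, 500).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    # Expected improvement (minimization form)
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x:", X[np.argmin(y)], "best y:", y.min())
```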
LLaMA-Factory
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
soft-moe-pytorch
Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch
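A compact sketch of the Soft MoE idea (every token is softly dispatched to every expert slot, then softly combined back), assuming a simple MLP expert; it mirrors the paper's dispatch/combine weights rather than this repository's code.

```python
import torch
import torch.nn as nn

class SoftMoE(nn.Module):
    """Soft MoE: tokens are softly assigned to expert slots and softly recombined."""
    def __init__(self, dim, n_experts, slots_per_expert):
        super().__init__()
        self.n_slots = n_experts * slots_per_expert
        self.slot_embed = nn.Parameter(torch.randn(dim, self.n_slots) * dim**-0.5)
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(n_experts)]
        )

    def forward(self, x):                        # x: (batch, tokens, dim)
        logits = x @ self.slot_embed             # (batch, tokens, n_slots)
        dispatch = logits.softmax(dim=1)         # normalize over tokens per slot
        combine = logits.softmax(dim=2)          # normalize over slots per token
        slots = dispatch.transpose(1, 2) @ x     # (batch, n_slots, dim)
        slot_chunks = slots.chunk(len(self.experts), dim=1)
        out_slots = torch.cat([e(s) for e, s in zip(self.experts, slot_chunks)], dim=1)
        return combine @ out_slots               # (batch, tokens, dim)

layer = SoftMoE(dim=64, n_experts=4, slots_per_expert=2)
y = layer(torch.randn(2, 16, 64))
```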
st-moe-pytorch
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch
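For reference, a bare-bones sketch of the sparse top-2 routing with a router z-loss that ST-MoE is known for; the repository's implementation adds load-balancing losses, capacity limits, and other details not shown here.

```python
import torch
import torch.nn as nn

class Top2Router(nn.Module):
    """Sparse top-2 routing with the router z-loss used to stabilize training."""
    def __init__(self, dim, n_experts):
        super().__init__()
        self.gate = nn.Linear(dim, n_experts, bias=False)

    def forward(self, x):                                  # x: (tokens, dim)
        logits = self.gate(x)                              # (tokens, n_experts)
        # z-loss penalizes large router logits (squared log-sum-exp)
        z_loss = torch.logsumexp(logits, dim=-1).pow(2).mean()
        probs = logits.softmax(dim=-1)
        topk_probs, topk_idx = probs.topk(2, dim=-1)       # route each token to 2 experts
        topk_probs = topk_probs / topk_probs.sum(-1, keepdim=True)  # renormalize
        return topk_probs, topk_idx, z_loss

class SparseMoE(nn.Module):
    def __init__(self, dim, n_experts):
        super().__init__()
        self.router = Top2Router(dim, n_experts)
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(n_experts)])

    def forward(self, x):                                  # x: (tokens, dim)
        weights, idx, z_loss = self.router(x)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            for k in range(2):
                mask = idx[:, k] == e                      # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out, z_loss

moe = SparseMoE(dim=32, n_experts=8)
y, aux_loss = moe(torch.randn(10, 32))
```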
mimo
A toolbox for inference of mixture models
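As a generic example of mixture-model inference (not mimo's API), a few lines of EM for a two-component 1-D Gaussian mixture:

```python
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])

# initialize parameters: weights, means, variances
w, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

for _ in range(100):
    # E-step: responsibilities of each component for each point
    dens = w * np.exp(-0.5 * (data[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted data
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    var = (resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights", w, "means", mu, "vars", var)
```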
mixtral-offloading
Run Mixtral-8x7B models in Colab or on consumer desktops
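The repository implements its own expert offloading and quantization scheme; as a generic alternative (not this repo's code), Mixtral-8x7B can be squeezed into limited VRAM with 4-bit quantization and automatic device placement via Hugging Face transformers, sketched below.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
quant = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant,
    device_map="auto",          # spill layers that don't fit onto CPU RAM
)

inputs = tokenizer("Mixture-of-experts models are", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=40)[0]))
```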
awesome-adaptive-computation
A curated reading list of research in Adaptive Computation, Dynamic Compute & Mixture of Experts (MoE).