mixture-of-experts topic

Repositories tagged with the mixture-of-experts topic

MoE-LLaVA (1.9k stars, 121 forks)

Mixture-of-Experts for Large Vision-Language Models

PETL_AST (35 stars, 3 forks)

This is the official repository of the papers "Parameter-Efficient Transfer Learning of Audio Spectrogram Transformers" and "Efficient Fine-tuning of Audio Spectrogram Transformers via Soft Mixture of...

MC-SMoE (54 stars, 7 forks)

[ICLR 2024 Spotlight] Code for the paper "Merge, Then Compress: Demystify Efficient SMoE with Hints from Its Routing Policy"

leeroo_orchestrator (50 stars, 4 forks)

The implementation of "Leeroo Orchestrator: Elevating LLMs Performance Through Model Integration"

inferflow (235 stars, 24 forks)

Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).

Chinese-Mixtral (582 stars, 44 forks)

Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)

RealCompo (104 stars, 3 forks)

[NeurIPS 2024] RealCompo: Balancing Realism and Compositionality Improves Text-to-Image Diffusion Models

classifier (15 stars, 4 forks)

Machine learning code, derivative calculations, and optimization algorithms developed during the Machine Learning course at Universidade de São Paulo. All code in Python, NumPy and Matplotlib with exa...

makeMoE (525 stars, 48 forks)

From scratch implementation of a sparse mixture of experts language model inspired by Andrej Karpathy's makemore :)
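
To make the idea behind a from-scratch sparse mixture-of-experts layer concrete, here is a minimal sketch of top-k expert routing in PyTorch. The class name, dimensions, and hyperparameters (SparseMoE, d_model, top_k, and so on) are illustrative assumptions, not makeMoE's actual code.

```python
# Minimal sketch of top-k sparse mixture-of-experts routing (illustrative only,
# not taken from the makeMoE repository).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, num_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One small feed-forward network per expert.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):                      # x: (batch, seq, d_model)
        logits = self.router(x)                # (batch, seq, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # renormalize over the chosen experts
        out = torch.zeros_like(x)
        # Dispatch each token only to its top-k experts and mix their outputs.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e        # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example: route a batch of token embeddings through the sparse MoE layer.
moe = SparseMoE()
tokens = torch.randn(2, 10, 64)
print(moe(tokens).shape)  # torch.Size([2, 10, 64])
```

Each token activates only top_k of the experts, so compute per token stays roughly constant as the number of experts grows; this is the property that MoE language models exploit at scale.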

mergoo (398 stars, 25 forks)

A library for easily merging multiple LLM experts and efficiently training the merged LLM.