vLLM

4 repositories owned by vLLM

vllm — A high-throughput and memory-efficient inference and serving engine for LLMs
66.6k stars · 12.3k forks · 66.6k watchers

aibrix — Cost-efficient and pluggable infrastructure components for GenAI inference
3.3k stars · 310 forks

vllm-omni — A framework for efficient model inference with omni-modality models
1.9k stars · 241 forks · 1.9k watchers

semantic-router — A system-level intelligent router for Mixture-of-Models
2.6k stars · 368 forks · 2.6k watchers