vLLM

4 repositories owned by vLLM

vllm
57.1k Stars · 9.9k Forks

A high-throughput and memory-efficient inference and serving engine for LLMs
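As a quick illustration of the workflow vllm targets, here is a minimal offline-inference sketch using its documented Python entry point; the model name and prompts are only examples.

```python
from vllm import LLM, SamplingParams

# Example prompts and sampling settings; values here are illustrative.
prompts = ["Hello, my name is", "The capital of France is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Load a model (any HF-compatible model id works; this one is just an example).
llm = LLM(model="facebook/opt-125m")

# Generate completions for all prompts in one batch.
outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```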

aibrix
3.3k Stars · 310 Forks

Cost-efficient and pluggable Infrastructure components for GenAI inference

vllm-omni
1.7k Stars · 215 Forks

A framework for efficient model inference with omni-modality models

semantic-router
2.6k Stars · 368 Forks

System Level Intelligent Router for Mixture-of-Models
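To make the routing idea concrete, here is a toy sketch of semantic routing for a mixture of models, assuming a simple bag-of-words similarity and made-up model names; it illustrates the concept only and is not the semantic-router project's actual API.

```python
# Hypothetical sketch: pick which model in a mixture should serve a request,
# based on similarity between the request and short descriptions of each
# model's strengths. Route names and the scoring method are assumptions.
from collections import Counter
import math

ROUTES = {
    "code-model": "write debug refactor python function code programming",
    "math-model": "solve equation integral proof arithmetic calculate",
    "chat-model": "chat summarize explain translate write email story",
}

def _vec(text: str) -> Counter:
    # Bag-of-words term counts for a crude "semantic" signature.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def route(prompt: str) -> str:
    """Return the name of the model whose description best matches the prompt."""
    pv = _vec(prompt)
    return max(ROUTES, key=lambda name: _cosine(pv, _vec(ROUTES[name])))

if __name__ == "__main__":
    print(route("Please refactor this python function"))    # code-model
    print(route("Calculate the integral of x squared"))     # math-model
```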