model-inference topic
BentoML
The easiest way to serve AI apps and models - Build reliable inference APIs, LLM apps, multi-model chains, RAG services, and much more!
OpenLLM
Run any open-source LLM, such as Llama 3.1 or Gemma, as an OpenAI-compatible API endpoint in the cloud.
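Because OpenLLM exposes an OpenAI-compatible endpoint, clients talk to it with standard chat-completions requests. A minimal sketch of such a request body; the model name and the localhost URL in the comment are illustrative assumptions, not values mandated by OpenLLM:

```python
import json

# OpenAI-style chat-completions payload; the model name here is an
# assumption for illustration and depends on what the server is running.
payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 64,
}
body = json.dumps(payload)
# This body would be POSTed to e.g. http://localhost:3000/v1/chat/completions
# with the header "Content-Type: application/json".
print(body)
```

Any OpenAI-compatible client library can be pointed at such an endpoint by overriding its base URL, which is what makes drop-in replacement possible.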
Awesome-EdgeAI
Resources of our survey paper "A Systematic Review of AI Deployment on Resource-Constrained Edge Devices: Challenges, Techniques, and Applications"
CLIP-API-service
CLIP as a service - Embed images and sentences, object recognition, visual reasoning, image classification, and reverse image search
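Reverse image search over CLIP embeddings typically reduces to ranking stored vectors by cosine similarity against a query vector. A stdlib-only sketch of that ranking step, using made-up 4-dimensional vectors in place of real CLIP outputs (which are typically 512- or 768-dimensional):

```python
import math

def cosine(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings"; in practice these would come from a CLIP image encoder.
query = [0.9, 0.1, 0.0, 0.4]
index = {
    "cat.jpg": [0.8, 0.2, 0.1, 0.5],
    "car.jpg": [0.0, 0.9, 0.7, 0.1],
}

# Rank indexed images by similarity to the query embedding.
best = max(index, key=lambda name: cosine(query, index[name]))
print(best)  # "cat.jpg" is most similar to this query vector
```

The same dot-product ranking works for text-to-image search, since CLIP maps images and sentences into a shared embedding space.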
Image_captioning
Generating image captions using various neural network architectures
edge-tpu-silva
Streamlines the setup of PyCoral for running TensorFlow Lite models on a USB Edge TPU accelerator.