local-ai topic
anything-llm
The all-in-one desktop & Docker AI application with built-in RAG, AI agents, a no-code agent builder, and more.
Memory-Cache
MemoryCache is an experimental development project to turn a local desktop environment into an on-device AI agent
kobold_assistant
Like ChatGPT's voice conversations with an AI, but entirely offline, private, and trade-secret-friendly, using local AI models such as Llama 2 and Whisper.
ollama-telegram
🦙 Ollama Telegram bot, with advanced configuration
maid
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
mlx-vlm
MLX-VLM is a package for running Vision LLMs locally on your Mac using MLX.
maid_llm
maid_llm is a Dart implementation of llama.cpp used by the mobile artificial intelligence distribution (maid).
sibila
Extract structured data from local or remote LLMs
hollama
A minimal web-UI for talking to Ollama servers
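Several of the projects above (ollama-telegram, maid, hollama) are clients for a local Ollama server, which exposes a small HTTP API on port 11434 by default. A minimal sketch of a non-streaming call to the documented `/api/generate` endpoint, using only the standard library (the model name `llama3` is an example and must already be pulled):

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434"


def build_generate_request(model: str, prompt: str, stream: bool = False) -> bytes:
    """Build the JSON body for Ollama's POST /api/generate endpoint.

    stream=False asks the server to return one complete JSON object
    instead of a stream of newline-delimited chunks.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode()


def generate(model: str, prompt: str) -> str:
    """Send a generate request to a running Ollama server and return the text."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a running server and a pulled model, e.g.: ollama pull llama3
    print(generate("llama3", "Why is the sky blue?"))
```

The same endpoint underlies all of these UIs; chat-style clients typically use `/api/chat` with a messages array instead.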
shinkai-apps
Shinkai is a two-click-install AI manager (Ollama-compatible, for Windows, Mac, and Linux). It lets you download and use AI models, supports RAG, and will soon perform actions for you with tooling.