



LocalAI


πŸ’‘ Get help - ❓ FAQ πŸ’­ Discussions πŸ’¬ Discord πŸ“– Documentation website

πŸ’» Quickstart πŸ–ΌοΈ Models πŸš€ Roadmap πŸ₯½ Demo 🌍 Explorer πŸ›« Examples


LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that is compatible with the OpenAI (and Elevenlabs, Anthropic, ...) API specifications for local AI inferencing. It allows you to run LLMs and generate images, audio, and more, locally or on-prem with consumer-grade hardware, supporting multiple model families. No GPU is required. It is created and maintained by Ettore Di Giacinto.


Run the installer script:

curl https://localai.io/install.sh | sh

Or run with docker:

# CPU only image:
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu

# Nvidia GPU:
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12

# CPU and GPU image (bigger size):
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest

# AIO images (pre-download a set of models ready for use; see https://localai.io/basics/container/)
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
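Once a container is up, you can check that the API is reachable. This is a minimal sketch: it assumes the default port mapping above, and the `||` fallbacks keep the commands harmless when no server is listening yet.

```shell
# Readiness probe, then list the models the server knows about.
curl -sf -m 2 http://localhost:8080/readyz || echo "LocalAI is not reachable yet"
curl -s -m 2 http://localhost:8080/v1/models || true
```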

To load models:

# From the model gallery (see available models with `local-ai models list`, in the WebUI from the model tab, or visiting https://models.localai.io)
local-ai run llama-3.2-1b-instruct:q4_k_m
# Start LocalAI with the phi-2 model directly from huggingface
local-ai run huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf
# Install and run a model from the Ollama OCI registry
local-ai run ollama://gemma:2b
# Run a model from a configuration file
local-ai run https://gist.githubusercontent.com/.../phi-2.yaml
# Install and run a model from a standard OCI registry (e.g., Docker Hub)
local-ai run oci://localai/phi-2:latest
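With a model loaded, any OpenAI-style chat completion request should work against the local endpoint. A sketch, assuming the llama-3.2-1b-instruct model installed above and a LocalAI instance listening on localhost:8080:

```shell
# Build the OpenAI-compatible request body, then POST it to the local server.
# The trailing || keeps the snippet harmless if no server is running.
BODY='{"model":"llama-3.2-1b-instruct","messages":[{"role":"user","content":"How are you?"}]}'
curl -s -m 5 http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$BODY" || echo "request failed (is LocalAI running?)"
```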

πŸ’» Getting started

πŸ“° Latest project news

  • Dec 2024: stablediffusion.cpp backend (ggml) added ( https://github.com/mudler/LocalAI/pull/4289 )
  • Nov 2024: Bark.cpp backend added ( https://github.com/mudler/LocalAI/pull/4287 )
  • Nov 2024: Voice activity detection models (VAD) added to the API: https://github.com/mudler/LocalAI/pull/4204
  • Oct 2024: examples moved to LocalAI-examples
  • Aug 2024: πŸ†• FLUX-1, P2P Explorer
  • July 2024: πŸ”₯πŸ”₯ πŸ†• P2P Dashboard, LocalAI Federated mode and AI Swarms: https://github.com/mudler/LocalAI/pull/2723
  • June 2024: πŸ†• You can now browse the model gallery without LocalAI! Check out https://models.localai.io
  • June 2024: Support for models from OCI registries: https://github.com/mudler/LocalAI/pull/2628
  • May 2024: πŸ”₯πŸ”₯ Decentralized P2P llama.cpp: https://github.com/mudler/LocalAI/pull/2343 (peer2peer llama.cpp!) πŸ‘‰ Docs https://localai.io/features/distribute/
  • May 2024: πŸ”₯πŸ”₯ Openvoice: https://github.com/mudler/LocalAI/pull/2334
  • May 2024: πŸ†• Function calls without grammars and mixed mode: https://github.com/mudler/LocalAI/pull/2328
  • May 2024: πŸ”₯πŸ”₯ Distributed inferencing: https://github.com/mudler/LocalAI/pull/2324
  • May 2024: Chat, TTS, and Image generation in the WebUI: https://github.com/mudler/LocalAI/pull/2222
  • April 2024: Reranker API: https://github.com/mudler/LocalAI/pull/2121

Roadmap items: List of issues

πŸ”₯πŸ”₯ Hot topics (looking for help):

  • Multimodal with vLLM and Video understanding: https://github.com/mudler/LocalAI/pull/3729
  • Realtime API https://github.com/mudler/LocalAI/issues/3714
  • πŸ”₯πŸ”₯ Distributed, P2P Global community pools: https://github.com/mudler/LocalAI/issues/3113
  • WebUI improvements: https://github.com/mudler/LocalAI/issues/2156
  • Backends v2: https://github.com/mudler/LocalAI/issues/1126
  • Improving UX v2: https://github.com/mudler/LocalAI/issues/1373
  • Assistant API: https://github.com/mudler/LocalAI/issues/1273
  • Moderation endpoint: https://github.com/mudler/LocalAI/issues/999
  • Vulkan: https://github.com/mudler/LocalAI/issues/1647
  • Anthropic API: https://github.com/mudler/LocalAI/issues/1808

If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22

πŸš€ Features

πŸ’» Usage

Check out the Getting started section in our documentation.
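Because the API follows the OpenAI specification, existing OpenAI-compatible tooling can often be repointed at a local instance with environment variables alone. A sketch, assuming LocalAI on localhost:8080; the variable names are the ones read by the official OpenAI SDKs, and the key is a placeholder since LocalAI does not require a real key by default:

```shell
# Repoint OpenAI-compatible clients at the local server instead of api.openai.com.
export OPENAI_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="sk-anything"   # placeholder; only needed if your client insists on one
```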

πŸ”— Community and integrations

Build and deploy custom containers:

  • https://github.com/sozercan/aikit

WebUIs:

  • https://github.com/Jirubizu/localai-admin
  • https://github.com/go-skynet/LocalAI-frontend
  • QA-Pilot (an interactive chat project that leverages LocalAI LLMs for rapid understanding and navigation of GitHub code repositories) https://github.com/reid41/QA-Pilot

Model galleries

  • https://github.com/go-skynet/model-gallery

Other:

  • Helm chart https://github.com/go-skynet/helm-charts
  • VSCode extension https://github.com/badgooooor/localai-vscode-plugin
  • Terminal utility https://github.com/djcopley/ShellOracle
  • Local Smart assistant https://github.com/mudler/LocalAGI
  • Home Assistant https://github.com/sammcj/homeassistant-localai / https://github.com/drndos/hass-openai-custom-conversation / https://github.com/valentinfrlch/ha-gpt4vision
  • Discord bot https://github.com/mudler/LocalAGI/tree/main/examples/discord
  • Slack bot https://github.com/mudler/LocalAGI/tree/main/examples/slack
  • Shell-Pilot (interact with LLMs using LocalAI models via pure shell scripts on your Linux or macOS system) https://github.com/reid41/shell-pilot
  • Telegram bot https://github.com/mudler/LocalAI/tree/master/examples/telegram-bot
  • Another Telegram Bot https://github.com/JackBekket/Hellper
  • Auto-documentation https://github.com/JackBekket/Reflexia
  • GitHub bot that answers issues, with code and documentation as context https://github.com/JackBekket/GitHelper
  • GitHub Actions: https://github.com/marketplace/actions/start-localai
  • Examples: https://github.com/mudler/LocalAI/tree/master/examples/

πŸ”— Resources

:book: πŸŽ₯ Media, Blogs, Social

Citation

If you use this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❀️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors, who support this project and cover its CI expenses. See our Sponsor list:


🌟 Star history

LocalAI Star history Chart

πŸ“– License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto [email protected]

πŸ™‡ Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

  • llama.cpp
  • https://github.com/tatsu-lab/stanford_alpaca
  • https://github.com/cornelk/llama-go for the initial ideas
  • https://github.com/antimatter15/alpaca.cpp
  • https://github.com/EdVince/Stable-Diffusion-NCNN
  • https://github.com/ggerganov/whisper.cpp
  • https://github.com/rhasspy/piper

πŸ€— Contributors

This is a community project, a special thanks to our contributors! πŸ€—