gguf topic
node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
catai
Run an AI ✨ assistant locally, with a simple API for Node.js 🚀
LLM.swift
LLM.swift is a simple and readable library for interacting with large language models locally, with support for macOS, iOS, watchOS, tvOS, and visionOS.
shady.ai
Making offline AI models accessible to all types of edge devices.
llama_ros
llama.cpp (GGUF LLMs) and llava.cpp (GGUF VLMs) for ROS 2
cog-models
A collection of Cog models for use on Replicate
maid
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
blitz-embed
C++ inference wrappers for running blazing-fast embedding services on your favourite serverless platforms, such as AWS Lambda. By Prithivi Da; PRs welcome.
ImpAI
😈 ImpAI is an advanced role-play app using large language and diffusion models.