
A wannabe Ollama equivalent for Apple MLX models

PyOMlx

Serve MLX models locally!

Motivation

Inspired by the Ollama project, I wanted a similar experience for serving MLX models. MLX, from Apple's ml-explore team, is a framework for running ML models on Apple Silicon. This app is intended to be used alongside PyOllaMx.

I'm using these tools in my day-to-day workflow, and I intend to keep developing them for my own use and benefit.

If you find this valuable, feel free to use it and contribute to this project as well. Please ⭐️ this repo to show your support and make my day!

I'm planning to work on the items listed in roadmap.md. Feel free to comment your thoughts (if any) and influence my work (if interested).

macOS DMGs are available on the Releases page.

How to use

  1. Download and install the PyOMlx macOS app

  2. Run the app

  3. You will now see the application running in the system tray. Use PyOllaMx to chat with MLX models seamlessly.

Demo

https://github.com/kspviswa/pyOllaMx/assets/7476271/dc686d60-182d-4f90-a771-9c1df1c70b5c

Features

v0.0.3

  • Updated mlx-lm to support Gemma models

v0.0.1

  • Automatically discover and serve MLX models downloaded from the MLX Hugging Face community (mlx-community)
  • Easy start-up / shutdown via the macOS app
  • System tray indication
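As a rough illustration of how model discovery along these lines can work (this is a hedged sketch, not PyOMlx's actual implementation): models pulled with `mlx-lm` or `huggingface_hub` land in the local Hugging Face cache, whose directory names encode the repo id, so a local server can enumerate the `mlx-community` repos found there.

```python
from pathlib import Path

# Default Hugging Face hub cache location (assumption: models were
# downloaded with standard HF tooling and the cache was not relocated).
HF_CACHE = Path.home() / ".cache" / "huggingface" / "hub"

def list_mlx_models(cache_dir: Path = HF_CACHE) -> list[str]:
    """Return repo ids of cached models from the mlx-community org.

    Hugging Face cache directories are named ``models--{org}--{name}``,
    so we reverse that encoding to recover the repo id. (Sketch only:
    a repo name containing a literal ``--`` would be mis-split.)
    """
    if not cache_dir.exists():
        return []
    models = []
    for entry in cache_dir.glob("models--mlx-community--*"):
        repo_id = entry.name.removeprefix("models--").replace("--", "/")
        models.append(repo_id)
    return sorted(models)

if __name__ == "__main__":
    for repo_id in list_mlx_models():
        print(repo_id)
```

Running this prints one repo id per line for each cached `mlx-community` model (or nothing if none have been downloaded yet).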