ollama-gui
A Web Interface for chatting with your local LLMs via the ollama API
Ollama GUI is a web interface for ollama.ai, a tool that enables running Large Language Models (LLMs) on your local machine.
🛠 Installation
Prerequisites
- Download and install the ollama CLI.
- Download and install Yarn and Node.js.

Then pull a model and start the Ollama server:

```shell
ollama pull <model-name>
ollama serve
```
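With the server running, you can sanity-check that the API is reachable before starting the GUI (a minimal sketch: Ollama listens on http://localhost:11434 by default, and the model name below is only an example — use whichever model you pulled):

```shell
# Ask the local Ollama API for a single non-streaming completion.
# Replace "llama2" with the model you pulled above.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama2", "prompt": "Say hello", "stream": false}'
```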
Getting Started
- Clone the repository and start the development server:

```shell
git clone https://github.com/HelgeSverre/ollama-gui.git
cd ollama-gui
yarn install
yarn dev
```
Or use the hosted web version by running ollama with the following origins setting (docs):

```shell
OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve
```
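`OLLAMA_ORIGINS` tells the Ollama server which browser origins may make cross-origin requests to it. It accepts a comma-separated list, so you can allow both the hosted GUI and a local development copy at the same time (the localhost port below is an assumption based on Vite's default dev-server port):

```shell
# Allow the hosted GUI and a local Vite dev server (default port 5173 assumed).
OLLAMA_ORIGINS=https://ollama-gui.vercel.app,http://localhost:5173 ollama serve
```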
Models
For convenience and copy-pastability, here is a table of interesting models you might want to try out.
For a complete list of models Ollama supports, go to ollama.ai/library.
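Any model from the library can be pulled by name; for example (the model name here is illustrative — check the library for current model names and tags):

```shell
ollama pull mistral                         # download the model
ollama run mistral "Why is the sky blue?"   # or chat with it directly from the CLI
```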
📋 To-Do List
- [x] Properly format newlines in the chat message (PHP-land has `nl2br`; basically want the same thing)
- [x] Store chat history locally using IndexedDB
- [x] Clean up the code, I made a mess of it for the sake of speed and getting something out the door.
- [x] Add markdown parsing lib
- [ ] Allow browsing and installation of available models (library)
- [ ] Ensure mobile responsiveness (non-prioritized use-case atm.)
- [ ] Add file uploads with OCR and stuff.
🛠 Built With
- Ollama.ai - CLI tool for running LLMs locally.
- LangUI
- Vue.js
- Vite
- Tailwind CSS
- VueUse
- @tabler/icons-vue
📝 License
Licensed under the MIT License. See the LICENSE.md file for details.