A.I Joe


Hello! Thank you so much for your interest in beta testing Belullama! We're thrilled to have you on board, especially with your impressive setup - a Windows 11 system with...

Please try the GPU-supported version. To install the GPU version of Belullama, which includes Ollama, Open WebUI, and Automatic1111, use the following command:

```bash
curl -s https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh | sudo...
```
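If you would rather not pipe the script straight into `sudo`, a cautious variant (same installer URL as above; the exact tail of the original command is truncated, so this is only a sketch) is to download and review the installer first:

```bash
# Fetch the GPU installer, inspect it, then run it explicitly as root.
curl -fsSL -o belullama_installer_gpu.sh \
  https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh
less belullama_installer_gpu.sh    # review what the script will do
sudo bash belullama_installer_gpu.sh
```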

Hello Andrei, Thank you for your interest in testing Belullama with your RTX A2000 GPU. We appreciate your enthusiasm and willingness to contribute to the project! We're excited to inform...

Please try the GPU-supported version. To install the GPU version of Belullama, which includes Ollama, Open WebUI, and Automatic1111, use the following command:

```bash
curl -s https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh | sudo...
```

Next updates will feature:
- Support for existing Ollama installations
- Option to use custom LLM directories
- Choice between using existing Ollama or installing a new instance
- Improved...

> Thanks for your contribution. We will test it. If the app works well and is of good enough quality, we will merge it. Thank you for considering Belullama for...

To determine whether Belullama is using your GPU instead of your CPU, you can follow these steps (see the sketch after this list):
1. Monitor GPU usage:
   - For NVIDIA GPUs, use the `nvidia-smi` command in...
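As a minimal sketch of the NVIDIA case in step 1 (assuming the stock `nvidia-smi` utility is present on the host and, for the second command, a reasonably recent Ollama release that provides `ollama ps`):

```bash
# Poll GPU utilization and VRAM usage once per second. If Belullama's Ollama
# backend is running on the GPU, both numbers should climb while a prompt is
# being processed.
nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv -l 1

# Newer Ollama releases can also report this directly: the PROCESSOR column
# of `ollama ps` shows whether a loaded model sits on the GPU or the CPU.
ollama ps
```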

https://github.com/ai-joe-git/Belullama

> ### App Information
> * Name: Ollama
> * Short Description: App for running LLM
> * Official Website: https://ollama.com
> * GitHub Repository: https://github.com/ollama/ollama
> * Docker...
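The Docker details of the submission are truncated above; for reference, the upstream Ollama image is commonly started like this (a sketch of the stock `ollama/ollama` invocation, not necessarily the exact configuration Belullama ships):

```bash
# Run the upstream Ollama image with a persistent model volume and the default
# API port exposed; --gpus=all enables NVIDIA GPU acceleration (requires the
# NVIDIA Container Toolkit on the host).
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```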