
GPU Usage

Boundzero opened this issue 1 year ago • 2 comments

So I gave it a lengthy prompt to see if I could tell whether it's using my CPU or GPU. How do I tell if it's using the GPU? When I look at my CPU usage while it's writing the response, I see my CPU usage go to between 51 and 54%.

Boundzero avatar Aug 29 '24 01:08 Boundzero

To determine if Belullama is using your GPU instead of your CPU, you can follow these steps:

  1. Monitor GPU usage:

    • For NVIDIA GPUs, use the nvidia-smi command in a terminal:
      watch -n 1 nvidia-smi
      
    • This will show GPU usage, memory consumption, and processes using the GPU.
    • Look for Ollama or related processes in the list.
  2. Check Ollama logs:

    • Ollama logs its GPU detection at startup; look for lines mentioning CUDA or your GPU.
    • If Ollama runs as a systemd service, follow the logs with:
      sudo journalctl -u ollama -f
      
    • If it runs inside a Docker container (as with Belullama), use docker logs with your container's name instead.
  3. Verify Ollama's GPU detection:

    • In recent Ollama versions, run this while a model is loaded:
      ollama ps
      
    • The PROCESSOR column shows whether the model is running on the GPU or the CPU.
  4. Monitor system resources:

    • Use tools like htop or top to monitor CPU usage.
    • If CPU usage stays high across many cores while Ollama is generating, it is likely running on the CPU rather than the GPU.
  5. Check Ollama version:

    • Ensure you're using a GPU-compatible version of Ollama.
    • Run:
      ollama --version
      
  6. Verify CUDA installation:

    • For NVIDIA GPUs, ensure CUDA is properly installed:
      nvidia-smi
      nvcc --version
      
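The driver and toolkit checks in steps 1 and 6 can be sketched as a small helper script. This is a hypothetical example, not part of Belullama; the function name is this sketch's own:

```shell
#!/bin/sh
# Report whether the NVIDIA driver utility and the CUDA compiler
# are visible on PATH, with a hint about what their absence means.

check_tool() {
    # $1 = command name, $2 = what its absence suggests
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: found"
    else
        echo "$1: NOT found ($2)"
    fi
}

check_tool nvidia-smi "NVIDIA driver likely not installed"
check_tool nvcc "CUDA toolkit not on PATH (the driver alone can still be enough for Ollama)"
```

Note that nvcc missing is not necessarily a problem: Ollama ships its own CUDA runtime libraries, so only the driver (nvidia-smi) is strictly required.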

If you're seeing sustained high CPU usage (51-54%) during responses, Ollama may be running on the CPU. (Some CPU activity is normal even during GPU inference, so confirm with nvidia-smi rather than CPU usage alone.) This could be due to:

  1. Incorrect GPU setup in Ollama
  2. Model not optimized for GPU use
  3. GPU drivers or CUDA not properly installed or recognized

To address this:

  1. Double-check the Belullama GPU installation process
  2. Ensure your GPU drivers and CUDA are up-to-date
  3. Try running a known GPU-compatible model explicitly

If issues persist, you might need to review the Belullama GPU installation script or check for any error messages during the setup process.

ai-joe-git avatar Sep 05 '24 22:09 ai-joe-git

So I tried your commands and none of them work; it doesn't even pick up that Ollama is installed, even though I can run it. The -smi commands all return "not found". I can try reinstalling it, but could you perhaps put up a better install guide for the Nvidia beta test from scratch? I had your original Docker version installed, then installed the Nvidia one.

Boundzero avatar Sep 06 '24 00:09 Boundzero