
Using devika on Docker Compose With External Ollama Server

Open hqnicolas opened this issue 10 months ago • 7 comments

Congratulations, this devika project is an amazing piece of art!

All changes made to hqnicolas devika

Remove the Ollama server from docker compose

EDIT: docker-compose.yaml

version: "3.9"

services:
  devika-backend-engine:
    build:
      context: .
      dockerfile: devika.dockerfile
    expose:
      - 1337
    ports:
      - 1337:1337
    environment:
      - OLLAMA_HOST=http://192.168.0.21:11434
    healthcheck:
      test: ["CMD-SHELL", "curl -f http://localhost:1337/ || exit 1"]
      interval: 5s
      timeout: 30s
      retries: 5
      start_period: 30s
    volumes:
      - devika-backend-dbstore:/home/nonroot/devika/db
    networks:
      - devika-subnetwork

  devika-frontend-app:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - VITE_API_BASE_URL=http://127.0.0.1:1337
    depends_on:
      - devika-backend-engine
    expose:
      - 3000
    ports:
      - 3000:3000
    networks:
      - devika-subnetwork

networks:
  devika-subnetwork:

volumes:
  devika-backend-dbstore:
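If your external Ollama server lives at a different address, the `OLLAMA_HOST` value can be overridden without editing the main file. A minimal sketch using the standard `docker-compose.override.yaml` merge mechanism (the IP below is a placeholder, not from this thread):

```yaml
# docker-compose.override.yaml -- picked up automatically by `docker compose up`
services:
  devika-backend-engine:
    environment:
      - OLLAMA_HOST=http://10.0.0.5:11434   # replace with your Ollama server
```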

Stop messing with the non-root user in docker compose! (The Dockerfile below simply runs everything as root.)

EDIT: devika.dockerfile

FROM debian:12

# setting up os env
USER root
WORKDIR /home/nonroot/devika

ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1

# setting up python3
RUN apt-get update && apt-get upgrade -y
RUN apt-get install -y build-essential software-properties-common curl sudo wget git
RUN apt-get install -y python3 python3-pip
RUN curl -fsSL https://astral.sh/uv/install.sh | sudo -E bash -
RUN $HOME/.cargo/bin/uv venv
ENV PATH="/home/nonroot/devika/.venv/bin:$HOME/.cargo/bin:$PATH"

# copy devika python engine only
COPY requirements.txt /home/nonroot/devika/
RUN UV_HTTP_TIMEOUT=100000 $HOME/.cargo/bin/uv pip install -r requirements.txt 
RUN playwright install --with-deps chromium

COPY src /home/nonroot/devika/src
COPY config.toml /home/nonroot/devika/
COPY devika.py /home/nonroot/devika/
RUN chown -R root:root /home/nonroot/devika

USER root
WORKDIR /home/nonroot/devika
ENV PATH="/home/nonroot/devika/.venv/bin:$HOME/.cargo/bin:$PATH"
RUN mkdir /home/nonroot/devika/db

ENTRYPOINT [ "python3", "-m", "devika" ]

Make sure your Ollama server has these models:

  • openchat:7b-v3.5-1210-q5_K_M (4.8GB)
  • mistral-openorca:7b-q5_K_M (4.8GB)
  • qwen:14b-chat-v1.5-q4_K_M (8.6GB)
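The three models above can be fetched on the Ollama host with `ollama pull` (this is standard Ollama CLI usage, not a devika-specific step; run it on the machine serving port 11434):

```shell
# Pull the models listed above onto the external Ollama server
for m in \
  openchat:7b-v3.5-1210-q5_K_M \
  mistral-openorca:7b-q5_K_M \
  qwen:14b-chat-v1.5-q4_K_M
do
  ollama pull "$m"
done

# Confirm all three are present
ollama list
```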

Put your Bing API key in the config, using the endpoint BING = "https://api.bing.microsoft.com/v7.0/search"
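A sketch of what that looks like in config.toml. The section and key names here are assumptions based on the endpoint quoted above; check the sample config shipped with your devika version:

```toml
# config.toml (sketch -- section/key names may differ per devika version)
[API_ENDPOINTS]
BING = "https://api.bing.microsoft.com/v7.0/search"

[API_KEYS]
BING = "<your-bing-search-api-key>"
```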

hqnicolas avatar Mar 29 '24 23:03 hqnicolas

[screenshot from 2024-03-29] Amazing Work! Thank you!

hqnicolas avatar Mar 29 '24 23:03 hqnicolas

Great one, can it work on a M1?

MalteBoehm avatar Mar 30 '24 09:03 MalteBoehm

can it work on a M1?

@MalteBoehm The only ARM board I have tested it on is the RK3566, and it works... Ollama ran externally on an RX 7800 XT desktop.

hqnicolas avatar Mar 30 '24 13:03 hqnicolas

Great one, can it work on a M1?

I just tested it with an M1, and it does work. It took me some time to set up, but after that it worked great.

ItsNeil17 avatar Mar 31 '24 10:03 ItsNeil17

[screenshot] Not able to create a project; it says devika is inactive.

janvi2021 avatar Apr 01 '24 09:04 janvi2021

[screenshot] Not able to create a project; it says devika is inactive.

Same thing is happening for me as well. After reselecting the project option the agent becomes active, but it is not able to generate anything corresponding to my request. I ran it a couple of times but nothing happened.

subhajit20 avatar Apr 01 '24 15:04 subhajit20

@subhajit20 and @janvi2021

stitionai/devika works fine for me today; it was fixed in today's version. I will fork it and apply the docker compose changes for external Ollama: https://github.com/hqnicolas/devika

hqnicolas avatar Apr 01 '24 22:04 hqnicolas

@hqnicolas thank you for the last update. Unfortunately I have the same symptoms as @subhajit20 and @janvi2021 with the latest push: the web page is up, so the frontend is OK, but the backend is not fully running. NOTE: is there a specific reason to have the openchat, mistral, and xxx models on the external Ollama side? I didn't see these LLMs in the config files (maybe that is why the backend is not working...). Unfortunately there is no debug output in the backend container... It seems the healthcheck on port 1337 is failing (container "not healthy"), but I can't see why the service is not working :(
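For a failing healthcheck like the one described above, a few standard Docker commands usually surface the cause. A sketch (not devika-specific) using the service name from the compose file in this thread:

```shell
# Health status of all services
sudo docker compose ps

# Backend logs -- Python tracebacks from devika.py usually land here
sudo docker compose logs devika-backend-engine

# Raw healthcheck attempts recorded by Docker (exit codes + probe output)
sudo docker inspect --format '{{json .State.Health}}' \
  "$(sudo docker compose ps -q devika-backend-engine)"

# Run the same probe the healthcheck uses, from inside the container
sudo docker compose exec devika-backend-engine curl -f http://localhost:1337/
```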

sebj84 avatar Apr 02 '24 19:04 sebj84

@sebj84 are you on Ollama? The latest Devika version was working with Ollama; I have frozen it in my repo:

git clone https://github.com/hqnicolas/devika
cd devika
(create the config file)
(edit the docker compose to point at your Ollama URL)
sudo docker compose build
sudo docker compose up

hqnicolas avatar Apr 02 '24 20:04 hqnicolas