
feat: Add docker and docker-compose support

Open sammcj opened this issue 2 years ago • 3 comments

Adds the ability to run open-interpreter in a container.

Wrote it for my own use, but thought I'd submit a PR if you want it, if not - feel free to close.

Example docker-compose also provided.

  • fixes #72
  • relates to #188
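
For illustration, a minimal docker-compose.yml for this kind of setup might look something like the following (the service name, volume path, and env_file entry are assumptions for the sketch, not necessarily the exact file in this PR):

services:
  open-interpreter:
    build: .
    stdin_open: true
    tty: true
    env_file:
      - .env
    volumes:
      - ./sandbox:/sandbox

With a file like this, the container can be started interactively with docker compose run open-interpreter.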

sammcj avatar Sep 09 '23 00:09 sammcj

Hi @sammcj, thanks for the contribution. This looks fantastic and perfectly implemented. May I ask what your use case for containerizing it was?

I can imagine it being appealing to let it run rampant in a container to solve something, just feeling safer that it won't damage my system, especially for the less-aligned models.

KillianLucas avatar Sep 09 '23 10:09 KillianLucas

Hey @KillianLucas no worries at all!

It was twofold:

  1. I saw the project and was like "oh I want to try that" then "oh damn - I'd have to install language dependencies on my server to do so".

I don't generally install anything onto the underlying OS of any servers I run, I always contain software to keep it portable, isolated and disposable. Many modern operating systems don't even have a R/W base OS (e.g. Fedora CoreOS) so you have to run things from containers anyway.

  2. I like to give back to open-source projects when my time allows, so when I see a potentially useful project that needs a PR I know I can write, I try to submit one without demanding it gets merged (no skin off my back if the authors don't want or agree with it). In this case I figured even if I don't use it, maybe someone else will.

sammcj avatar Sep 09 '23 22:09 sammcj

This is a very cool implementation, though it adds a lot of potential maintenance debt for the ability to use local LLMs within the container.

A simpler, highly usable out-of-the-box solution with the OpenAI API would be a PR that:

  1. defined a Dockerfile in the root directory of the repository
  2. added a README.md entry for "Running Open-Interpreter in Docker"

Dockerfile:

# Use the official Python 3.10 image as a base image
FROM python:3.10

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Install 'open-interpreter' using pip
RUN pip install open-interpreter

# Default command
CMD ["interpreter"]

Make a sandbox dir in the current working directory

mkdir sandbox

Build the docker image

docker build -t openai-interpreter:latest .

(Optional) Create a .env file and add any environment vars, e.g.

OPENAI_API_KEY=<Your key>

Run Open Interpreter in Docker (pass --env-file .env if you created one)

docker run -it --env-file .env -v ./sandbox:/sandbox openai-interpreter:latest

Output:

leif@DESKTOP-EN4VCEP:/mnt/c/Users/leifkt/killian/open-interpreter-docker$ docker run -it --env-file .env -v ./sandbox:/sandbox openai-interpreter:latest

▌ Model set to GPT-4

Open Interpreter will require approval before running code.

Use interpreter -y to bypass this.

Press CTRL-C to exit.

>

This would allow us to publish a versioned Docker image to a public container registry, tied to the specific tag of the Open Interpreter release, and would let anyone with Docker installed immediately play with Open Interpreter locally in a safe sandbox.

A user wouldn't need to have pip installed or clone the repo; simply:

docker pull open-interpreter:<version>
docker run -it open-interpreter:<version>
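
Publishing a versioned image on each release tag could be automated along these lines (a hypothetical GitHub Actions workflow; the registry path, <org> placeholder, and tag pattern are assumptions, not anything in this repo):

name: publish-docker
on:
  push:
    tags: ['v*']
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Authenticate to the registry before pushing
      - run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
      # Tag the image with the git tag that triggered the workflow
      - run: docker build -t ghcr.io/<org>/open-interpreter:${{ github.ref_name }} .
      - run: docker push ghcr.io/<org>/open-interpreter:${{ github.ref_name }}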

leifktaylor avatar Oct 15 '23 21:10 leifktaylor

Sorry for the extreme delay on this, Sam! Very clever engineering around the architecture of Open Interpreter at the time; ideally we can make the Docker experience even simpler. Will close this PR in favor of a simpler Dockerfile as Leif mentioned.

KillianLucas avatar Mar 28 '24 02:03 KillianLucas