Open-Assistant

Easier way to set up an instance

Open: R7JANV1 opened this issue 2 years ago • 4 comments

Perhaps an all-in-one image, or possibly a Helm chart?

R7JANV1 avatar Feb 10 '23 11:02 R7JANV1

When you say all-in-one, which services are you referencing from the compose file?

neonwatty avatar Feb 11 '23 13:02 neonwatty

All the containers that are minimally necessary to run an instance, or perhaps the option to configure which containers are used, similar to Nextcloud AIO?
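
Something along the lines of Docker Compose profiles, for example. This is just a sketch of the idea, and the profile names below are invented rather than taken from the actual compose file:

```sh
# Only services tagged with the selected profiles are started;
# everything else in the compose file stays down.
docker compose --profile backend --profile web up -d
```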

ghost avatar Feb 13 '23 06:02 ghost

At the moment there's no virtual assistant to run; the code is for devs and testers building the platform.

But you can spin up the Docker environment by following the instructions at the top of docker-compose.yml. As long as you've got Docker installed, it's a one-liner that starts everything defined there.
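
Roughly along these lines; the exact command (and any --profile flags) is spelled out at the top of docker-compose.yml, so treat this as a sketch rather than the canonical invocation:

```sh
# From the repository root: build the images and start the services
# defined in docker-compose.yml.
docker compose up --build
```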

Also, for dev you can now just type `code .` in the repo directory and it'll prompt you to install the extensions, then reopen in a container with all the tools you need to work on the codebase. Or you can run it in a GitHub Codespace by clicking a link on the website.
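
In other words, the dev flow is basically this (assuming VS Code and its Dev Containers extension are installed):

```sh
# Open the repository in VS Code; it will offer to install the recommended
# extensions and reopen the folder inside the dev container.
code .
```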

When we have an assistant to run, it'll be the same kind of thing. It'll probably come with something you double-click to run it, but we don't have the assistant code yet, so there's no point in that just yet.

bitplane avatar Feb 13 '23 08:02 bitplane

For many who already have a running server with a hypervisor such as TrueNAS or Synology, it is almost impossible to get this great project up and running.

Maybe you can think about a single Docker container that can easily set up the Assistant with all its features.

That way, any environment that supports Docker could be used to run this project on your own server.

ghost avatar Feb 21 '23 09:02 ghost

To be clear, there is currently no Assistant to run. There are various services which can be run in Docker containers. When the model is available, you will be able to run it from the inference server we are currently developing, also using Docker.
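
As a purely illustrative sketch of what that might eventually look like once an image is published (the image name, tag, and port below are placeholders, not real artifacts):

```sh
# Hypothetical: run a future inference-server image and expose its API locally.
docker run -d -p 8000:8000 example/open-assistant-inference:latest
```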

olliestanley avatar Feb 25 '23 00:02 olliestanley