Open-Assistant
Easier way to set up an instance
Could there be an all-in-one image, or perhaps a Helm chart?
When you say all-in-one, which services are you referencing from the compose file?
All the containers that are minimally necessary to run an instance. Or perhaps an option to configure which containers should be used, similar to Nextcloud AIO?
At the moment there's no virtual assistant to run; the code is for devs and testers building the platform.
But you can spin up the Docker environment by following the instructions in the comment at the top of docker-compose.yml. As long as you have Docker installed, a one-liner starts everything that's there.
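To sketch what that one-liner workflow looks like: the authoritative commands live in the comment at the top of docker-compose.yml itself, so the exact flags and profiles below are illustrative, not definitive.

```shell
# Clone the repo and start the dev stack (flags are a hypothetical example;
# check the comment at the top of docker-compose.yml for the real invocation).
git clone https://github.com/LAION-AI/Open-Assistant.git
cd Open-Assistant

# Build and start every service defined in the compose file, in the background.
docker compose up --build --detach

# List the running containers to confirm the stack came up.
docker compose ps

# Tear everything down again when you're done.
docker compose down
```

Everything runs inside containers, so the only host requirement is a working Docker installation.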
Also, for development, you can now just run `code .` in the repo directory and it'll prompt you to install the recommended extensions, then reopen in a dev container with all the tools you need to work on the codebase. Or you can run it in a GitHub Codespace by clicking a link on the website.
When we have an assistant to run, it'll work the same way. It'll probably be something you can double-click to start, but since we don't have the assistant code yet, there's no point in building that just now.
For many who already have a running server with a hypervisor such as TrueNAS or Synology, it is almost impossible to get this great project up and running.
Maybe you could consider a single Docker container that can easily set up the Assistant with all its features.
Then any environment that supports Docker could be used to run this project on your own server.
To be clear, there is currently no Assistant to run. There are various services which can be run in Docker containers. When the model is available, you will be able to run it from the inference server we are currently developing, also using Docker.