
Safe storage options for containerized deployments

Open PhilsLab opened this issue 5 years ago • 2 comments

When running the pydio cells image, there is no guarantee that local volumes will stay attached across container restarts. When running on e.g. Kubernetes, chances are high that the Pydio instance will fail after the first container crash (even local volumes seem to require a hostname/IP).

A nice option would be to get rid of the default local datasources and allow configuring an S3 datasource as the sole default with the help of a few environment variables.

Example:

STORAGE_ENABLE_DEFAULT_S3=1
STORAGE_S3_ENDPOINT=https://mys3service:3000
STORAGE_S3_BUCKET=mybucket
STORAGE_S3_API_KEY=myapikeyhere
STORAGE_S3_API_SECRET=myapisecrethere
STORAGE_S3_SUBPATH=/my/path/

In this case, STORAGE_ENABLE_DEFAULT_S3=1 would switch from the local default datasources to an S3 datasource. This would open up the possibility of deploying on e.g. Kubernetes, and since the Cells instances could then (as far as I can tell) be replicated, multiple instances could be load-balanced.
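To illustrate, if such variables existed, a stateless Kubernetes Deployment could supply them directly, with the credentials drawn from a Secret. This is only a sketch of the proposal: the STORAGE_* variables are the hypothetical ones suggested above and do not exist in Cells today, and the Secret name cells-s3 is made up for the example.

```yaml
# Sketch only: STORAGE_* variables are the proposed ones, not a real Cells API.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: cells
spec:
  replicas: 2                      # replication becomes possible with shared S3 storage
  selector:
    matchLabels: {app: cells}
  template:
    metadata:
      labels: {app: cells}
    spec:
      containers:
        - name: cells
          image: pydio/cells
          env:
            - name: STORAGE_ENABLE_DEFAULT_S3
              value: "1"
            - name: STORAGE_S3_ENDPOINT
              value: https://mys3service:3000
            - name: STORAGE_S3_BUCKET
              value: mybucket
            - name: STORAGE_S3_SUBPATH
              value: /my/path/
            - name: STORAGE_S3_API_KEY
              valueFrom:               # credentials pulled from a Secret, not inlined
                secretKeyRef: {name: cells-s3, key: api-key}
            - name: STORAGE_S3_API_SECRET
              valueFrom:
                secretKeyRef: {name: cells-s3, key: api-secret}
```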

I also just noticed that some assets, such as custom images, are stored somewhere else; with this option enabled, those should be stored on a shared S3 service as well.

PhilsLab avatar Apr 02 '19 19:04 PhilsLab

Hi @PhilsLab, thanks for reporting. That could indeed be interesting. The default is defined via a config, so changing this at install/startup should not be super complicated. To be checked further.

cdujeu avatar Apr 05 '19 07:04 cdujeu

I think I was able to solve this by using an install-conf.yml as described here: https://github.com/pydio/cells/tree/master/tools/docker/compose/basic

Then I used the following install YAML:

frontendlogin: {$CELLS_ADMIN_USER}
frontendpassword: {$CELLS_ADMIN_PASSWORD}

# DB connection
dbconnectiontype: tcp
dbtcphostname: {$CELLS_DB_HOST}
dbtcpport: {$CELLS_DB_PORT}
dbtcpname: {$CELLS_DB_NAME}
dbtcpuser: {$CELLS_DB_USER}
dbtcppassword: {$CELLS_DB_PASSWORD}

# Storage connection
# "Amazon S3 / S3-compatible (store inside S3 buckets)"

# "S3-Compatible storage endpoint"
dstype: "S3"
dss3custom: {$CELLS_S3_URL}
dss3apikey: {$CELLS_S3_KEY}
dss3apisecret: {$CELLS_S3_SECRET}
dsname: "pydio"

I then set the values inside the {} placeholders as environment variables.
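One way to fill those placeholders at container start is a small entrypoint step that rewrites the template from the environment. This is a minimal sketch, not an official Cells mechanism: it assumes the {$VAR} placeholder style used in the file above (note that gettext's envsubst expects ${VAR} syntax, so plain sed is used here instead), and the template filename is made up for the example.

```shell
#!/bin/sh
# Sketch: render an install-conf template by replacing {$CELLS_DB_HOST}
# (and similar placeholders) with values from the environment.
set -eu

# Demo template line in the same style as the install YAML above.
printf 'dbtcphostname: {$CELLS_DB_HOST}\n' > install-conf.yml.tmpl

CELLS_DB_HOST=mysql
# {\$NAME} matches the literal placeholder; one -e per variable in practice.
sed "s|{\$CELLS_DB_HOST}|$CELLS_DB_HOST|g" install-conf.yml.tmpl > install-conf.yml

cat install-conf.yml
```

Running this prints `dbtcphostname: mysql`; in a real entrypoint you would add one sed expression per variable and then launch cells with the rendered file.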

nicam avatar Nov 03 '21 10:11 nicam