
Restore from S3 bucket

Mark24Slides opened this issue 1 year ago • 2 comments

Description of the feature

Add the ability to restore database backups stored in an S3 bucket.

Benefits of feature

Restore from a remote S3 location.

Additional context

Right now I use an additional script: it finds the latest S3 file matching ${SOURCE_FILE}=mysql_.*\.gz$ in ${S3_BUCKET}/${S3_PATH}, downloads it to ${TEMP_LOCATION}/${TARGET_FILE}, and then executes the restore command:

export AWS_ACCESS_KEY_ID=${S3_KEY_ID}
export AWS_SECRET_ACCESS_KEY=${S3_KEY_SECRET}
export AWS_DEFAULT_REGION=${S3_REGION}
# Custom endpoint, for S3-compatible storage that is not AWS
export PARAM_AWS_ENDPOINT_URL=" --endpoint-url ${S3_PROTOCOL}://${S3_HOST}"

mkdir -p "${TEMP_LOCATION}"

# Pick the key of the most recent object matching ${SOURCE_FILE}
export LATEST_FILE=$(aws s3 ls "s3://${S3_BUCKET}/${S3_PATH}/" ${PARAM_AWS_ENDPOINT_URL} --recursive | grep -E "${SOURCE_FILE}" | sort -r | head -n 1 | awk '{print $4}')

# Download it and hand it to the image's restore helper
aws s3 cp "s3://${S3_BUCKET}/${LATEST_FILE}" "${TEMP_LOCATION}/${TARGET_FILE}" ${PARAM_AWS_ENDPOINT_URL}

restore "${TEMP_LOCATION}/${TARGET_FILE}" "${DB_TYPE}" "${DB_HOST}" "${DB_NAME}" "${DB_USER}" "${DB_PASS}" "${DB_PORT}"

aws s3 cp can be used either with ${LATEST_FILE} (resolved via the ${SOURCE_FILE} grep above) or with ${SOURCE_FILE} directly.
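For the second variant, the AWS CLI's --exclude/--include filters can pull matching objects without resolving ${LATEST_FILE} first. Note that these filters take glob patterns rather than the regex used above, and that they copy every match, not only the newest one; a minimal sketch:

# Copy all objects matching a glob pattern (the glob shown is illustrative; adjust it to your naming scheme)
aws s3 cp "s3://${S3_BUCKET}/${S3_PATH}/" "${TEMP_LOCATION}/" ${PARAM_AWS_ENDPOINT_URL} \
  --recursive --exclude "*" --include "mysql_*.gz"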

Mark24Slides commented on Sep 21 '23

I have created a similar script. The feature would be very much appreciated.

# Read the S3 credentials from files rather than plain environment variables
export AWS_ACCESS_KEY_ID=$(cat "${DB01_S3_KEY_ID_FILE}")
export AWS_SECRET_ACCESS_KEY=$(cat "${DB01_S3_KEY_SECRET_FILE}")
export AWS_DEFAULT_REGION=${DB01_S3_REGION}
export SOURCE_FILE="mysql_.*\.gz$"
export BACKUP_LOCATION="/backup"
# Quick sanity check that the credentials work
aws sts get-caller-identity

# Pick the newest backup matching ${SOURCE_FILE}, then download it together with its .sha1 checksum file
export LATEST_FILE=$(aws s3 ls "s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/" | grep -E "${SOURCE_FILE}" | sort -r | head -n 1 | awk '{print $4}')
aws s3 cp "s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/${LATEST_FILE}" "${BACKUP_LOCATION}/${TARGET_FILE}"
aws s3 cp "s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/${LATEST_FILE}.sha1" "${BACKUP_LOCATION}/"
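
# Optional addition (not part of the original comment): verify the archive against the
# downloaded checksum before restoring. This assumes the .sha1 file holds the usual
# "<hash>  <filename>" line produced by sha1sum.
EXPECTED_SHA1=$(awk '{print $1}' "${BACKUP_LOCATION}/${LATEST_FILE}.sha1")
ACTUAL_SHA1=$(sha1sum "${BACKUP_LOCATION}/${TARGET_FILE}" | awk '{print $1}')
if [ "${EXPECTED_SHA1}" != "${ACTUAL_SHA1}" ]; then
  echo "Checksum mismatch, aborting restore" >&2
  exit 1
fi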

restore "${BACKUP_LOCATION}/${TARGET_FILE}" "${DB01_TYPE}" "${DB01_HOST}" "${DB01_NAME}" "${DB01_USER}" "$(cat "${DB01_PASS_FILE}")" "${DB01_PORT}"

sakonn commented on Dec 07 '23

Updated for v4:

export AWS_ACCESS_KEY_ID=${DEFAULT_S3_KEY_ID}
export AWS_SECRET_ACCESS_KEY=${DEFAULT_S3_KEY_SECRET}
export AWS_DEFAULT_REGION=${DEFAULT_S3_REGION}
# Custom endpoint, for S3-compatible storage that is not AWS
export DEFAULT_PARAMS_AWS_ENDPOINT_URL=" --endpoint-url ${DEFAULT_S3_PROTOCOL}://${DEFAULT_S3_HOST}"

mkdir -p "${TEMP_PATH}"

# Same approach as the script above, with the renamed DEFAULT_* variables and ${TEMP_PATH}
export LATEST_FILE=$(aws s3 ls "s3://${DEFAULT_S3_BUCKET}/${DEFAULT_S3_PATH}/" ${DEFAULT_PARAMS_AWS_ENDPOINT_URL} --recursive | grep -E "${SOURCE_FILE}" | sort -r | head -n 1 | awk '{print $4}')
aws s3 cp "s3://${DEFAULT_S3_BUCKET}/${LATEST_FILE}" "${TEMP_PATH}/${TARGET_FILE}" ${DEFAULT_PARAMS_AWS_ENDPOINT_URL}
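
The snippet stops after the download; a restore call along the lines of the earlier scripts would presumably follow. A minimal sketch, with the database connection variables left as placeholders since their exact v4 names are not shown here:

restore "${TEMP_PATH}/${TARGET_FILE}" "${DB_TYPE}" "${DB_HOST}" "${DB_NAME}" "${DB_USER}" "${DB_PASS}" "${DB_PORT}"  # placeholder variable names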

Mark24Slides commented on Jan 12 '24