diff --git a/scripts/backup/Dockerfile.kubectlpod b/scripts/backup/Dockerfile.kubectlpod
index 6f4bd8062..c465cf548 100644
--- a/scripts/backup/Dockerfile.kubectlpod
+++ b/scripts/backup/Dockerfile.kubectlpod
@@ -16,4 +16,5 @@ RUN chmod +x kubectl
 RUN mv kubectl /usr/local/bin/
 
 COPY backup-user.bash /backup
-COPY env_vars.env /backup
+# COPY backup.bash /backup
+# COPY env_vars.env /backup
diff --git a/scripts/backup/backup.md b/scripts/backup/backup.md
new file mode 100644
index 000000000..6342266e4
--- /dev/null
+++ b/scripts/backup/backup.md
@@ -0,0 +1,40 @@
+# Backup Redis to S3, Google Cloud Storage, or Azure Blob
+
+This guide walks you through backing up Redis to Amazon S3, Google Cloud Storage, or Azure Blob using Docker and Kubernetes tools.
+
+## Prerequisites
+
+- Credentials and access to an S3 bucket, Google Cloud Storage, or Azure Blob.
+
+## Steps
+
+### 1. Select Your Backup Method
+
+* For **Manual Backups**: Copy the `backup-user.bash` script in `Dockerfile.kubectlpod`.
+* For **Automated Backups** (using CronJobs/Jobs): Use the `backup.bash` script.
+
+> 🚨 Important: If you're using the `backup.bash` script, environment variables must be provided.
+
+### 2. Set Up the Backup Environment
+
+* Run the `Dockerfile.kubectlpod` image to create a pod with kubectl and the other required tools installed.
+
+> The related manifest can be found at `./scripts/backup/manifest`
+
+* Copy in `backup-user.bash` or `backup.bash`, depending on the backup method you selected.
+
+### 3. Configure the Environment Variables
+
+For the Job/CronJob you need to configure environment variables. You can do this as explained below:
+
+* Create a file named `env_vars.env` in your current directory.
+* Populate the file with the necessary environment variables.
+* Source the environment file to load the variables using the command below.
+
+```bash
+source ./env_vars.env
+```
+
+> For a more secure approach, use Kubernetes secrets to manage and pass your environment variables.
+
+You can refer to the example `env_vars.env` file located at `./scripts/backup/env_vars.env`.
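The "more secure approach" mentioned at the end of `backup.md` can be illustrated with a short sketch. This is not part of the PR: the secret name `redis-backup-env` and the `default` namespace are assumptions, and the actual wiring depends on the manifest under `./scripts/backup/manifest`.

```bash
# Hypothetical sketch (not part of this PR): load the example env file into a
# Kubernetes Secret instead of sourcing it inside the container.
kubectl create secret generic redis-backup-env \
  --from-env-file=./scripts/backup/env_vars.env \
  --namespace default

# A backup Job/CronJob could then reference the Secret with
#   envFrom:
#     - secretRef:
#         name: redis-backup-env
# so backup.bash sees the same variables without a mounted env_vars.env file.
```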
diff --git a/scripts/backup/env_vars.env b/scripts/backup/env_vars.env
new file mode 100644
index 000000000..d8ec3829a
--- /dev/null
+++ b/scripts/backup/env_vars.env
@@ -0,0 +1,25 @@
+# Set default variables
+
+# Kubernetes Cluster
+export CLUSTER_NAME="redis-cluster"
+export CLUSTER_NAMESPACE="default"
+
+# Restic
+export DEFAULT_FILE_PATH="/data/dump.rdb"
+export RESTIC_PASSWORD="abc@123"
+
+# Redis
+export DEFAULT_REDIS_HOST="redis-cluster-leader-0"
+export DEFAULT_REDIS_PORT="6379"
+export DEFAULT_REDIS_PASSWORD=""
+
+# Backup destination
+
+export BACKUP_DESTINATION=AWS_S3
+
+# AWS
+export AWS_S3_BUCKET=shubham-redis
+export AWS_DEFAULT_REGION=ap-south-1
+export AWS_ACCESS_KEY_ID=
+export AWS_SECRET_ACCESS_KEY=
+
diff --git a/scripts/readme.txt b/scripts/readme.txt
deleted file mode 100644
index e2e86a7ef..000000000
--- a/scripts/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-You can pass all the value as the env Variable here:
-Create a file env_vars.env in the home directory and
-run: source /backup/env_vars.env to load the env Variable
-
diff --git a/scripts/restore/env_vars.env b/scripts/restore/env_vars.env
new file mode 100644
index 000000000..d8ec3829a
--- /dev/null
+++ b/scripts/restore/env_vars.env
@@ -0,0 +1,25 @@
+# Set default variables
+
+# Kubernetes Cluster
+export CLUSTER_NAME="redis-cluster"
+export CLUSTER_NAMESPACE="default"
+
+# Restic
+export DEFAULT_FILE_PATH="/data/dump.rdb"
+export RESTIC_PASSWORD="abc@123"
+
+# Redis
+export DEFAULT_REDIS_HOST="redis-cluster-leader-0"
+export DEFAULT_REDIS_PORT="6379"
+export DEFAULT_REDIS_PASSWORD=""
+
+# Backup destination
+
+export BACKUP_DESTINATION=AWS_S3
+
+# AWS
+export AWS_S3_BUCKET=shubham-redis
+export AWS_DEFAULT_REGION=ap-south-1
+export AWS_ACCESS_KEY_ID=
+export AWS_SECRET_ACCESS_KEY=
+
diff --git a/scripts/restore/restore.md b/scripts/restore/restore.md
new file mode 100644
index 000000000..8df558932
--- /dev/null
+++ b/scripts/restore/restore.md
@@ -0,0 +1,23 @@
+# Restore Redis from S3, Google Cloud Storage, or Azure Blob
+
+Follow the steps below to restore a Redis backup from Amazon S3, Google Cloud Storage, or Azure Blob.
+
+## Prerequisites
+
+- Credentials and access to an S3 bucket, Google Cloud Storage, or Azure Blob.
+
+## Steps
+
+### 1. Set Up the Restore Environment
+
+- First, build a Docker image from `Dockerfile.restore`. This image contains all the tools needed for the restoration process.
+
+- Ensure that the `restore.bash` script is copied in `Dockerfile.restore` so it is available in the container.
+
+### 2. Manage Environment Variables through Kubernetes Secrets
+
+- For a more secure approach, use Kubernetes secrets to manage and pass your environment variables.
+
+- The template for the necessary environment variables can be found at `./scripts/restore/env_vars.env`.
+
+> Note: You have to pass the restore image in an init container to restore the Redis data, since the `dump.rdb` file must be loaded before the Redis server starts.
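For the init-container note above, here is a minimal sketch of what a restic-based restore step might look like, assuming the S3 repository layout implied by the example `env_vars.env`; the actual `restore.bash` in this PR may differ.

```bash
#!/bin/bash
# Hypothetical sketch of an init-container restore step (not the PR's restore.bash).
# Assumes the variables from the example env_vars.env are present, e.g. via a Secret.
set -euo pipefail

# Point restic at the S3 bucket; RESTIC_PASSWORD and the AWS credentials are
# read from the environment.
export RESTIC_REPOSITORY="s3:s3.amazonaws.com/${AWS_S3_BUCKET}"

# Restore the latest snapshot of the dump file into the Redis data volume,
# which must be mounted at /data in the init container, before redis-server starts.
restic restore latest --target / --include "${DEFAULT_FILE_PATH}"
```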