[Add] : Write the docs for the restore and backup (#588)
* Write the docs for the restore and backup

Signed-off-by: Shubham Gupta <[email protected]>

* add Prerequisites

Signed-off-by: Shubham Gupta <[email protected]>

* typo

Signed-off-by: Shubham Gupta <[email protected]>

---------

Signed-off-by: Shubham Gupta <[email protected]>
shubham-cmyk authored Aug 25, 2023
1 parent 3485ef7 commit d2b483e
Showing 6 changed files with 115 additions and 5 deletions.
3 changes: 2 additions & 1 deletion scripts/backup/Dockerfile.kubectlpod
@@ -16,4 +16,5 @@ RUN chmod +x kubectl
RUN mv kubectl /usr/local/bin/

COPY backup-user.bash /backup
COPY env_vars.env /backup
# COPY backup.bash /backup
# COPY env_vars.env /backup
40 changes: 40 additions & 0 deletions scripts/backup/backup.md
@@ -0,0 +1,40 @@
# Backup Redis to S3, Google Cloud Storage, or Azure Blob

This guide walks you through backing up Redis to Amazon S3, Google Cloud Storage, or Azure Blob using Docker and Kubernetes tools.

## Prerequisites

- Credentials for and access to an S3 bucket, a Google Cloud Storage bucket, or an Azure Blob container.

## Steps

### 1. Select Your Backup Method

* For **Manual Backups**: copy the `backup-user.bash` script into `Dockerfile.kubectlpod`.
* For **Automated Backups** (using CronJobs/Jobs): use the `backup.bash` script.

> 🚨 Important: If you use the `backup.bash` script, the environment variables must be provided.

### 2. Set Up the Backup Environment

* Run the image built from `Dockerfile.kubectlpod` to create a pod with `kubectl` and the other required tools installed.

> The related manifest can be found at `./scripts/backup/manifest`
* Copy `backup-user.bash` or `backup.bash` into the pod, depending on the backup method you chose (see the sketch below).
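
The commands below are a minimal sketch of this step. The image tag `redis-backup-kubectl:latest`, the manifest filename `backup-pod.yaml`, the namespace `default`, and the pod name `backup-pod` are assumptions for illustration; substitute the names used in your cluster.

```bash
# Build the tooling image from Dockerfile.kubectlpod (tag is an example).
docker build -f scripts/backup/Dockerfile.kubectlpod -t redis-backup-kubectl:latest scripts/backup

# Create the pod from the provided manifest directory
# (the exact manifest filename is assumed here).
kubectl apply -f scripts/backup/manifest/backup-pod.yaml -n default

# Copy the chosen script into the pod and run it (manual backup shown;
# the /backup destination path is assumed).
kubectl cp scripts/backup/backup-user.bash default/backup-pod:/backup/backup-user.bash
kubectl exec -n default backup-pod -- bash /backup/backup-user.bash
```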

### 3. Configure the Environment Variables

For the Job/CronJob you need to configure environment variables. You can do this as explained below:

* Create a file named `env_vars.env` in your current directory.
* Populate the file with necessary environment variables.
* Source the environment file to load the variables using the command below.

```bash
source ./env_vars.env
```

> For a more secure approach, use Kubernetes Secrets to manage and pass your environment variables, for example as sketched below.
> You can refer to the example `env_vars.env` file located at `./scripts/backup/env_vars.env`.
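
A hedged sketch of the Secrets-based approach, assuming the Secret is named `redis-backup-env` (any name works) and the Job/CronJob references it via `envFrom`/`secretRef`:

```bash
# kubectl's --from-env-file expects plain KEY=VALUE lines, so strip the
# "export " prefixes and the surrounding quotes from the example file first.
sed -e 's/^export //' -e 's/"//g' scripts/backup/env_vars.env > /tmp/backup.env

# Create the Secret (name is an example); reference it from the Job/CronJob
# instead of baking env_vars.env into the image.
kubectl create secret generic redis-backup-env --from-env-file=/tmp/backup.env -n default
```
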
25 changes: 25 additions & 0 deletions scripts/backup/env_vars.env
@@ -0,0 +1,25 @@
# Set default variables

# Kubernetes Cluster
export CLUSTER_NAME="redis-cluster"
export CLUSTER_NAMESPACE="default"

# Restic
export DEFAULT_FILE_PATH="/data/dump.rdb"
export RESTIC_PASSWORD="abc@123"

# Redis
export DEFAULT_REDIS_HOST="redis-cluster-leader-0"
export DEFAULT_REDIS_PORT="6379"
export DEFAULT_REDIS_PASSWORD=""

# Backup destination

export BACKUP_DESTINATION=AWS_S3

# AWS
export AWS_S3_BUCKET=shubham-redis
export AWS_DEFAULT_REGION=ap-south-1
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=

4 changes: 0 additions & 4 deletions scripts/readme.txt

This file was deleted.

25 changes: 25 additions & 0 deletions scripts/restore/env_vars.env
@@ -0,0 +1,25 @@
# Set default variables

# Kubernetes Cluster
export CLUSTER_NAME="redis-cluster"
export CLUSTER_NAMESPACE="default"

# Restic
export DEFAULT_FILE_PATH="/data/dump.rdb"
export RESTIC_PASSWORD="abc@123"

# Redis
export DEFAULT_REDIS_HOST="redis-cluster-leader-0"
export DEFAULT_REDIS_PORT="6379"
export DEFAULT_REDIS_PASSWORD=""

# Backup destination

export BACKUP_DESTINATION=AWS_S3

# AWS
export AWS_S3_BUCKET=shubham-redis
export AWS_DEFAULT_REGION=ap-south-1
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=

23 changes: 23 additions & 0 deletions scripts/restore/restore.md
@@ -0,0 +1,23 @@
# Restore Redis from S3, Google Cloud Storage, or Azure Blob

Follow the steps below to restore a Redis backup from Amazon S3, Google Cloud Storage, or Azure Blob.

## Prerequisites

- Credentials for and access to an S3 bucket, a Google Cloud Storage bucket, or an Azure Blob container.

## Steps

### 1. Set Up the Restore Environment

- First, build a Docker image from `Dockerfile.restore`. This image includes all the tools needed for the restoration process (see the sketch after this list).

- Ensure that the `restore.bash` script is included within the `Dockerfile.restore` to be available in the container.
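
A minimal sketch of this step, assuming `Dockerfile.restore` lives in `./scripts/restore` and using the example tag `redis-restore:latest`:

```bash
# Build the restore image; the path and tag are assumptions.
docker build -f scripts/restore/Dockerfile.restore -t redis-restore:latest scripts/restore
```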

### 2. Manage Environment Variables through Kubernetes Secrets

- For a more secure approach, use Kubernetes Secrets to manage and pass your environment variables.

- The template for the necessary environment variables can be found at `./scripts/restore/env_vars.env`; it can be loaded into a Secret as sketched below.
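
The same pattern as on the backup side can be used here; the Secret name `redis-restore-env` and the namespace `default` are assumptions:

```bash
# Strip "export " prefixes and quotes so kubectl can parse the file,
# then create the Secret that the restore init container will consume.
sed -e 's/^export //' -e 's/"//g' scripts/restore/env_vars.env > /tmp/restore.env
kubectl create secret generic redis-restore-env --from-env-file=/tmp/restore.env -n default
```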

> Note: You have to run this image in an init container to restore the Redis data, since the `dump.rdb` file must be loaded before the Redis server starts. A sketch follows.
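
The manifest below is only a hedged illustration of that init-container ordering (restore first, then Redis). The image `redis-restore:latest`, the Secret `redis-restore-env`, and the `/data` mount path are assumptions; the pods managed by the operator in your cluster will look different.

```bash
kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: redis-restore-example
spec:
  initContainers:
    - name: restore
      image: redis-restore:latest     # image built from Dockerfile.restore
      envFrom:
        - secretRef:
            name: redis-restore-env   # Secret holding the env_vars.env values
      volumeMounts:
        - name: redis-data
          mountPath: /data            # restore.bash writes dump.rdb here
  containers:
    - name: redis
      image: redis:7.0
      volumeMounts:
        - name: redis-data
          mountPath: /data            # Redis picks up dump.rdb at startup
  volumes:
    - name: redis-data
      emptyDir: {}
EOF
```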
