
[Add] : Write the docs for the restore and backup #588

Merged · 3 commits · Aug 25, 2023
3 changes: 2 additions & 1 deletion scripts/backup/Dockerfile.kubectlpod
@@ -16,4 +16,5 @@ RUN chmod +x kubectl
RUN mv kubectl /usr/local/bin/

COPY backup-user.bash /backup
COPY env_vars.env /backup
# COPY backup.bash /backup
# COPY env_vars.env /backup
40 changes: 40 additions & 0 deletions scripts/backup/backup.md
@@ -0,0 +1,40 @@
# Backup Redis to S3, Google Cloud Storage, or Azure Blob

This guide walks you through backing up Redis to Amazon S3, Google Cloud Storage, or Azure Blob using Docker and Kubernetes tools.

## Prerequisites

- Credentials and access to an S3 bucket, Google Cloud Storage, or Azure Blob container.

## Steps

### 1. Select Your Backup Method

* For **Manual Backups**: copy the `backup-user.bash` script into the image built from `Dockerfile.kubectlpod`.
* For **Automated Backups** (using CronJobs/Jobs): use the `backup.bash` script.

> 🚨 Important: If you're using the `backup.bash` script, environment variables must be provided.

### 2. Set Up the Backup Environment

* Build and run the image from `Dockerfile.kubectlpod` to create a pod with `kubectl` and other required tools installed.

> The related manifest can be found at `./scripts/backup/manifest`

* Copy the `backup-user.bash` or `backup.bash` script into the pod, depending on your chosen backup method.
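
The steps above can be sketched as the following commands. The image tag `redis-backup-tools` and `<pod-name>` are illustrative placeholders, not names taken from this PR:

```shell
# Build the tooling image from the Dockerfile in this directory
# (the image tag is a placeholder).
docker build -f Dockerfile.kubectlpod -t redis-backup-tools .

# Create the pod from the provided manifest directory
kubectl apply -f ./scripts/backup/manifest

# Copy the script matching your backup method into the running pod
kubectl cp ./scripts/backup/backup.bash <pod-name>:/backup/backup.bash
```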

### 3. Configure the Environment Variables

For the Job/CronJob, you need to configure environment variables. You can do this as explained below:

* Create a file named `env_vars.env` in your current directory.
* Populate the file with necessary environment variables.
* Source the environment file to load the variables using the command below:

```bash
source ./env_vars.env
```
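
Expanding on the sourcing step above, here is a minimal, self-contained sketch (all values are placeholders, not real credentials) that writes an env file, sources it, and fails fast if a required variable is missing:

```shell
#!/usr/bin/env bash
# Sketch only: create a placeholder env file, load it, and validate it.
set -euo pipefail

cat > env_vars.env <<'EOF'
export BACKUP_DESTINATION="AWS_S3"
export AWS_S3_BUCKET="my-backup-bucket"
export AWS_DEFAULT_REGION="ap-south-1"
EOF

source ./env_vars.env

# ${VAR:?msg} aborts with a clear error if VAR is unset or empty.
: "${BACKUP_DESTINATION:?BACKUP_DESTINATION must be set}"
: "${AWS_S3_BUCKET:?AWS_S3_BUCKET must be set}"

echo "Backup destination: ${BACKUP_DESTINATION}, bucket: ${AWS_S3_BUCKET}"
# prints: Backup destination: AWS_S3, bucket: my-backup-bucket
```

Failing fast on missing variables here is cheaper than discovering them mid-backup inside the pod.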

> For a more secure approach, utilize Kubernetes secrets to manage and pass your environment variables.

You can refer to the example `env_vars.env` file located at `./scripts/backup/env_vars.env`.
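
One way to follow the Kubernetes-secrets suggestion above is sketched below. The Secret name `redis-backup-env` is an assumption, not from this PR; note that `kubectl create secret --from-env-file` expects plain `KEY=value` lines, so the `export ` prefix (and quotes) must be stripped first:

```shell
# Convert the sourced-style file into plain KEY=value lines
sed -e 's/^export //' -e 's/"//g' env_vars.env > env_vars.plain

# Store it as a Secret (name is a placeholder) for the Job/CronJob
# to consume via envFrom/secretRef.
kubectl create secret generic redis-backup-env \
  --from-env-file=env_vars.plain -n default
```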
25 changes: 25 additions & 0 deletions scripts/backup/env_vars.env
@@ -0,0 +1,25 @@
# Set default variables

# Kubernetes Cluster
export CLUSTER_NAME="redis-cluster"
export CLUSTER_NAMESPACE="default"

# Restic
export DEFAULT_FILE_PATH="/data/dump.rdb"
export RESTIC_PASSWORD="abc@123"

# Redis
export DEFAULT_REDIS_HOST="redis-cluster-leader-0"
export DEFAULT_REDIS_PORT="6379"
export DEFAULT_REDIS_PASSWORD=""

# Backup destination

export BACKUP_DESTINATION=AWS_S3

# AWS
export AWS_S3_BUCKET=shubham-redis
export AWS_DEFAULT_REGION=ap-south-1
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=

4 changes: 0 additions & 4 deletions scripts/readme.txt

This file was deleted.

25 changes: 25 additions & 0 deletions scripts/restore/env_vars.env
@@ -0,0 +1,25 @@
# Set default variables

# Kubernetes Cluster
export CLUSTER_NAME="redis-cluster"
export CLUSTER_NAMESPACE="default"

# Restic
export DEFAULT_FILE_PATH="/data/dump.rdb"
export RESTIC_PASSWORD="abc@123"

# Redis
export DEFAULT_REDIS_HOST="redis-cluster-leader-0"
export DEFAULT_REDIS_PORT="6379"
export DEFAULT_REDIS_PASSWORD=""

# Backup destination

export BACKUP_DESTINATION=AWS_S3

# AWS
export AWS_S3_BUCKET=shubham-redis
export AWS_DEFAULT_REGION=ap-south-1
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=

23 changes: 23 additions & 0 deletions scripts/restore/restore.md
@@ -0,0 +1,23 @@
# Restore Redis from S3, Google Cloud Storage, or Azure Blob

Follow the steps below to restore a Redis backup from Amazon S3, Google Cloud Storage, or Azure Blob.

## Prerequisites

- Credentials and access to an S3 bucket, Google Cloud Storage, or Azure Blob container.

## Steps

### 1. Set Up the Restore Environment

- First, create a Docker image using the `Dockerfile.restore`. This image will encompass all necessary tools for the restoration process.

- Ensure that the `restore.bash` script is included within the `Dockerfile.restore` to be available in the container.
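
The two points above might look like the following `Dockerfile.restore` sketch. The base image and installed packages are illustrative assumptions, not the PR's actual contents:

```dockerfile
# Illustrative sketch of a restore image (not the actual Dockerfile.restore)
FROM alpine:3.18
RUN apk add --no-cache bash curl restic

# Make the restore script available inside the container
COPY restore.bash /restore/restore.bash
RUN chmod +x /restore/restore.bash

ENTRYPOINT ["/restore/restore.bash"]
```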

### 2. Manage Environment Variables through Kubernetes Secrets

- For a more secure approach, utilize Kubernetes secrets to manage and pass your environment variables.

- The template for the necessary environment variables can be found at `./scripts/restore/env_vars.env`.

> Note: You have to run the restore image as an init container, since the `dump.rdb` file must be in place before the Redis server starts.
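
The note above can be sketched as a pod spec fragment. The image tag, Secret name, and volume name are placeholders, not values from this PR:

```yaml
# Illustrative fragment: the restore image runs as an init container so
# dump.rdb is restored into the data volume before Redis starts.
spec:
  initContainers:
    - name: restore
      image: redis-restore:latest        # placeholder image tag
      envFrom:
        - secretRef:
            name: redis-restore-env      # placeholder Secret name
      volumeMounts:
        - name: redis-data
          mountPath: /data
  containers:
    - name: redis
      image: redis:7
      volumeMounts:
        - name: redis-data
          mountPath: /data
```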