Configure CWP to Backup to AWS S3

Backing Up CentOS Web Panel to S3

Basic Requirements

To create a script for backing up accounts and databases from CentOS Web Panel (CWP) to AWS S3, follow these steps:

  1. Install AWS CLI: If not already installed, install the AWS CLI on your CentOS server.
  2. Configure AWS CLI: Configure your AWS CLI with your access and secret keys.
  3. Create a script: Write a bash script to backup accounts and databases, and upload them to AWS S3.

Here’s a step-by-step guide:

Step 1: Install AWS CLI

# awscli is packaged in the EPEL repository on CentOS
sudo yum install epel-release -y
sudo yum install awscli -y

Step 2: Configure AWS CLI

aws configure

You’ll be prompted to enter your AWS Access Key ID, Secret Access Key, Default region name, and Default output format.
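If you would rather script this step (for example when provisioning servers automatically), the same values can be set non-interactively with `aws configure set`; all of the values below are placeholders:

```shell
# Non-interactive equivalent of `aws configure`; every value is a placeholder.
aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
aws configure set aws_secret_access_key your-secret-access-key
aws configure set default.region us-east-1
aws configure set default.output json
```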

Step 3: Create the Backup Script - Option 1

Create a new script file, e.g., backup_to_s3.sh, and make it executable.

This option creates a single compressed archive of the entire /home/ directory and its contents.

#!/bin/bash

# Set variables
BACKUP_DIR="/backup"
DATE=$(date +%Y-%m-%d)
S3_BUCKET="your-s3-bucket-name"
S3_PATH="s3://$S3_BUCKET/$DATE"
MYSQL_USER="your_mysql_user"
MYSQL_PASSWORD="your_mysql_password"

# Create backup directory if not exists
mkdir -p $BACKUP_DIR/$DATE

# Backup accounts
tar -czf "$BACKUP_DIR/$DATE/accounts_backup.tar.gz" /home/*

# Backup databases (skipping MySQL's internal schemas)
databases=$(mysql -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" -N -e "SHOW DATABASES;" | grep -Ev "^(information_schema|performance_schema|mysql|sys)$")
for db in $databases; do
  mysqldump -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" --databases "$db" > "$BACKUP_DIR/$DATE/$db.sql"
done

# Upload to S3
aws s3 cp $BACKUP_DIR/$DATE $S3_PATH --recursive

# Remove local backup
rm -rf $BACKUP_DIR/$DATE

echo "Backup completed and uploaded to S3."
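Before running the full script, it can help to confirm that the date-stamped destination prefix expands as expected. A minimal sketch using the same variables as the script (the bucket name is a placeholder):

```shell
#!/bin/bash
# Reproduce the destination-prefix logic from the backup script.
S3_BUCKET="your-s3-bucket-name"   # placeholder, as in the script
DATE=$(date +%Y-%m-%d)
S3_PATH="s3://$S3_BUCKET/$DATE"
echo "$S3_PATH"
```

You can also append `--dryrun` to the `aws s3 cp` command to see what would be uploaded without actually transferring anything.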

Step 3: Create the Backup Script - Option 2

Create a new script file, e.g., backup_to_s3.sh, and make it executable.

This option creates a separate compressed archive for each directory under /home/ and its contents.

#!/bin/bash

# Set variables
BACKUP_DIR="/backup"
DATE=$(date +%Y-%m-%d)
S3_BUCKET="your-s3-bucket-name"
S3_PATH="s3://$S3_BUCKET/$DATE"
MYSQL_USER="your_mysql_user"
MYSQL_PASSWORD="your_mysql_password"

# Create backup directory if not exists
mkdir -p $BACKUP_DIR/$DATE

# Backup each user's home directory
for user in /home/*; do
  if [ -d "$user" ]; then
    username=$(basename "$user")
    tar -czf "$BACKUP_DIR/$DATE/${username}_backup.tar.gz" "$user"
  fi
done

# Backup databases (skipping MySQL's internal schemas)
databases=$(mysql -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" -N -e "SHOW DATABASES;" | grep -Ev "^(information_schema|performance_schema|mysql|sys)$")
for db in $databases; do
  mysqldump -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" --databases "$db" > "$BACKUP_DIR/$DATE/$db.sql"
done

# Upload to S3
aws s3 cp $BACKUP_DIR/$DATE $S3_PATH --recursive

# Remove local backup
rm -rf "$BACKUP_DIR/$DATE"

echo "Backup completed and uploaded to S3."

Step 4: Make the Script Executable

chmod +x backup_to_s3.sh

Step 5: Automate the Backup

You can set up a cron job to run this script periodically. Edit your crontab by running:

crontab -e

Add the following line to schedule the script to run daily at 2 AM:

0 2 * * * /path/to/backup_to_s3.sh

Replace /path/to/backup_to_s3.sh with the actual path to your script.
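Cron runs silently, so it is worth capturing the script's output for troubleshooting. A common variant of the entry above (the log path is just an example):

```shell
0 2 * * * /path/to/backup_to_s3.sh >> /var/log/backup_to_s3.log 2>&1
```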

Notes:

  • Replace your-s3-bucket-name with your actual S3 bucket name.
  • Replace your_mysql_user and your_mysql_password with your MySQL credentials.
  • Adjust the paths and file names according to your setup.

This script will create backups of your accounts and databases, upload them to AWS S3, and clean up the local backup files.
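The script removes its local copies, but the S3 bucket will accumulate one date-stamped prefix per day. If you also want to prune old backups from the bucket, one option is a small retention sketch like the one below (the bucket name and retention window are assumptions; S3 lifecycle rules are an alternative that avoids scripting entirely). It relies on the YYYY-MM-DD prefixes created by the backup script: ISO dates sort lexically, so plain string comparison is enough.

```shell
#!/bin/bash
# Hypothetical retention sketch: remove date-stamped backup prefixes
# older than RETENTION_DAYS from the bucket.
S3_BUCKET="your-s3-bucket-name"   # placeholder, as in the backup script
RETENTION_DAYS=30                 # assumption: keep 30 daily backups
CUTOFF=$(date -d "$RETENTION_DAYS days ago" +%Y-%m-%d)

# List top-level prefixes, strip the trailing slash, and delete any
# prefix strictly older than the cutoff date.
aws s3 ls "s3://$S3_BUCKET/" | awk '{print $2}' | tr -d '/' |
while read -r prefix; do
  if [[ -n "$prefix" && "$prefix" < "$CUTOFF" ]]; then
    aws s3 rm "s3://$S3_BUCKET/$prefix" --recursive
  fi
done
```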