AI Prompts for Generating Automated Server Backup Scripts with Bash and rclone

This document provides two specialized AI prompts 🤖 designed to generate complete, functional Bash scripts for automating common server backup tasks. The objective is to produce reliable scripts for two primary scenarios: SQL database dumps and file system archiving.

The collection provides a targeted prompt for each scenario:

🗃️ For SQL Database Backups: This prompt generates a script that uses mysqldump or pg_dump to create a full, clean backup of your database. The resulting .sql file can optionally be compressed to save space.

📁 For File System Backups: This prompt is tailored for archiving application directories. It generates a script using the tar utility, with the key ability to process a list of path exclusions, making it well suited to ignoring cache or temporary folders. It also includes special logic to handle non-fatal tar warnings without stopping the backup.
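As a preview of what the file-system prompt asks for, here is a minimal sketch of tar archiving with exclusions and non-fatal warning handling. All paths, the demo directory layout, and the exclusion patterns are illustrative, not part of the prompt itself:

```shell
#!/bin/bash
# Demo source tree (illustrative paths).
SRC_DIR="/tmp/demo_app"
ARCHIVE="/tmp/demo_app.tar.gz"
mkdir -p "$SRC_DIR/cache" "$SRC_DIR/data"
echo "keep" > "$SRC_DIR/data/file.txt"
echo "skip" > "$SRC_DIR/cache/tmp.txt"

# Exclusion patterns, e.g. cache and temporary files.
EXCLUDES=("--exclude=cache" "--exclude=*.tmp")

# Note: no 'set -e' here, so we can inspect tar's exit code ourselves.
tar -czf "$ARCHIVE" "${EXCLUDES[@]}" -C "$SRC_DIR" .
status=$?
# GNU tar exits with 1 for warnings such as "file changed as we read it";
# treat that as non-fatal instead of aborting the whole backup.
if [ "$status" -eq 0 ] || [ "$status" -eq 1 ]; then
    echo "Archive created: $ARCHIVE"
else
    echo "tar failed with exit code $status" >&2
    exit "$status"
fi
```

The exit-code check is the "special logic" mentioned above: exit code 1 from GNU tar means "some files differ", which is usually harmless for live directories, while 2 means a fatal error.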

🔗 Core Features

Both generated scripts are built on a common, robust architecture that includes:

  • Cloud Integration: Universal upload capability to any supported cloud storage using rclone. ☁️
  • Automated Cleanup: Configurable retention policies to automatically manage and rotate old backups. ♻️
  • Operational Visibility: Built-in status logging to keep you informed of the backup process. 📊

Prompt

# =================================================================
# TOPIC: SPECIALIST DATABASE BACKUP SCRIPT (MySQL/PostgreSQL)
# =================================================================

# --- ROLE AND GOAL ---
# You are an expert DevOps engineer and a senior Bash scripting specialist.
# Your primary goal is to create a robust, secure, and production-ready Bash script designed specifically for backing up databases (MySQL or PostgreSQL).
# The script must include detailed logging, comprehensive error handling, and secure credential management.

# --- TASK ---
# Generate a comprehensive Bash script that automates the process of backing up a specified database.
# The script will perform the following actions: create a database dump (.sql file), optionally compress it, upload the final file to a remote storage provider using rclone, and then enforce a data retention policy by cleaning up old backups.
# Please generate the script based on the configuration variables and detailed logic provided below.

# -----------------------------------------------------------------

# --- CONFIGURATION VARIABLES ---
# The generated script should pull all its configuration from these variables at the top of the file.

#!/bin/bash

# =================================================================
# SCRIPT CONFIGURATION - POPULATE THESE VARIABLES
# =================================================================

# -- SOURCE DATABASE CONFIGURATION --
# The database engine. Options: "MySQL", "PostgreSQL"
DB_ENGINE="[FILL_THIS_IN]"

# The name of the database to be backed up.
DB_NAME="[FILL_THIS_IN]" # Example: "production_db"

# The user for database authentication.
DB_USER="[FILL_THIS_IN]" # Example: "backup_user"
# For security, the DB password should be stored in the .env file (see ENV_FILE_PATH).
# The variable name in the .env file should be DB_PASS.

# -- PROCESSING & DESTINATION --
# The local directory on the server to store the backup file temporarily.
LOCAL_BACKUP_DIR="[FILL_THIS_IN]" # Example: "/root/db_backups"

# Whether to compress the final .sql file into a .sql.gz archive. Options: "true", "false"
COMPRESS_BACKUP="true"

# The name of the configured rclone remote.
RCLONE_REMOTE_NAME="[FILL_THIS_IN]" # Example: "b2_storage"

# The directory/bucket on the remote storage where backups will be uploaded.
REMOTE_TARGET_DIR="[FILL_THIS_IN]" # Example: "SQL-Backups/production"

# -- POLICY & LOGGING --
# The number of recent backups to keep locally and remotely.
RETENTION_COUNT=14

# Full path to the log file for appending script output.
LOG_FILE="[FILL_THIS_IN]" # Example: "/var/log/db_backup.log"

# -- SECURITY --
# Path to the .env file for storing sensitive credentials like DB_PASS.
# Leave empty to disable. Example: "/home/user/configs/db.env"
ENV_FILE_PATH="[FILL_THIS_IN]"

# -----------------------------------------------------------------

# --- DETAILED SCRIPT LOGIC ---
# The script must implement the following logic step-by-step.

# 1. Initialization and Pre-flight Checks:
# - Start with 'set -e' and 'set -o pipefail' to ensure the script exits immediately if any command fails.
# - Define a logging function that timestamps messages and outputs them to both the console and the specified LOG_FILE.
# - Check for required dependencies: 'rclone', 'gzip' (if compression is enabled), and 'mysqldump'/'pg_dump' depending on DB_ENGINE. If a dependency is missing, log a fatal error and exit.
# - If ENV_FILE_PATH is set, check if the file exists and source it to load environment variables (like DB_PASS).
# - Ensure the LOCAL_BACKUP_DIR exists, creating it if necessary ('mkdir -p').
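A minimal sketch of the pre-flight section described in step 1. The `log` and `require` helper names and the demo log path are illustrative choices, not mandated by the prompt:

```shell
#!/bin/bash
set -e
set -o pipefail

LOG_FILE="/tmp/db_backup_demo.log"   # demo path; the real script reads the LOG_FILE variable

# Timestamped logging to both the console and the log file.
log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

# Fail fast if a required command is missing.
require() {
    if ! command -v "$1" >/dev/null 2>&1; then
        log "FATAL: required dependency '$1' is not installed."
        exit 1
    fi
}

require gzip
log "Pre-flight checks passed"
```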

# 2. Database Dump:
# - Generate a unique filename for the backup using the database name and a timestamp. Format: DBNAME-YYYY-MM-DD_HHMMSS.sql.
# - Based on DB_ENGINE, execute the correct dump command ('mysqldump' or 'pg_dump'). Use the provided credentials.
# - Save the SQL dump directly to the LOCAL_BACKUP_DIR with the generated filename.
# - Verify the exit code of the dump command. If it fails, the script will stop due to 'set -e'.
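Step 2 could look roughly like this. Database names and credentials are placeholders; `DB_PASS` is expected to come from the sourced .env file, and the dump function is only defined, not invoked, since it needs a live database:

```shell
#!/bin/bash
DB_ENGINE="MySQL"                      # or "PostgreSQL"
DB_NAME="production_db"                # illustrative
DB_USER="backup_user"                  # illustrative
LOCAL_BACKUP_DIR="/tmp/db_backups"
mkdir -p "$LOCAL_BACKUP_DIR"

# Unique filename in the format DBNAME-YYYY-MM-DD_HHMMSS.sql
TIMESTAMP=$(date '+%Y-%m-%d_%H%M%S')
BACKUP_FILE="${LOCAL_BACKUP_DIR}/${DB_NAME}-${TIMESTAMP}.sql"

dump_database() {
    case "$DB_ENGINE" in
        MySQL)
            mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_FILE" ;;
        PostgreSQL)
            PGPASSWORD="$DB_PASS" pg_dump -U "$DB_USER" "$DB_NAME" > "$BACKUP_FILE" ;;
        *)
            echo "Unsupported DB_ENGINE: $DB_ENGINE" >&2; return 1 ;;
    esac
}
```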

# 3. Optional Compression:
# - Check if COMPRESS_BACKUP is set to "true".
# - If it is, compress the generated .sql file using 'gzip'. The resulting file will have a .sql.gz extension.
# - After successful compression, delete the original .sql file to save space.
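One detail worth knowing for step 3: gzip replaces the input file with the .gz version by default, so the "delete the original" requirement is satisfied automatically. A sketch with a throwaway demo file:

```shell
#!/bin/bash
COMPRESS_BACKUP="true"
BACKUP_FILE="/tmp/demo-backup.sql"     # demo file standing in for the real dump
echo "-- demo dump" > "$BACKUP_FILE"

if [ "$COMPRESS_BACKUP" = "true" ]; then
    gzip -f "$BACKUP_FILE"             # writes demo-backup.sql.gz and removes the .sql
    BACKUP_FILE="${BACKUP_FILE}.gz"    # track the final filename for the upload step
fi
```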

# 4. Remote Upload:
# - Determine the final backup filename (either .sql or .sql.gz).
# - Use 'rclone copy' to upload this file to the specified 'RCLONE_REMOTE_NAME:REMOTE_TARGET_DIR'.
# - If the rclone command fails, log the error and exit.
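Step 4 reduces to a single rclone call with error handling. The remote name and target path below echo the example values from the configuration section; the function is only defined here, since running it needs a configured rclone remote:

```shell
#!/bin/bash
RCLONE_REMOTE_NAME="b2_storage"            # illustrative
REMOTE_TARGET_DIR="SQL-Backups/production" # illustrative

upload_backup() {
    local file="$1"
    # rclone copy uploads the file without deleting anything at the destination.
    if ! rclone copy "$file" "${RCLONE_REMOTE_NAME}:${REMOTE_TARGET_DIR}"; then
        echo "FATAL: rclone upload failed for $file" >&2
        exit 1
    fi
}
```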

# 5. Cleanup and Rotation Policy:
# - The cleanup logic must handle both compressed and uncompressed files.
# - Local Cleanup:
#   - List all backup files (*.sql and *.sql.gz) in LOCAL_BACKUP_DIR, sorted by time (newest first).
#   - If the file count exceeds RETENTION_COUNT, delete the oldest files until the count is correct.
# - Remote Cleanup:
#   - Use 'rclone lsf' to get a list of remote backups, sorted by modification time.
#   - If the count exceeds RETENTION_COUNT, loop through the oldest files and delete them one by one using 'rclone deletefile'.
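The local half of step 5 can be done with `ls -1t` and `tail`; the remote half with `rclone lsf`. A sketch using a small demo retention count and dummy files (the remote function is only defined, and assumes `rclone lsf --format "tp"` output of `modtime;path`):

```shell
#!/bin/bash
RETENTION_COUNT=3                        # demo value; the prompt's default is 14
LOCAL_BACKUP_DIR="/tmp/db_backups_rotate"
mkdir -p "$LOCAL_BACKUP_DIR"

# Create 5 dummy backups with distinct timestamps (oldest = db-1).
for i in 1 2 3 4 5; do
    touch -d "2024-01-0${i}" "$LOCAL_BACKUP_DIR/db-${i}.sql"
done

# Local rotation: list newest first, delete everything past RETENTION_COUNT.
ls -1t "$LOCAL_BACKUP_DIR"/*.sql "$LOCAL_BACKUP_DIR"/*.sql.gz 2>/dev/null \
    | tail -n +$((RETENTION_COUNT + 1)) \
    | while read -r old; do rm -f -- "$old"; done

# Remote rotation (requires a configured rclone remote; not run here).
remote_cleanup() {
    rclone lsf --files-only --format "tp" "${RCLONE_REMOTE_NAME}:${REMOTE_TARGET_DIR}" \
        | sort -r \
        | tail -n +$((RETENTION_COUNT + 1)) \
        | cut -d';' -f2 \
        | while read -r old; do
              rclone deletefile "${RCLONE_REMOTE_NAME}:${REMOTE_TARGET_DIR}/${old}"
          done
}
```

After the local rotation runs, only the three newest dummy files remain.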

# 6. Finalization:
# - Log a final "Database backup process completed successfully" message.
# - The script should exit with code 0 on success.

# --- FINAL INSTRUCTION ---
# Please generate the complete, self-contained Bash script now based on all the requirements above.
# The code should be clean, well-commented, and optimized for database backup tasks.

Recommended Tool: GitHub Copilot Chat, ChatGPT Plus, Claude.ai, Google AI Studio, Amazon CodeWhisperer, Cursor IDE, etc.
Recommended Model: GPT-4o, GPT-4, Claude 3 Opus, Claude 3 Sonnet, Gemini 1.5 Pro, Llama 3 70B, etc.
Generating File System Archive Scripts with Exclusions (tar & rclone)