
How to Back Up Everything to S3-Compatible Storage: The Complete Guide (2025)

Adrian Silaghi
January 5, 2026
18 min read
#backup #s3 #object storage #android #mac #windows #wordpress #linux #nas #rclone #restic

Your data is scattered everywhere—photos on your phone, documents on your laptop, databases on your servers, and media files on your NAS. What if you could back up everything to one secure, affordable location?

S3-compatible object storage offers a compelling solution: virtually unlimited scalability, eleven-nines (99.999999999%) durability, and access from any device or platform. In this comprehensive guide, we'll show you exactly how to back up data from every major platform to S3-compatible storage like DanubeData, MinIO, Wasabi, or Backblaze B2.

Why S3-Compatible Storage for Backups?

Before we dive into the how-to guides, let's understand why S3-compatible storage has become the gold standard for backups:

| Feature | Traditional Backup | S3-Compatible Storage |
| --- | --- | --- |
| Storage capacity | Limited by hardware | Unlimited |
| Durability | Single point of failure | 99.999999999% (11 9s) |
| Accessibility | Local network only | Access from anywhere |
| Cost | High upfront hardware cost | Pay only for what you use |
| Maintenance | Hardware failures, updates | Zero maintenance |
| Encryption | Optional, complex | Built-in at rest & in transit |

What You'll Need

To follow this guide, you'll need:

  • An S3-compatible storage account (we'll use DanubeData examples)
  • Your endpoint URL, access key, and secret key
  • A bucket created for your backups

If you're using DanubeData Object Storage, you can find these credentials in your dashboard under Storage → Access Keys.
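
Before configuring any backup tool, it's worth a quick sanity check that the credentials and endpoint actually work. Here's a minimal test with the AWS CLI, assuming you've stored your keys with aws configure (the endpoint and bucket names are the same placeholders used throughout this guide):

# List the bucket to confirm credentials and endpoint are correct
aws s3 ls s3://your-backup-bucket --endpoint-url https://s3.danubedata.com

# Upload and delete a test object for a full round trip
echo "test" > test.txt
aws s3 cp test.txt s3://your-backup-bucket/test.txt --endpoint-url https://s3.danubedata.com
aws s3 rm s3://your-backup-bucket/test.txt --endpoint-url https://s3.danubedata.com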

1. Backing Up Android Devices

Your Android phone contains irreplaceable photos, messages, and app data. Here's how to automatically back everything up to S3 storage.

Method 1: FolderSync Pro (Recommended)

FolderSync Pro ($5.99) is the most reliable Android app for S3 backups with scheduling, filters, and two-way sync.

Setup Steps:

  1. Install FolderSync Pro from the Play Store
  2. Tap Accounts → Add Account
  3. Select Amazon S3 (works with any S3-compatible storage)
  4. Configure your connection:
# FolderSync S3 Configuration
Account Name: DanubeData Backup
Login (Access Key): YOUR_ACCESS_KEY
Password (Secret Key): YOUR_SECRET_KEY
Server Address: s3.danubedata.com
Port: 443
Use SSL: Yes
Use Path Style: Yes
Region: eu-central-1

Create a Backup Folder Pair:

  1. Tap Folderpairs → Add Folderpair
  2. Name it "Phone Photos Backup"
  3. Select your DanubeData account
  4. Choose sync type: To Remote (one-way backup)
  5. Local folder: /DCIM/Camera
  6. Remote folder: /android-backup/photos
  7. Enable Scheduled sync: Daily at 2:00 AM (while charging)

Recommended Folders to Back Up:

# Essential Android backup paths
/DCIM/Camera          → Photos and videos
/Pictures/Screenshots → Screenshots
/Download             → Downloaded files
/Documents            → Documents
/WhatsApp/Media       → WhatsApp photos/videos
/DCIM/Facebook        → Facebook photos
/Recordings           → Voice recordings

Method 2: Autosync for S3 (Free Option)

For a free alternative, Autosync for Amazon S3 provides basic S3 backup functionality:

  1. Install from Play Store
  2. Add S3 account with custom endpoint
  3. Select folders to sync
  4. Enable Wi-Fi only uploads to save mobile data

Method 3: Termux + Rclone (Power Users)

For complete control, use rclone in Termux:

# Install Termux from F-Droid (not Play Store)
# Open Termux and run:
pkg update && pkg upgrade
pkg install rclone

# Grant Termux access to shared storage (one-time, needed for /sdcard)
termux-setup-storage

# Configure rclone
rclone config

# n) New remote
# name> danubedata
# Storage> s3
# provider> Other
# access_key_id> YOUR_ACCESS_KEY
# secret_access_key> YOUR_SECRET_KEY
# region> eu-central-1
# endpoint> https://s3.danubedata.com

# Test connection
rclone ls danubedata:your-bucket

# Backup photos
rclone sync /sdcard/DCIM/Camera danubedata:your-bucket/android/photos

# Create automated backup script
cat << 'EOF' > ~/backup.sh
#!/bin/bash
rclone sync /sdcard/DCIM danubedata:backups/android/dcim --progress
rclone sync /sdcard/Documents danubedata:backups/android/documents --progress
EOF
chmod +x ~/backup.sh
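
Termux ships without a cron daemon, so the script won't run on a schedule by itself. One option (our suggestion, not the only route) is the cronie package together with termux-services:

# Install a cron daemon and enable it
pkg install cronie termux-services
sv-enable crond

# Schedule the backup script nightly at 2 AM
crontab -e
# Add: 0 2 * * * ~/backup.sh >> ~/backup.log 2>&1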

2. Backing Up Mac Computers

macOS users have excellent options for S3 backups, from GUI apps to powerful command-line tools.

Method 1: Arq Backup (Best for Mac)

Arq Backup ($49.99 one-time) is the gold standard for Mac backups with S3 support, versioning, and client-side encryption.

Setup Steps:

  1. Download and install Arq Backup
  2. Click Add Storage Location → S3-Compatible
  3. Enter your credentials:
# Arq S3-Compatible Configuration
Server: s3.danubedata.com
Access Key ID: YOUR_ACCESS_KEY
Secret Access Key: YOUR_SECRET_KEY
Bucket Name: your-backup-bucket
Path: /mac-backups
Region: eu-central-1

Configure What to Back Up:

  1. Click Add Backup Plan
  2. Select folders to include:
# Recommended Mac folders to back up
~/Documents           → All your documents
~/Desktop             → Desktop files
~/Pictures            → Photos library and images
~/Movies              → Video files
~/Music               → Music library
~/Projects            → Development projects
~/Library/Application Support  → App data

Set Backup Schedule:

  • Hourly backups for important documents
  • Daily backups for photos and media
  • Enable Back up while on battery only if needed

Method 2: Rclone (Free, Powerful)

Rclone is a free, open-source command-line tool that syncs files to S3 and 50+ other cloud storage providers.

Installation:

# Install with Homebrew
brew install rclone

# Or download directly
curl https://rclone.org/install.sh | sudo bash

Configuration:

# Interactive configuration
rclone config

# Follow the prompts:
# n) New remote
# name> danubedata
# Storage> s3
# provider> Other (S3 Compatible)
# env_auth> false
# access_key_id> YOUR_ACCESS_KEY
# secret_access_key> YOUR_SECRET_KEY
# region> eu-central-1
# endpoint> https://s3.danubedata.com
# location_constraint> (leave empty)
# acl> private
# Edit advanced config?> n
# Keep this remote?> y

Backup Commands:

# Sync Documents folder
rclone sync ~/Documents danubedata:your-bucket/mac/documents --progress

# Sync with bandwidth limit (10 MB/s)
rclone sync ~/Pictures danubedata:your-bucket/mac/pictures --bwlimit 10M --progress

# Backup with client-side encryption: first create a "crypt" remote that
# wraps the S3 remote (see Best Practices below), then sync to it
rclone sync ~/Documents danubedata-crypt:mac/documents

# Dry run (see what would be uploaded)
rclone sync ~/Documents danubedata:your-bucket/mac/documents --dry-run

# Sync only specific file types
rclone sync ~/Documents danubedata:your-bucket/mac/documents --include "*.pdf" --include "*.docx"

Automated Backup with launchd:

# Create launch agent
cat << 'EOF' > ~/Library/LaunchAgents/com.rclone.backup.plist
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.rclone.backup</string>
    <key>ProgramArguments</key>
    <array>
        <!-- Confirm the path with "which rclone"; Homebrew on Apple Silicon installs to /opt/homebrew/bin -->
        <string>/usr/local/bin/rclone</string>
        <string>sync</string>
        <string>/Users/YOUR_USERNAME/Documents</string>
        <string>danubedata:your-bucket/mac/documents</string>
        <string>--log-file=/tmp/rclone-backup.log</string>
    </array>
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>3</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
    <key>RunAtLoad</key>
    <false/>
</dict>
</plist>
EOF

# Load the launch agent
launchctl load ~/Library/LaunchAgents/com.rclone.backup.plist
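
To confirm the agent registered and that the first scheduled run succeeded:

# The status column shows the last exit code; 0 means success
launchctl list | grep com.rclone.backup

# Review the transfer log written by the agent
tail -n 20 /tmp/rclone-backup.log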

Method 3: Cyberduck (GUI for Manual Backups)

Cyberduck is a free file browser that supports S3-compatible storage with drag-and-drop uploads.

  1. Download Cyberduck
  2. Click Open Connection
  3. Select Amazon S3
  4. Server: s3.danubedata.com
  5. Enter Access Key ID and Secret Access Key
  6. Drag and drop files to upload

3. Backing Up Windows PCs

Windows users have several excellent options for backing up to S3 storage.

Method 1: Duplicati (Free, Open Source)

Duplicati is the best free backup solution for Windows with S3 support, encryption, and a web-based UI.

Installation:

  1. Download from duplicati.com
  2. Install and launch—it opens a web interface at http://localhost:8200

Setup S3 Backup:

  1. Click Add Backup
  2. Choose Configure a new backup
  3. Set backup name: "Windows Backup to S3"
  4. Select S3 Compatible as destination
  5. Configure connection:
# Duplicati S3 Configuration
Server (custom): s3.danubedata.com
Bucket name: your-backup-bucket
Bucket create region: eu-central-1
Storage class: Standard
AWS Access ID: YOUR_ACCESS_KEY
AWS Secret Key: YOUR_SECRET_KEY
Client library: MinIO SDK (or AWS SDK)

Select Source Data:

# Recommended Windows folders
C:\Users\YourName\Documents
C:\Users\YourName\Desktop
C:\Users\YourName\Pictures
C:\Users\YourName\Videos
C:\Users\YourName\Downloads
D:\Projects (if you have a projects drive)

Configure Schedule:

  • Daily at 2:00 AM
  • Keep backups for 30 days
  • Smart retention: Keep 7 daily, 4 weekly, 12 monthly versions

Method 2: Rclone for Windows

# Download rclone for Windows
# https://rclone.org/downloads/

# Extract and add to PATH
# Open Command Prompt or PowerShell

# Configure
rclone config
# Follow same steps as Mac section

# Basic backup
rclone sync "C:UsersYourNameDocuments" danubedata:your-bucket/windows/documents

# Create a batch script for automated backups
# backup.bat
@echo off
rclone sync "C:Users\%USERNAME%Documents" danubedata:your-bucket/windows/documents --progress
rclone sync "C:Users\%USERNAME%Pictures" danubedata:your-bucket/windows/pictures --progress
rclone sync "C:Users\%USERNAME%Desktop" danubedata:your-bucket/windows/desktop --progress

Schedule with Task Scheduler:

  1. Open Task Scheduler
  2. Create Basic Task → "S3 Backup"
  3. Trigger: Daily at 3:00 AM
  4. Action: Start a program → C:\rclone\backup.bat
  5. Enable "Run whether user is logged on or not"

Method 3: CloudBerry Backup (MSP Tools)

For enterprise Windows backup with advanced features like bare-metal recovery:

  1. Download CloudBerry Backup (now MSP360)
  2. Add S3-compatible storage account
  3. Configure backup plan with VSS for open files
  4. Enable image-based backup for system recovery

4. Backing Up WordPress Sites

WordPress sites need regular backups of both files and databases. Here's how to automate this to S3 storage.

Method 1: UpdraftPlus (Easiest)

UpdraftPlus is the most popular WordPress backup plugin with native S3 support.

Installation:

  1. Install UpdraftPlus from WordPress plugins
  2. Go to Settings → UpdraftPlus Backups
  3. Click Settings tab

Configure S3 Storage:

  1. Under "Choose your remote storage", select S3-Compatible (Generic)
  2. Enter your credentials:
# UpdraftPlus S3-Compatible Configuration
S3 access key: YOUR_ACCESS_KEY
S3 secret key: YOUR_SECRET_KEY
S3 location: your-bucket-name/wordpress-backups
S3 endpoint: s3.danubedata.com

Set Backup Schedule:

# Recommended schedule for WordPress
Files backup schedule: Daily (retain 7 backups)
Database backup schedule: Daily (retain 14 backups)

# What to include:
✓ Plugins
✓ Themes
✓ Uploads
✓ Other directories in wp-content
✓ Database

Test Your Backup:

  1. Click Backup Now
  2. Verify backup appears in S3 bucket
  3. Test restore on a staging site

Method 2: WP-CLI + Rclone (Developers)

For developers who prefer command-line control:

#!/bin/bash
# wordpress-backup.sh

# Variables
SITE_PATH="/var/www/wordpress"
BACKUP_DIR="/tmp/wp-backup"
BUCKET="your-bucket"
DATE=$(date +%Y-%m-%d)

# Create backup directory
mkdir -p $BACKUP_DIR

# Backup database
wp db export $BACKUP_DIR/database-$DATE.sql --path=$SITE_PATH

# Backup wp-content
tar -czf $BACKUP_DIR/wp-content-$DATE.tar.gz -C $SITE_PATH wp-content

# Upload to S3
rclone copy $BACKUP_DIR danubedata:$BUCKET/wordpress/$DATE/

# Clean up local backups
rm -rf $BACKUP_DIR

# Delete backups older than 30 days from S3
rclone delete danubedata:$BUCKET/wordpress --min-age 30d

echo "WordPress backup completed: $DATE"

Add to Crontab:

# Daily backup at 4 AM
0 4 * * * /path/to/wordpress-backup.sh >> /var/log/wp-backup.log 2>&1
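
Restoring is the same pipeline in reverse. A rough sketch using the placeholders from the backup script above (substitute the dated prefix you want to recover):

#!/bin/bash
# wordpress-restore.sh (sketch)
DATE="YYYY-MM-DD"   # the backup folder to restore from

# Pull the backup down from S3
rclone copy danubedata:your-bucket/wordpress/$DATE/ /tmp/wp-restore/

# Import the database and unpack wp-content
wp db import /tmp/wp-restore/database-$DATE.sql --path=/var/www/wordpress
tar -xzf /tmp/wp-restore/wp-content-$DATE.tar.gz -C /var/www/wordpress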

Method 3: BackWPup (Free Alternative)

BackWPup is a free plugin with S3 support:

  1. Install BackWPup from WordPress plugins
  2. Create new job → select "S3 Service"
  3. Configure with custom S3 endpoint
  4. Set schedule and retention

5. Backing Up Linux Servers

Linux servers require robust, scriptable backup solutions. Here are the best approaches.

Method 1: Restic (Recommended)

Restic is a modern backup program with built-in encryption, deduplication, and S3 support.

Installation:

# Ubuntu/Debian
apt install restic

# Or download the latest release binary from
# https://github.com/restic/restic/releases (assets are versioned,
# e.g. restic_<version>_linux_amd64.bz2), then:
bunzip2 restic_*_linux_amd64.bz2
chmod +x restic_*_linux_amd64
mv restic_*_linux_amd64 /usr/local/bin/restic

Initialize Repository:

# Set environment variables
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"
export RESTIC_PASSWORD="your-encryption-password"
export RESTIC_REPOSITORY="s3:https://s3.danubedata.com/your-bucket/server-backups"

# Initialize (first time only)
restic init

Backup Commands:

# Backup home directory
restic backup /home

# Backup with exclusions
restic backup /var/www \
  --exclude="*.log" \
  --exclude="node_modules" \
  --exclude=".git" \
  --exclude="vendor"

# Backup multiple directories
restic backup /etc /var/www /home --verbose

# Backup with tags
restic backup /var/www --tag webserver --tag production

# View snapshots
restic snapshots

# Restore specific snapshot
restic restore abc123 --target /restore/path

Automated Backup Script:

#!/bin/bash
# /usr/local/bin/restic-backup.sh

export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"
export RESTIC_PASSWORD="your-encryption-password"
export RESTIC_REPOSITORY="s3:https://s3.danubedata.com/your-bucket/server-backups"

# Run backup (note: raw /var/lib/mysql files can be inconsistent on a
# live server; prefer the database dumps shown in Method 3 below)
restic backup \
  /etc \
  /home \
  /var/www \
  /var/lib/mysql \
  --exclude="*.log" \
  --exclude="node_modules" \
  --exclude=".cache" \
  --verbose

# Clean up old snapshots (keep 7 daily, 4 weekly, 6 monthly)
restic forget \
  --keep-daily 7 \
  --keep-weekly 4 \
  --keep-monthly 6 \
  --prune

# Check repository integrity (weekly)
if [ $(date +%u) -eq 7 ]; then
  restic check
fi

echo "Backup completed at $(date)"

Add to Crontab:

# Daily backup at 2 AM
0 2 * * * /usr/local/bin/restic-backup.sh >> /var/log/restic-backup.log 2>&1
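
On systemd-based distributions you may prefer a timer to cron; timers log to the journal and, with Persistent=true, catch up on runs missed while the machine was off. A minimal sketch (the unit names are our own):

# /etc/systemd/system/restic-backup.service
[Unit]
Description=Restic backup to S3

[Service]
Type=oneshot
ExecStart=/usr/local/bin/restic-backup.sh

# /etc/systemd/system/restic-backup.timer
[Unit]
Description=Daily restic backup at 2 AM

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target

# Activate with:
# systemctl daemon-reload && systemctl enable --now restic-backup.timer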

Method 2: Duplicity (Encrypted, Incremental)

Duplicity creates encrypted, bandwidth-efficient incremental backups.

# Install
apt install duplicity python3-boto3

# Duplicity encrypts with GPG; set a passphrase (or pass --no-encryption)
export PASSPHRASE="your-encryption-passphrase"

# Backup command (boto3 backend: boto3+s3://bucket/prefix plus an endpoint URL)
duplicity \
  --s3-endpoint-url https://s3.danubedata.com \
  /var/www \
  boto3+s3://your-bucket/server-backup

# Full backup monthly, incremental daily
duplicity \
  --full-if-older-than 30D \
  --s3-endpoint-url https://s3.danubedata.com \
  /var/www \
  boto3+s3://your-bucket/server-backup

# Restore
duplicity restore \
  --s3-endpoint-url https://s3.danubedata.com \
  boto3+s3://your-bucket/server-backup \
  /restore/path
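
A backup you have never verified is a guess. Duplicity ships a verify command that compares the archive against the live directory:

# Compare the latest backup chain against /var/www
duplicity verify \
  --s3-endpoint-url https://s3.danubedata.com \
  boto3+s3://your-bucket/server-backup \
  /var/www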

Method 3: Database-Specific Backups

MySQL/MariaDB:

#!/bin/bash
# mysql-backup.sh

DB_USER="backup_user"
DB_PASS="backup_password"
DATE=$(date +%Y-%m-%d_%H-%M)

# Dump all databases (--single-transaction gives a consistent InnoDB snapshot)
mysqldump -u$DB_USER -p$DB_PASS --single-transaction --all-databases | gzip > /tmp/mysql-$DATE.sql.gz

# Upload to S3
rclone copy /tmp/mysql-$DATE.sql.gz danubedata:your-bucket/mysql-backups/

# Clean up
rm /tmp/mysql-$DATE.sql.gz

PostgreSQL:

#!/bin/bash
# postgresql-backup.sh

DATE=$(date +%Y-%m-%d_%H-%M)

# Dump all databases (run as a superuser, typically the postgres account)
pg_dumpall | gzip > /tmp/postgres-$DATE.sql.gz

# Upload to S3
rclone copy /tmp/postgres-$DATE.sql.gz danubedata:your-bucket/postgres-backups/

# Clean up
rm /tmp/postgres-$DATE.sql.gz
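
Both dumps are plain SQL, so restoring is a single pipeline. A sketch, assuming the gzipped files produced by the scripts above:

# MySQL/MariaDB: stream the dump back into the server
gunzip < mysql-DATE.sql.gz | mysql -u root -p

# PostgreSQL: pg_dumpall output restores through psql
gunzip < postgres-DATE.sql.gz | psql -U postgres postgres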

6. Backing Up NAS Devices

NAS devices like Synology, QNAP, and TrueNAS have built-in S3 backup support.

Synology NAS (Hyper Backup)

  1. Open Hyper Backup from Package Center
  2. Click + → Data backup task
  3. Select S3 Storage as destination
  4. Configure connection:
# Synology Hyper Backup S3 Configuration
Server type: Custom server URL
Server address: s3.danubedata.com
Signature version: v4
Access key: YOUR_ACCESS_KEY
Secret key: YOUR_SECRET_KEY
Bucket name: your-backup-bucket
Directory: /synology-backup

Select Shared Folders:

  • Check folders to back up (photos, documents, media)
  • Optionally include application settings
  • Set backup schedule (daily recommended)

QNAP NAS (Hybrid Backup Sync)

  1. Open Hybrid Backup Sync
  2. Go to Backup & Restore
  3. Create new backup job
  4. Select S3-Compatible as cloud storage
  5. Enter S3 credentials and endpoint

TrueNAS (Cloud Sync Tasks)

  1. Go to Tasks → Cloud Sync Tasks
  2. Add new credential → S3
  3. Enter access key, secret key, and endpoint URL
  4. Create sync task with schedule

7. Backing Up Docker Volumes

Docker containers often store data in volumes that need backing up.

#!/bin/bash
# docker-backup.sh

BACKUP_DIR="/tmp/docker-backups"
DATE=$(date +%Y-%m-%d)

mkdir -p $BACKUP_DIR

# List all volumes
docker volume ls -q | while read volume; do
  echo "Backing up volume: $volume"

  # Create temporary container to access volume
  docker run --rm \
    -v $volume:/source:ro \
    -v $BACKUP_DIR:/backup \
    alpine \
    tar -czf /backup/$volume-$DATE.tar.gz -C /source .
done

# Upload to S3
rclone sync $BACKUP_DIR danubedata:your-bucket/docker-volumes/$DATE/

# Clean up
rm -rf $BACKUP_DIR

echo "Docker volume backup completed"

8. Backing Up Kubernetes Volumes

For Kubernetes persistent volumes, use Velero:

# Install Velero with S3 backend
velero install \
  --provider aws \
  --plugins velero/velero-plugin-for-aws:v1.8.0 \
  --bucket your-bucket \
  --secret-file ./credentials-velero \
  --backup-location-config \
    region=eu-central-1,s3ForcePathStyle=true,s3Url=https://s3.danubedata.com

# Create backup
velero backup create my-backup --include-namespaces my-app

# Schedule daily backups
velero schedule create daily-backup \
  --schedule="0 2 * * *" \
  --include-namespaces my-app \
  --ttl 720h   # Keep for 30 days

# Restore
velero restore create --from-backup my-backup
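
Before trusting the schedule, inspect the first backup's status and logs:

# Shows phase (Completed/PartiallyFailed), errors, and per-resource details
velero backup describe my-backup --details
velero backup logs my-backup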

Best Practices for S3 Backups

1. Enable Versioning

Versioning protects against accidental deletions and overwrites:

# Enable bucket versioning
aws s3api put-bucket-versioning \
  --bucket your-bucket \
  --versioning-configuration Status=Enabled \
  --endpoint-url https://s3.danubedata.com
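
Confirm it took effect with the matching get call:

# Should print "Status": "Enabled"
aws s3api get-bucket-versioning \
  --bucket your-bucket \
  --endpoint-url https://s3.danubedata.com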

2. Set Up Lifecycle Rules

Automatically delete old backups to manage costs:

# lifecycle.json
{
  "Rules": [
    {
      "ID": "Delete old backups",
      "Status": "Enabled",
      "Filter": {"Prefix": "backups/"},
      "Expiration": {"Days": 90},
      "NoncurrentVersionExpiration": {"NoncurrentDays": 30}
    }
  ]
}

# Apply lifecycle policy
aws s3api put-bucket-lifecycle-configuration \
  --bucket your-bucket \
  --lifecycle-configuration file://lifecycle.json \
  --endpoint-url https://s3.danubedata.com

3. Use Client-Side Encryption

For sensitive data, encrypt before uploading:

# Rclone with crypt overlay
rclone config
# Create new remote with type "crypt"
# Point to your S3 remote as the encrypted destination
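
The same setup can be scripted non-interactively. A sketch (the crypt remote name is our placeholder; rclone stores passwords in obscured form, hence rclone obscure):

# Create a crypt remote that wraps the S3 remote
rclone config create danubedata-crypt crypt \
  remote=danubedata:your-bucket/encrypted \
  password=$(rclone obscure "your-encryption-password")

# Anything synced here is encrypted client-side before upload
rclone sync ~/Documents danubedata-crypt:mac/documents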

4. Monitor Backup Health

Set up alerts for backup failures:

# Add to backup script
if [ $? -ne 0 ]; then
  curl -X POST "https://hooks.slack.com/your-webhook" \
    -H "Content-Type: application/json" \
    -d '{"text": "Backup FAILED on '$(hostname)' at '$(date)'"}'
fi

5. Test Restores Regularly

A backup is only as good as its ability to restore:

# Monthly restore test
restic restore latest --target /tmp/restore-test --include "/etc"
diff -r /etc /tmp/restore-test/etc

Backup Checklist by Platform

| Platform | Best Tool | Schedule | Key Setting |
| --- | --- | --- | --- |
| Android | FolderSync Pro | Daily (night) | Wi-Fi only |
| Mac | Arq or rclone | Hourly/Daily | Exclude caches |
| Windows | Duplicati | Daily | VSS for open files |
| WordPress | UpdraftPlus | Daily DB, weekly files | Exclude logs |
| Linux Server | Restic | Daily | Encryption enabled |
| Synology NAS | Hyper Backup | Daily | Compression on |
| Docker | rclone + script | Daily | Stop containers first |

Get Started with DanubeData Object Storage

Ready to start backing up your data? DanubeData Object Storage gives you:

  • S3-Compatible API - Works with all tools in this guide
  • European Data Centers - GDPR-compliant storage in Germany
  • 1TB Included - Base plan includes 1TB storage + 1TB transfer
  • Versioning - Protect against accidental deletions
  • 99.99% Durability - Your data is safe
  • Simple Pricing - €3.99/month base, no hidden fees

👉 Create Your First Bucket - Start backing up in minutes

Already have data in AWS S3 or another provider? Check out our migration guide to move your backups to European storage.

Questions about backing up your specific setup? Contact our team—we're happy to help you design the perfect backup strategy.


Ready to Get Started?

Deploy your infrastructure in minutes with DanubeData's managed services.