Your NAS is likely the central hub for your family photos, home videos, important documents, and media library. But what happens when the NAS itself fails, gets stolen, or your house experiences a fire or flood?
This guide shows you how to set up automatic off-site backups from your Synology, QNAP, TrueNAS, or Unraid NAS to S3-compatible cloud storage.
Why Back Up Your NAS to the Cloud?
RAID is not backup. Your NAS protects against drive failures, but not against:
- Ransomware - Encrypts every file it can reach, including those on the NAS
- Theft - NAS gets stolen along with your other electronics
- Fire/Flood - Physical disasters destroy everything on-site
- Accidental Deletion - User error can wipe important folders
- NAS Failure - Controller, motherboard, or multiple drives fail simultaneously
- Corruption - Silent data corruption can spread to all drives
| Threat | RAID Protection | S3 Backup |
|---|---|---|
| Single Drive Failure | Protected | Protected |
| Multiple Drive Failure | Depends on RAID level | Protected |
| Ransomware | NOT protected | Protected (versioning) |
| Theft | NOT protected | Protected |
| Fire/Flood | NOT protected | Protected |
| Accidental Deletion | NOT protected | Protected (versioning) |
What You'll Need
- A NAS device (Synology, QNAP, TrueNAS, or Unraid)
- S3-compatible storage credentials:
  - Endpoint: s3.danubedata.com
  - Access Key ID
  - Secret Access Key
  - Bucket name
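Before configuring the NAS itself, you can confirm the credentials work from any computer with the aws CLI installed. A quick sanity check (the keys and bucket name below are placeholders for your own values):
# Provide the credentials via environment variables for a one-off test
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
# List the bucket - an empty listing with no error means the endpoint,
# keys, and bucket name are all correct
aws s3 ls s3://your-bucket-name --endpoint-url https://s3.danubedata.com
If the command fails with an access or signature error, fix the credentials before moving on to the NAS apps below.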
Synology NAS (Hyper Backup)
Synology's Hyper Backup is an excellent built-in backup tool with S3 support.
Step 1: Install Hyper Backup
- Open Package Center
- Search for "Hyper Backup"
- Click Install
Step 2: Create Backup Task
- Open Hyper Backup
- Click + → Data backup task
- Select S3 Storage
Step 3: Configure S3 Connection
# Synology Hyper Backup S3 Configuration
S3 Server: Custom Server URL
Server address: s3.danubedata.com
Signature Version: v4
Request style: Path-style
Access key: YOUR_ACCESS_KEY_ID
Secret key: YOUR_SECRET_ACCESS_KEY
Bucket name: your-bucket-name
Directory: /synology-backup (or your preferred path)
Click Next after the connection test succeeds.
Step 4: Select Shared Folders
Choose which shared folders to back up:
# Recommended folders to back up:
✓ photos - Family photos (critical!)
✓ documents - Important documents
✓ home - User home directories
✓ video - Home videos
✓ music - Music library (if not on streaming)
✓ backup - Computer backups stored on NAS
# Usually skip:
✗ download - Temporary downloads
✗ surveillance - Large video files, usually archived separately
✗ docker - Can be recreated
Step 5: Select Applications (Optional)
You can also back up Synology application settings:
# Useful applications to back up:
✓ Synology Photos configuration
✓ Synology Drive settings
✓ Surveillance Station settings (if used)
✓ Note Station data
Step 6: Configure Backup Settings
# Task Settings
Task name: S3 Cloud Backup
Enable task notification: Yes
Enable transfer encryption: Yes (recommended)
Enable backup schedule: Yes
# Compression
Enable backup data compression: Yes
# Schedule
Daily at 3:00 AM
# Rotation Settings (versioning)
Enable backup rotation: Yes
Keep versions: Smart Recycle (recommended)
or
Keep last 30 versions
Step 7: Run First Backup
- Click Apply
- Click Back up now to start immediately
- Initial backup may take days for large libraries
- Monitor progress in Hyper Backup main window
Synology Cloud Sync (Alternative)
For simple folder synchronization instead of versioned backup:
- Install Cloud Sync from Package Center
- Click + → S3 Storage
- Enter same S3 credentials
- Map local folders to remote folders
- Choose the sync direction (Upload local changes only is recommended for backup)
QNAP NAS (Hybrid Backup Sync)
QNAP's Hybrid Backup Sync (HBS 3) provides comprehensive backup to S3 storage.
Step 1: Install HBS 3
- Open App Center
- Search for "Hybrid Backup Sync"
- Install HBS 3
Step 2: Create Storage Space
- Open HBS 3
- Go to Storage Spaces
- Click Create → S3 Compatible
- Configure connection:
# QNAP HBS 3 S3 Configuration
Name: DanubeData S3
Access Key: YOUR_ACCESS_KEY_ID
Secret Key: YOUR_SECRET_ACCESS_KEY
Service Endpoint: https://s3.danubedata.com
Signature Version: v4
Region: eu-central-1
Bucket: your-bucket-name
Step 3: Create Backup Job
- Go to Backup & Restore
- Click Create → New Backup Job
- Select S3 Compatible as destination
- Choose your storage space created above
Step 4: Select Source Folders
# Select folders to back up:
✓ /Photos
✓ /Documents
✓ /Video
✓ /Music
✓ /Homes
✓ /Backups
Step 5: Configure Schedule & Policies
# Schedule
Run schedule: Daily at 2:00 AM
# Policies
Enable versioning: Yes
Version limit: Smart versioning
- Keep all versions for: 7 days
- Keep one version per day for: 30 days
- Keep one version per week for: 12 weeks
- Keep one version per month for: Forever
# Options
Enable data compression: Yes
Enable encryption: Yes (set password)
Step 6: Run and Monitor
- Click Create
- Click Run to start backup
- Monitor in Overview tab
TrueNAS (Cloud Sync Tasks)
TrueNAS (formerly FreeNAS) includes built-in cloud sync capability.
Step 1: Add Cloud Credentials
- Go to System → Cloud Credentials
- Click Add
- Configure:
# TrueNAS Cloud Credentials
Name: DanubeData S3
Provider: Amazon S3
Access Key ID: YOUR_ACCESS_KEY_ID
Secret Access Key: YOUR_SECRET_ACCESS_KEY
# Advanced Settings
Endpoint URL: https://s3.danubedata.com
Region: eu-central-1
Step 2: Create Cloud Sync Task
- Go to Tasks → Cloud Sync Tasks
- Click Add
- Configure:
# Cloud Sync Task Configuration
Description: NAS Backup to S3
Direction: PUSH (backup to cloud)
Transfer Mode: SYNC (mirrors local to remote; deletions propagate, so choose COPY if you want deleted files kept in the bucket)
Credential: DanubeData S3 (created above)
Bucket: your-bucket-name
Folder: /truenas-backup
# Directory/Files
Select the dataset(s) to back up
# Schedule
Run schedule: Daily at 03:00
# Advanced Options
Follow Symlinks: No
Pre-Script: (optional - database dumps; see the sketch below)
Post-Script: (optional - notifications)
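The Pre-Script field runs a command before each sync, which is handy for dumping databases into the dataset so the dump is included in the upload. A minimal sketch, assuming a PostgreSQL instance and a db-dumps folder inside the synced dataset (both are placeholders, not something this guide sets up):
#!/bin/bash
# Illustrative Pre-Script: dump a database into the dataset being synced
# so the dump rides along with the backup (paths and database name are
# placeholders - adapt to what actually runs on your system)
pg_dump -U postgres mydb > /mnt/pool/data/db-dumps/mydb-$(date +%F).sql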
Using Rclone on TrueNAS
For more control, use rclone directly via SSH:
# SSH into TrueNAS
ssh root@truenas-ip
# Configure rclone
rclone config
# Set up S3 remote as shown in other guides
# Create backup script
mkdir -p /root/scripts
cat > /root/scripts/s3-backup.sh << 'EOF'
#!/bin/bash
rclone sync /mnt/pool/data danubedata:your-bucket/truenas-backup \
    --exclude "*.tmp" \
    --exclude ".Trash*/**" \
    --log-file /var/log/rclone-backup.log \
    --log-level INFO
EOF
chmod +x /root/scripts/s3-backup.sh
# Add to crontab (on TrueNAS, prefer Tasks → Cron Jobs in the web UI,
# since manual crontab edits may not survive reboots or upgrades)
crontab -e
0 3 * * * /root/scripts/s3-backup.sh
Unraid
Unraid can use the Rclone plugin or Duplicati Docker container.
Method 1: Rclone Plugin
- Go to Apps (Community Applications)
- Search for "rclone"
- Install rclone plugin
- Access rclone via terminal:
# Configure rclone
rclone config
# Set up S3 remote
# name> danubedata
# type> s3
# provider> Other
# access_key_id> YOUR_ACCESS_KEY
# secret_access_key> YOUR_SECRET_KEY
# endpoint> https://s3.danubedata.com
# Test connection
rclone ls danubedata:your-bucket
# Create backup script (User Scripts plugin location)
mkdir -p /boot/config/plugins/user.scripts/scripts/s3-backup
cat > /boot/config/plugins/user.scripts/scripts/s3-backup/script << 'EOF'
#!/bin/bash
# Unraid S3 Backup Script
SHARES_TO_BACKUP="photos documents media"
BUCKET="your-bucket"
for share in $SHARES_TO_BACKUP; do
echo "Backing up $share..."
rclone sync "/mnt/user/$share" "danubedata:$BUCKET/unraid/$share"
--progress --log-level INFO
done
echo "Backup completed!"
EOF
Method 2: Duplicati Docker Container
- Go to Apps
- Search for "Duplicati"
- Install Duplicati container
- Access Duplicati web UI (usually port 8200)
- Configure the S3 backup as shown in the Windows guide
What to Back Up from Your NAS
| Content Type | Priority | Typical Size | Notes |
|---|---|---|---|
| Family Photos | Critical | 50GB - 500GB | Irreplaceable! |
| Home Videos | Critical | 100GB - 2TB | Irreplaceable! |
| Documents | Critical | 1GB - 50GB | Tax docs, contracts, etc. |
| Computer Backups | High | 100GB - 1TB | Time Machine, etc. |
| Music Library | Medium | 50GB - 500GB | Often replaceable via streaming |
| Movie Collection | Low | 1TB - 50TB | Usually replaceable |
| Downloads | Skip | Varies | Temporary files |
Best Practices for NAS Backups
1. Prioritize Irreplaceable Data
Back up photos and videos first. Movies can be re-ripped or re-downloaded, but your kids' first steps video cannot be replaced.
2. Enable Versioning
Versioning protects against ransomware and accidental deletion. Even if files get encrypted or deleted, you can restore previous versions.
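Hyper Backup and HBS 3 keep versions inside the backup job itself. For the sync-style tools (Cloud Sync, TrueNAS Cloud Sync, plain rclone sync) you can get similar protection by enabling versioning on the bucket, assuming your provider supports the standard S3 versioning API; a sketch using the aws CLI with the endpoint from this guide:
# Turn on object versioning for the backup bucket
aws s3api put-bucket-versioning \
    --endpoint-url https://s3.danubedata.com \
    --bucket your-bucket-name \
    --versioning-configuration Status=Enabled
# Confirm the setting took effect
aws s3api get-bucket-versioning \
    --endpoint-url https://s3.danubedata.com \
    --bucket your-bucket-name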
3. Schedule During Off-Peak Hours
Large initial backups can take days. Schedule them for nights/weekends when NAS usage is low.
4. Use Compression
Enable compression to reduce storage costs and upload time. Most NAS backup tools support this.
5. Monitor Backup Health
Check backup status weekly. Set up email notifications for failures.
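For the rclone scripts shown earlier, even a simple log check goes a long way; a sketch assuming the log path used in the TrueNAS script (adjust to your own setup):
# Alert if the backup log contains any errors
if grep -q "ERROR" /var/log/rclone-backup.log; then
    echo "rclone backup reported errors - check the log" >&2
    exit 1
fi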
6. Test Restores
Periodically restore a few files to verify backups are working correctly.
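With the rclone-based setups you can do this from the command line. A sketch using the placeholder paths from earlier sections: pull a small folder into a scratch directory, compare it against the live copy, then optionally verify checksums for the whole backup without downloading anything:
# Restore one folder to a scratch directory
rclone copy danubedata:your-bucket/truenas-backup/photos /tmp/restore-test --progress
# Compare the restored files against the live copies on the NAS
diff -r /mnt/pool/data/photos /tmp/restore-test && echo "Restore test OK"
# Verify checksums of the whole backup without downloading it
rclone check /mnt/pool/data danubedata:your-bucket/truenas-backup --one-way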
Bandwidth Considerations
Initial backup of a large NAS can take significant time:
| Data Size | 10 Mbps Upload | 50 Mbps Upload | 100 Mbps Upload |
|---|---|---|---|
| 100 GB | ~24 hours | ~5 hours | ~2.5 hours |
| 500 GB | ~5 days | ~24 hours | ~12 hours |
| 1 TB | ~10 days | ~2 days | ~24 hours |
| 5 TB | ~50 days | ~10 days | ~5 days |
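These estimates follow from simple arithmetic plus a margin for protocol overhead: 1 TB is roughly 8,000,000 megabits, so at a steady 50 Mbps the raw transfer takes about 8,000,000 ÷ 50 ≈ 160,000 seconds, or around 44 hours, which rounds to the ~2 days shown above.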
Tips for slow connections:
- Back up critical data (photos) first, then add more folders over time
- Enable bandwidth throttling during work hours (see the rclone sketch below)
- Be patient—incremental backups after the initial sync are much faster
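For the rclone-based setups, throttling can be scheduled right in the sync command using rclone's bandwidth timetable; a sketch with example values (2 MiB/s during working hours, unlimited overnight):
# Limit uploads to 2 MiB/s from 08:00, lift the limit at 18:00
rclone sync /mnt/pool/data danubedata:your-bucket/truenas-backup \
    --bwlimit "08:00,2M 18:00,off"
Hyper Backup and HBS 3 also include their own bandwidth controls if you prefer to set this in the GUI.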
Cost Estimation
| NAS Data | Monthly Cost | Notes |
|---|---|---|
| Up to 1 TB | €3.99 | Included in base plan |
| 2 TB | ~€8 | +€4 for extra TB |
| 5 TB | ~€20 | Much cheaper than consumer clouds |
| 10 TB | ~€40 | Perfect for media-heavy NAS |
Get Started
Protect your NAS data with off-site cloud backup:
- Create a DanubeData account
- Create a storage bucket
- Generate access keys
- Configure backup in your NAS (Hyper Backup, HBS 3, etc.)
- Start with your photos—they're irreplaceable!
👉 Create Your Backup Bucket - €3.99/month includes 1TB storage
Your NAS contains a lifetime of memories. Don't leave them vulnerable to theft, fire, or ransomware.
Need help setting up NAS backup? Contact our team for personalized assistance.