How to Store CCTV and Security Camera Footage on S3 Storage (2026)

Adrian Silaghi
March 19, 2026
13 min read

#s3 #cctv #security camera #video surveillance #object storage #nvr #ip camera #hikvision #dahua #frigate #blue iris

Security cameras generate massive amounts of video data every single day. A typical 4-camera 1080p setup produces 50-100 GB per day, and a 16-camera 4K installation can easily exceed 1 TB daily. Storing all of this on a local NVR hard drive means you're one disk failure away from losing critical evidence.

In this guide, we'll show you how to use S3-compatible object storage to store your CCTV and security camera footage reliably, affordably, and with automatic retention policies that keep your storage costs under control.

Why Use S3 for Surveillance Footage?

Traditional surveillance storage relies on local hard drives inside an NVR (Network Video Recorder) or a dedicated NAS. This approach has significant limitations:

  • Hard drive failure. Surveillance-rated HDDs (like WD Purple or Seagate SkyHawk) last 3-5 years under 24/7 write loads. When they fail, you lose all recorded footage.
  • Physical theft. Burglars know to steal or destroy the NVR first. If your recording device is taken, your footage goes with it.
  • Limited capacity. Local NVRs have a fixed number of drive bays. Expanding storage means buying new hardware.
  • No offsite backup. Fire, flood, or power surges can destroy both cameras and the NVR simultaneously.
  • Single-site access. Reviewing footage from a remote location requires VPN or port-forwarding hacks that compromise security.

S3 object storage solves all of these problems:

  • Durability: Data is stored redundantly across multiple disks and servers — no single point of failure.
  • Scalability: Storage grows automatically. No need to swap drives or buy new NVR hardware.
  • Offsite by default: Footage lives in a remote data center, safe from physical theft or local disasters.
  • API access: Any S3-compatible tool can retrieve footage from anywhere with an internet connection.
  • Lifecycle policies: Automatically delete footage older than 30, 60, or 90 days to control costs.

Storage Requirements: How Much Space Do You Need?

Before setting up S3 storage, you need to understand how much data your cameras generate. The main factors are:

  • Resolution: Higher resolution = more data per frame
  • Frame rate: More frames per second = smoother video but more storage
  • Codec: H.265 (HEVC) uses roughly 50% less space than H.264 for equivalent quality
  • Recording mode: Continuous recording uses far more storage than motion-triggered recording
  • Scene complexity: Busy scenes (trees blowing, traffic) compress less efficiently

Storage Calculation Table (Continuous Recording, Per Camera, Per Day)

Resolution   | Frame Rate | Codec | Bitrate (Mbps) | Per Day (GB) | Per 30 Days (GB) | Per 90 Days (TB)
1080p (2 MP) | 15 fps     | H.264 | 4              | 42           | 1,260            | 3.7
1080p (2 MP) | 15 fps     | H.265 | 2              | 21           | 630              | 1.8
1080p (2 MP) | 25 fps     | H.264 | 6              | 63           | 1,890            | 5.5
1080p (2 MP) | 25 fps     | H.265 | 3              | 32           | 945              | 2.8
2K (4 MP)    | 20 fps     | H.264 | 8              | 84           | 2,520            | 7.4
2K (4 MP)    | 20 fps     | H.265 | 4              | 42           | 1,260            | 3.7
4K (8 MP)    | 15 fps     | H.264 | 16             | 168          | 5,040            | 14.7
4K (8 MP)    | 15 fps     | H.265 | 8              | 84           | 2,520            | 7.4
4K (8 MP)    | 25 fps     | H.264 | 24             | 252          | 7,560            | 22.1
4K (8 MP)    | 25 fps     | H.265 | 12             | 126          | 3,780            | 11.0

Note: Motion-only recording typically reduces storage by 50-80% depending on scene activity. These figures assume continuous 24/7 recording.

Quick Calculation Formula

# Storage per camera per day (in GB)
storage_gb = bitrate_mbps * 3600 * 24 / 8 / 1024

# Example: 4K H.265 at 8 Mbps
# 8 * 86400 / 8 / 1024 = 84 GB/day

# Total for your system
total_daily = storage_per_camera * number_of_cameras
total_monthly = total_daily * 30

Multi-Camera Storage Requirements

Setup                                    | Cameras | Daily (GB) | 30-Day (TB) | 90-Day (TB) | Monthly Cost*
Home (1080p H.265, motion-only)          | 4       | 17         | 0.5         | 1.5         | €3.99
Small Business (1080p H.265, continuous) | 8       | 168        | 5.0         | 15.1        | €19.95
Retail (2K H.265, continuous)            | 16      | 672        | 20.2        | 60.5        | €79.80
Warehouse (4K H.265, continuous)         | 32      | 2,688      | 80.6        | 241.9       | Contact for volume pricing

*Based on DanubeData S3 storage pricing: €3.99/month base includes 1TB storage + 1TB traffic. Additional storage €3.99/TB/month.

Camera Systems That Support S3 Directly

Some modern IP cameras and NVR firmware can upload clips or snapshots directly to S3-compatible storage via built-in cloud upload features. However, most consumer and prosumer systems require an intermediate NVR or VMS (Video Management System) to handle the S3 integration.

Cameras with Cloud/FTP Upload Capabilities

  • Hikvision: Higher-end models support FTP upload on events. Combine with an FTP-to-S3 bridge or use Hikvision NVR with cloud backup plugins.
  • Dahua: Some models support cloud storage for snapshots. Full video recording to S3 requires a VMS.
  • Axis Communications: ACAP (camera apps platform) supports custom S3 upload applications.
  • Reolink: FTP upload support for event clips; pair with rclone for S3 sync.
  • Amcrest: FTP/SFTP upload support for motion clips.

For most setups, the recommended approach is to record locally on an NVR or VMS software, then sync the recordings to S3 using rclone or a built-in integration.

NVR/VMS Software with S3 Support

Frigate NVR (Open Source, Recommended)

Frigate is the most popular open-source NVR for home and small business use. It integrates with Home Assistant, supports hardware-accelerated AI object detection (person, car, animal), and can export recordings to S3.

  • License: Free / Open Source (MIT)
  • Platform: Docker (Linux, any hardware)
  • AI Detection: Google Coral TPU, OpenVINO, ONNX
  • S3 Support: Via external sync (rclone) or custom event handlers
  • Best for: Home users, small offices, Home Assistant users

Blue Iris (Windows)

Blue Iris is the most popular commercial NVR software for Windows. It supports up to 64 cameras and can push recordings to cloud storage.

  • License: $69.95 one-time purchase
  • Platform: Windows 10/11
  • S3 Support: Via scheduled rclone sync or Windows task that uploads new recordings
  • Best for: Windows users, small to medium installations

Milestone XProtect

Milestone XProtect is an enterprise-grade VMS used by large organizations, shopping centers, and government facilities.

  • License: Free (Essential+) up to 8 cameras; paid tiers for more
  • Platform: Windows Server
  • S3 Support: Built-in archive tier can target S3-compatible storage
  • Best for: Enterprise, multi-site deployments

Shinobi (Open Source)

Shinobi is a web-based NVR that supports S3 as a native storage backend.

  • License: Free / Open Source (dual license)
  • Platform: Node.js (Linux, Docker)
  • S3 Support: Native — configure S3 as video storage directly in settings
  • Best for: Self-hosted, web-based management, direct S3 recording

ZoneMinder (Open Source)

ZoneMinder is one of the oldest open-source NVR systems, widely used in Linux environments.

  • License: Free / Open Source (GPL)
  • Platform: Linux
  • S3 Support: Via event storage backend plugins or rclone sync
  • Best for: Linux power users, large camera counts

NVR Software Comparison

Software           | Cost         | Platform       | S3 Integration    | AI Detection                | Max Cameras
Frigate            | Free         | Docker/Linux   | Via rclone sync   | Yes (Coral, ONNX)           | Unlimited
Blue Iris          | $69.95       | Windows        | Via rclone sync   | Yes (DeepStack)             | 64
Milestone XProtect | Free to $$$$ | Windows Server | Built-in archive  | Yes (plugins)               | Unlimited (paid)
Shinobi            | Free         | Docker/Node.js | Native S3 backend | Yes (plugins)               | Unlimited
ZoneMinder         | Free         | Linux          | Via rclone sync   | Basic (zmeventnotification) | Unlimited

Setting Up Frigate NVR with S3 Storage

Frigate is our recommended NVR for most users. Here's how to set it up with DanubeData S3 storage for offsite archiving.

Step 1: Create Your S3 Bucket

  1. Sign up for DanubeData
  2. Navigate to Object Storage and create a new bucket (e.g., cctv-recordings)
  3. Generate S3 access keys from the Access Keys section
  4. Note your credentials:
    Endpoint: https://s3.danubedata.ro
    Region: fsn1
    Access Key: your-access-key
    Secret Key: your-secret-key
    Bucket: cctv-recordings
    

Step 2: Deploy Frigate with Docker Compose

# docker-compose.yml
version: "3.9"
services:
  frigate:
    container_name: frigate
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:stable
    shm_size: "256mb"
    volumes:
      - ./config:/config
      - ./storage:/media/frigate
      - /etc/localtime:/etc/localtime:ro
    ports:
      - "5000:5000"   # Web UI
      - "8554:8554"   # RTSP restream
      - "8555:8555"   # WebRTC
    environment:
      - FRIGATE_RTSP_PASSWORD=your-camera-password
    devices:
      - /dev/bus/usb:/dev/bus/usb  # Google Coral USB (optional)

Step 3: Configure Frigate

# config/config.yml
mqtt:
  enabled: false  # Set to true if using Home Assistant

record:
  enabled: true
  retain:
    days: 7          # Keep recordings locally for 7 days
    mode: motion     # Only retain motion events locally
  events:
    retain:
      default: 14    # Keep event clips for 14 days locally

snapshots:
  enabled: true
  retain:
    default: 30      # Keep snapshots for 30 days locally

detect:
  width: 1280
  height: 720
  fps: 5

objects:
  track:
    - person
    - car
    - dog
    - cat

cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.100:554/stream1
          roles:
            - record
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.100:554/stream2
          roles:
            - detect
    record:
      enabled: true

  backyard:
    ffmpeg:
      inputs:
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.101:554/stream1
          roles:
            - record
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.101:554/stream2
          roles:
            - detect
    record:
      enabled: true

  garage:
    ffmpeg:
      inputs:
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.102:554/stream1
          roles:
            - record
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.102:554/stream2
          roles:
            - detect
    record:
      enabled: true

  driveway:
    ffmpeg:
      inputs:
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.103:554/stream1
          roles:
            - record
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@192.168.1.103:554/stream2
          roles:
            - detect
    record:
      enabled: true

Step 4: Set Up Rclone to Sync Frigate Recordings to S3

Frigate stores recordings in a structured directory format. We'll use rclone to sync these to S3 on a schedule.

# Install rclone
curl https://rclone.org/install.sh | sudo bash

# Configure the S3 remote
rclone config

# Interactive setup:
# n) New remote
# name> danubedata
# Storage> s3
# provider> Other
# env_auth> false
# access_key_id> YOUR_ACCESS_KEY
# secret_access_key> YOUR_SECRET_KEY
# region> fsn1
# endpoint> https://s3.danubedata.ro
# acl> private
# Edit advanced config?> n
# Keep this remote?> y
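Before wiring the remote into a sync job, it's worth a quick round-trip test (remote and bucket names as configured above):

```shell
# List buckets visible to the remote; cctv-recordings should appear
rclone lsd danubedata:

# Round-trip a small test file
echo "connectivity test" > /tmp/s3-test.txt
rclone copy /tmp/s3-test.txt danubedata:cctv-recordings/test/
rclone ls danubedata:cctv-recordings/test/
rclone deletefile danubedata:cctv-recordings/test/s3-test.txt
```

If `rclone lsd` fails with an authentication error, re-run `rclone config` and double-check the access key, secret key, and endpoint.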

Step 5: Create the Sync Script

#!/bin/bash
# sync-frigate-s3.sh - Sync Frigate recordings to S3
# Run via cron every hour

FRIGATE_STORAGE="/path/to/frigate/storage"
S3_REMOTE="danubedata"
S3_BUCKET="cctv-recordings"
LOG_FILE="/var/log/frigate-s3-sync.log"

echo "[$(date)] Starting Frigate S3 sync..." >> "$LOG_FILE"

# Sync recordings (video clips)
rclone sync "$FRIGATE_STORAGE/recordings/" \
    "$S3_REMOTE:$S3_BUCKET/recordings/" \
    --transfers 8 \
    --checkers 16 \
    --s3-upload-concurrency 4 \
    --s3-chunk-size 64M \
    --log-file="$LOG_FILE" \
    --log-level INFO \
    --exclude "*.tmp" \
    --exclude ".in_progress"

# Sync event clips
rclone sync "$FRIGATE_STORAGE/clips/" \
    "$S3_REMOTE:$S3_BUCKET/clips/" \
    --transfers 8 \
    --log-file="$LOG_FILE" \
    --log-level INFO

# Sync snapshots (much smaller)
rclone sync "$FRIGATE_STORAGE/snapshots/" \
    "$S3_REMOTE:$S3_BUCKET/snapshots/" \
    --transfers 8 \
    --log-file="$LOG_FILE" \
    --log-level INFO

echo "[$(date)] Frigate S3 sync completed." >> "$LOG_FILE"

# Report storage usage
rclone size "$S3_REMOTE:$S3_BUCKET" >> "$LOG_FILE" 2>&1

# Make executable and add to cron
chmod +x /opt/scripts/sync-frigate-s3.sh

# Run every hour
crontab -e
# Add:
0 * * * * /opt/scripts/sync-frigate-s3.sh
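On systemd-based distros, the same hourly schedule can be expressed as a service/timer pair instead of cron. This is a sketch; UNIT_DIR is parameterized so you can stage the files before copying them into place, and the script path matches the cron example above:

```shell
# Write a oneshot service plus an hourly timer (sketch)
UNIT_DIR="${UNIT_DIR:-/etc/systemd/system}"

cat > "$UNIT_DIR/frigate-s3-sync.service" <<'EOF'
[Unit]
Description=Sync Frigate recordings to S3

[Service]
Type=oneshot
ExecStart=/opt/scripts/sync-frigate-s3.sh
EOF

cat > "$UNIT_DIR/frigate-s3-sync.timer" <<'EOF'
[Unit]
Description=Hourly Frigate S3 sync

[Timer]
OnCalendar=hourly
Persistent=true

[Install]
WantedBy=timers.target
EOF

# Then activate:
# systemctl daemon-reload
# systemctl enable --now frigate-s3-sync.timer
```

Persistent=true makes systemd run a missed sync after the machine was powered off, which cron does not do.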

Using Rclone to Sync Any NVR to S3

Whether you use Blue Iris, ZoneMinder, or a hardware NVR that writes to a network share, rclone can sync the recordings to S3.

Generic NVR Sync Script

#!/bin/bash
# nvr-s3-sync.sh - Generic NVR recording sync to S3

# Configuration
NVR_RECORDING_PATH="/mnt/nvr/recordings"   # Path to NVR recordings
S3_REMOTE="danubedata"
S3_BUCKET="cctv-recordings"
RETENTION_DAYS=90                           # Keep files on S3 for 90 days
LOG_FILE="/var/log/nvr-s3-sync.log"
BANDWIDTH_LIMIT="50M"                       # Limit upload to 50 Mbps

echo "[$(date)] Starting NVR S3 sync..." >> "$LOG_FILE"

# Sync recordings to S3
# --copy-links: follow symlinks
# --bwlimit: limit bandwidth to avoid saturating upload
# --min-age: skip files being written (less than 5 minutes old)
rclone sync "$NVR_RECORDING_PATH" \
    "$S3_REMOTE:$S3_BUCKET/recordings/" \
    --transfers 4 \
    --checkers 8 \
    --bwlimit "$BANDWIDTH_LIMIT" \
    --min-age 5m \
    --copy-links \
    --exclude "*.tmp" \
    --exclude "*.lock" \
    --log-file="$LOG_FILE" \
    --log-level INFO

echo "[$(date)] Sync completed." >> "$LOG_FILE"

Blue Iris Sync (Windows)

:: sync-blueiris-s3.bat - Sync Blue Iris recordings to S3
:: Run via Windows Task Scheduler every hour

@echo off
set LOG_FILE=C:\BlueIris\Logs\s3-sync.log

echo [%date% %time%] Starting Blue Iris S3 sync... >> %LOG_FILE%

:: Blue Iris stores recordings in: C:\BlueIris\Clips and C:\BlueIris\New
rclone sync "C:\BlueIris\New" ^
    danubedata:cctv-recordings/blueiris/ ^
    --transfers 4 ^
    --bwlimit 50M ^
    --min-age 5m ^
    --log-file %LOG_FILE% ^
    --log-level INFO

echo [%date% %time%] Sync completed. >> %LOG_FILE%

Lifecycle Policies for Automatic Cleanup

Surveillance footage has a limited useful life. You don't need to keep routine recordings forever — in most cases, 30-90 days is sufficient. S3 lifecycle policies automate this cleanup.

Setting Up Lifecycle Policies with AWS CLI

# Install the AWS CLI
pip install awscli

# Configure for DanubeData S3
aws configure --profile danubedata
# AWS Access Key ID: YOUR_ACCESS_KEY
# AWS Secret Access Key: YOUR_SECRET_KEY
# Default region: fsn1
# Default output format: json

Tiered Retention Policy (30/60/90 Days by Prefix)

# lifecycle-30day.json
{
    "Rules": [
        {
            "ID": "Delete recordings after 30 days",
            "Filter": {
                "Prefix": "recordings/"
            },
            "Status": "Enabled",
            "Expiration": {
                "Days": 30
            }
        },
        {
            "ID": "Delete snapshots after 60 days",
            "Filter": {
                "Prefix": "snapshots/"
            },
            "Status": "Enabled",
            "Expiration": {
                "Days": 60
            }
        },
        {
            "ID": "Delete event clips after 90 days",
            "Filter": {
                "Prefix": "clips/"
            },
            "Status": "Enabled",
            "Expiration": {
                "Days": 90
            }
        }
    ]
}

# Apply the lifecycle policy
aws s3api put-bucket-lifecycle-configuration \
    --bucket cctv-recordings \
    --lifecycle-configuration file://lifecycle-30day.json \
    --endpoint-url https://s3.danubedata.ro \
    --profile danubedata

90-Day Blanket Retention Policy (All Prefixes)

# lifecycle-90day.json
{
    "Rules": [
        {
            "ID": "Delete all footage after 90 days",
            "Filter": {
                "Prefix": ""
            },
            "Status": "Enabled",
            "Expiration": {
                "Days": 90
            }
        }
    ]
}

# Apply
aws s3api put-bucket-lifecycle-configuration \
    --bucket cctv-recordings \
    --lifecycle-configuration file://lifecycle-90day.json \
    --endpoint-url https://s3.danubedata.ro \
    --profile danubedata

Recommended Retention Periods

Use Case                   | Retention   | Reason
Home security              | 30 days     | Most incidents are noticed within days
Retail / shop              | 30-60 days  | Shoplifting and fraud investigation window
Office / workplace         | 30-90 days  | HR and compliance requirements
Warehouse / logistics      | 60-90 days  | Shipping dispute window
Financial institution      | 90-180 days | Regulatory requirements
Government / public safety | 90-365 days | Legal and investigative requirements

Bandwidth Considerations

Uploading surveillance footage to S3 requires sufficient upload bandwidth. Here's what you need:

Upload Speed Requirements

Camera Setup                | Total Bitrate | Min Upload Speed | Recommended Upload
4x 1080p H.265 (continuous) | 8 Mbps        | 10 Mbps          | 20 Mbps
8x 1080p H.265 (continuous) | 16 Mbps       | 20 Mbps          | 50 Mbps
16x 2K H.265 (continuous)   | 64 Mbps       | 80 Mbps          | 100 Mbps
32x 4K H.265 (continuous)   | 256 Mbps      | 300 Mbps         | 500 Mbps

Important: If you don't have enough upload bandwidth for real-time cloud recording, use the hybrid approach (described below) — record locally and sync to S3 during off-peak hours.

Bandwidth Optimization Tips

  • Use H.265 (HEVC): Cuts bandwidth in half compared to H.264 with identical quality.
  • Motion-only recording: Reduces data volume by 50-80%.
  • Dual-stream: Record high-resolution locally, upload only the sub-stream (lower resolution) to S3.
  • Schedule sync: Upload during off-peak hours (midnight to 6 AM) when bandwidth is unused.
  • rclone --bwlimit: Throttle uploads to avoid impacting other network traffic.
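The sizing rules above are easy to sanity-check in a few lines of Python (the 1.25x headroom factor is our assumption, not a standard):

```python
# Quick feasibility check for real-time cloud recording.
# Bitrates in Mbps; the 1.25x headroom factor is an assumption.

def required_upload_mbps(cameras: int, bitrate_mbps: float, headroom: float = 1.25) -> float:
    """Upload bandwidth needed to push all camera streams to S3 in real time."""
    return cameras * bitrate_mbps * headroom

def daily_storage_gb(cameras: int, bitrate_mbps: float) -> float:
    """Continuous-recording storage per day: Mbps -> GB over 24 hours."""
    return cameras * bitrate_mbps * 86400 / 8 / 1024

# Example: 8x 1080p H.265 cameras at 2 Mbps each
print(required_upload_mbps(8, 2))           # 20.0 Mbps, matching the table above
print(round(daily_storage_gb(8, 2), 1))     # 168.8 GB/day before motion-only savings
```

If the required upload exceeds your line's capacity, fall back to the hybrid approach and let rclone sync on a schedule instead.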

Cost Comparison: Local NAS vs. S3 for Surveillance

Factor                           | Local NAS (Synology/QNAP)          | DanubeData S3
Upfront Cost                     | €400-800 (NAS) + €200-400 (drives) | €0
Monthly Cost (5 TB)              | €5-10 (electricity)                | €19.95
Drive Replacement (every 3-5 yr) | €100-200 per drive                 | Included
Theft Protection                 | None (can be stolen)               | Offsite by default
Disaster Protection              | None (same location)               | Offsite data center
Redundancy                       | RAID (if configured)               | Built-in replication
Scalability                      | Limited by drive bays              | Unlimited
Remote Access                    | VPN or port-forward                | API access from anywhere
Maintenance                      | Updates, drive health, UPS         | Fully managed
3-Year Total (5 TB)              | €900-1,500+                        | €718

The best approach is a hybrid setup: keep a small local NVR for immediate playback and use S3 as your offsite backup and long-term archive.

GDPR and Privacy Compliance for Surveillance Footage in the EU

If you operate security cameras in the European Union, you must comply with GDPR. Video surveillance captures personal data (people's images), and storing it on cloud services requires careful consideration.

Key GDPR Requirements for CCTV

  • Lawful basis: You need a legitimate interest (security) and must document it via a Data Protection Impact Assessment (DPIA).
  • Signage: Visible signs must inform people they are being recorded, including your identity and the purpose.
  • Data minimization: Only record areas you need to secure — don't film public sidewalks or neighboring properties unnecessarily.
  • Retention limits: Keep footage only as long as necessary. Most EU data protection authorities recommend 72 hours to 30 days maximum for routine recordings.
  • Access rights: Individuals have the right to request footage containing their image (Subject Access Request).
  • Data processing agreement: If you use a cloud provider to store footage, you need a DPA with them.
  • Data location: Footage should be stored within the EU/EEA. DanubeData stores all data in Germany (Falkenstein), fully GDPR-compliant.

Why DanubeData is GDPR-Compliant for Surveillance

  • Data center location: Falkenstein, Germany (EU)
  • No data transfers outside EU: All storage and processing stays in Germany
  • Encryption at rest: Your footage is encrypted on disk
  • Encryption in transit: All API calls use HTTPS/TLS
  • Access controls: Per-bucket access keys with granular permissions
  • Lifecycle policies: Automatic deletion after your configured retention period
  • Audit trail: API access logs for compliance reporting

Retrieval and Playback from S3

When you need to review footage stored on S3, there are several approaches depending on your needs.

Download and Play Locally

# Download a specific day's recordings
rclone copy "danubedata:cctv-recordings/recordings/2026/03/15/" \
    /tmp/footage/2026-03-15/ \
    --progress

# Download recordings from a specific camera
rclone copy "danubedata:cctv-recordings/recordings/2026/03/15/front_door/" \
    /tmp/footage/front-door/ \
    --progress

# Download a specific time window (filter by filename pattern)
rclone copy "danubedata:cctv-recordings/recordings/2026/03/15/front_door/" \
    /tmp/footage/front-door/ \
    --include "14-*" \
    --include "15-*" \
    --progress
# Downloads only files from 14:00 and 15:00

Stream Directly with VLC

For quick review, you can generate a pre-signed URL and open it in VLC or any media player:

# Generate a pre-signed URL (valid for 1 hour)
aws s3 presign \
    s3://cctv-recordings/recordings/2026/03/15/front_door/14-30-00.mp4 \
    --expires-in 3600 \
    --endpoint-url https://s3.danubedata.ro \
    --profile danubedata

# Open in VLC
vlc "https://s3.danubedata.ro/cctv-recordings/recordings/2026/03/15/front_door/14-30-00.mp4?X-Amz-..."

Browse with a Desktop S3 Client

GUI tools like Cyberduck (Mac/Windows, free), S3 Browser (Windows, free), or Mountain Duck (Mac/Windows) let you browse your surveillance footage like a file system and double-click to play video files.

Hybrid Approach: Local Cache + S3 Archive

The most practical architecture for surveillance combines local and cloud storage:

                                    +-----------------------+
                                    |   DanubeData S3       |
                                    |   (Long-term archive) |
                                    |   90-day retention    |
                                    +-----------^-----------+
                                                |
                                          rclone sync
                                          (every hour)
                                                |
+-----------+     RTSP      +----------+    +---+---+
| IP Camera |-------------->| Frigate  |--->| Local |
| (x4-16)   |               | NVR      |    | Disk  |
+-----------+               +----------+    | 7-day |
                                            +-------+

How It Works

  1. Cameras record to Frigate NVR on local storage (SSD or HDD).
  2. Frigate retains 7 days locally for fast access and real-time review.
  3. rclone syncs hourly to S3, uploading new recordings to DanubeData.
  4. S3 lifecycle policy deletes after 90 days (or your chosen retention period).
  5. Local disk auto-cleans after 7 days via Frigate's built-in retention settings.

Benefits of the Hybrid Approach

  • Fast local playback: Recent footage is available instantly without downloading from S3.
  • Offsite protection: Even if the NVR is stolen or destroyed, the last sync to S3 is safe.
  • Bandwidth-friendly: You don't need real-time upload bandwidth — rclone syncs on a schedule.
  • Cost-effective: Local disk handles the hot storage; S3 handles the cold archive.

Complete Hybrid Setup Script

#!/bin/bash
# hybrid-surveillance-sync.sh
# Syncs local NVR recordings to S3 and manages local cleanup

S3_REMOTE="danubedata"
S3_BUCKET="cctv-recordings"
LOCAL_RECORDINGS="/media/frigate/recordings"
LOCAL_CLIPS="/media/frigate/clips"
LOCAL_SNAPSHOTS="/media/frigate/snapshots"
LOG="/var/log/surveillance-sync.log"
MAX_LOCAL_DAYS=7
BANDWIDTH="50M"

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" >> "$LOG"
}

log "=== Starting hybrid surveillance sync ==="

# 1. Sync recordings to S3
log "Syncing recordings..."
rclone copy "$LOCAL_RECORDINGS" \
    "$S3_REMOTE:$S3_BUCKET/recordings/" \
    --transfers 8 \
    --bwlimit "$BANDWIDTH" \
    --min-age 5m \
    --log-file "$LOG" \
    --log-level INFO \
    --stats-one-line

# 2. Sync event clips to S3
log "Syncing event clips..."
rclone copy "$LOCAL_CLIPS" \
    "$S3_REMOTE:$S3_BUCKET/clips/" \
    --transfers 4 \
    --bwlimit "$BANDWIDTH" \
    --log-file "$LOG" \
    --log-level INFO

# 3. Sync snapshots to S3
log "Syncing snapshots..."
rclone copy "$LOCAL_SNAPSHOTS" \
    "$S3_REMOTE:$S3_BUCKET/snapshots/" \
    --transfers 4 \
    --bwlimit "$BANDWIDTH" \
    --log-file "$LOG" \
    --log-level INFO

# 4. Clean up local recordings older than MAX_LOCAL_DAYS
log "Cleaning local recordings older than $MAX_LOCAL_DAYS days..."
find "$LOCAL_RECORDINGS" -type f -mtime +$MAX_LOCAL_DAYS -delete 2>> "$LOG"
find "$LOCAL_RECORDINGS" -type d -empty -delete 2>> "$LOG"

# 5. Report
LOCAL_SIZE=$(du -sh "$LOCAL_RECORDINGS" 2>/dev/null | cut -f1)
S3_SIZE=$(rclone size "$S3_REMOTE:$S3_BUCKET" --json 2>/dev/null | python3 -c "import sys,json; d=json.load(sys.stdin); print(round(d['bytes']/1073741824, 1), 'GB')" 2>/dev/null || echo "unknown")

log "Local storage: $LOCAL_SIZE"
log "S3 storage: $S3_SIZE"
log "=== Sync completed ==="

Setting Up Shinobi with Native S3 Storage

If you prefer a VMS that writes directly to S3 without requiring rclone, Shinobi supports S3 as a native storage backend.

# docker-compose.yml for Shinobi with S3
version: "3.9"
services:
  shinobi:
    image: shinobisystems/shinobi:latest
    container_name: shinobi
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      - DB_TYPE=sqlite
    volumes:
      - ./config:/config
      - ./customAutoLoad:/home/Shinobi/libs/customAutoLoad

In the Shinobi web UI, navigate to Settings > Storage and configure:

{
    "type": "s3",
    "s3": {
        "endpoint": "https://s3.danubedata.ro",
        "region": "fsn1",
        "bucket": "cctv-recordings",
        "accessKeyId": "YOUR_ACCESS_KEY",
        "secretAccessKey": "YOUR_SECRET_KEY",
        "forcePathStyle": true
    }
}

With this configuration, Shinobi writes all recordings directly to S3 — no local storage or sync scripts needed.

Monitoring Your S3 Storage Usage

Keep track of your surveillance storage to avoid surprises on your bill:

#!/bin/bash
# surveillance-storage-report.sh
# Run weekly via cron for a storage usage report

S3_REMOTE="danubedata"
S3_BUCKET="cctv-recordings"

echo "=============================="
echo "Surveillance Storage Report"
echo "Date: $(date)"
echo "=============================="
echo ""

echo "Total S3 Usage:"
rclone size "$S3_REMOTE:$S3_BUCKET"
echo ""

echo "Usage by directory:"
for dir in recordings clips snapshots; do
    echo "  $dir:"
    rclone size "$S3_REMOTE:$S3_BUCKET/$dir" 2>/dev/null || echo "    (empty)"
done
echo ""

echo "Oldest recording:"
rclone lsl "$S3_REMOTE:$S3_BUCKET/recordings/" | sort -k2,3 | head -1
echo ""

echo "Newest recording:"
rclone lsl "$S3_REMOTE:$S3_BUCKET/recordings/" | sort -k2,3 | tail -1
echo ""

echo "Estimated monthly cost:"
BYTES=$(rclone size "$S3_REMOTE:$S3_BUCKET" --json 2>/dev/null | python3 -c "import sys,json; print(json.load(sys.stdin)['bytes'])" 2>/dev/null || echo 0)
TB=$(echo "scale=2; $BYTES / 1099511627776" | bc 2>/dev/null || echo "0")
echo "  Storage: ${TB} TB"
echo "  Base cost: 3.99 EUR (includes 1 TB)"

Troubleshooting Common Issues

Upload Too Slow

  • Check your internet upload speed with speedtest-cli
  • Reduce camera bitrate by switching to H.265
  • Enable motion-only recording to reduce data volume
  • Use rclone --transfers 16 to parallelize uploads

Files Being Skipped

  • Ensure --min-age 5m is set so rclone skips files still being written
  • Check that your NVR closes files properly (some NVRs keep files open for long periods)

Permission Denied Errors

  • Verify your S3 access key and secret key
  • Ensure the bucket exists and your key has write permissions
  • Check endpoint URL: https://s3.danubedata.ro

High S3 Costs

  • Review lifecycle policies — ensure old footage is being deleted automatically
  • Switch cameras to H.265 codec to halve storage requirements
  • Use motion-only recording instead of continuous
  • Consider uploading only event clips (not full recordings) to S3
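Two quick checks help diagnose runaway costs: confirm a lifecycle policy is actually attached, and see which prefix is consuming the space (bucket, remote, and profile names as used throughout this guide):

```shell
# 1. Is a lifecycle policy attached? (the call errors if none is set)
aws s3api get-bucket-lifecycle-configuration \
    --bucket cctv-recordings \
    --endpoint-url https://s3.danubedata.ro \
    --profile danubedata

# 2. Which prefix is eating the space?
for dir in recordings clips snapshots; do
    echo "== $dir =="
    rclone size "danubedata:cctv-recordings/$dir"
done
```

A missing lifecycle policy is the most common cause of a steadily growing bill: footage accumulates forever until a rule deletes it.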

Get Started with S3 Surveillance Storage

Protecting your security footage with offsite S3 storage is one of the smartest investments you can make. Whether you run a 4-camera home system or a 64-camera commercial installation, DanubeData S3 storage gives you reliable, affordable, GDPR-compliant offsite backup.

  1. Create a DanubeData account
  2. Create a storage bucket for your surveillance footage
  3. Generate S3 access keys
  4. Set up rclone or configure your NVR's native S3 support
  5. Configure lifecycle policies for automatic retention management

DanubeData S3 Storage for Surveillance:

  • €3.99/month includes 1TB storage + 1TB traffic
  • Additional storage just €3.99/TB/month
  • No egress fees for normal usage
  • GDPR compliant (German data center in Falkenstein)
  • Automatic lifecycle policies for retention management
  • S3-compatible API — works with any NVR or VMS software

Create Your Surveillance Storage Bucket Now

Need help setting up surveillance footage archival? Contact our team — we'll help you design the right storage architecture for your camera system.
