
Lightroom Cloud Workflow: How to Edit Anywhere with S3 Storage (2025)

Adrian Silaghi
January 17, 2026
16 min read
#photography #lightroom #lightroom classic #cloud storage #s3 #workflow #smart previews #photo editing #rclone #backup

Adobe's cloud storage is expensive and limited. At €11.89/month, you only get 1TB—barely enough for a year of professional shooting. But what if you could build your own cloud workflow with unlimited storage at a fraction of the cost?

This guide shows you how to create a hybrid Lightroom workflow that combines the speed of local editing with the safety and accessibility of S3 cloud storage.

The Problem with Adobe's Cloud Storage

Issue               Adobe Creative Cloud              S3-Based Workflow
Storage Limit       1TB max (Photography plan)        Unlimited
Cost per TB         €11.89/TB/month                   €3.99/TB/month
RAW File Support    Converts to DNG in Lightroom CC   Keep original RAW format
Vendor Lock-in      Adobe ecosystem only              Standard S3 API—works with anything
Data Ownership      Adobe's servers, Adobe's terms    Your data, your control
Backup Flexibility  Adobe's backup schedule only      Any backup tool/schedule

Understanding Lightroom's Architecture

Before building your workflow, understand how Lightroom stores data:

Lightroom Classic (Desktop)

  • Catalog file (.lrcat): Database containing all edits, keywords, collections, metadata
  • RAW files: Your original photos (stored wherever you choose)
  • Previews: Generated thumbnails for fast browsing (can be regenerated)
  • Smart Previews: Smaller DNG files for offline editing (can be regenerated)

Key Insight

Your edits are non-destructive and stored in the catalog, not the RAW files. This means:

  1. RAW files can live on slow/cloud storage
  2. Only the catalog needs to be fast/local
  3. Smart Previews enable editing without original files

Workflow 1: Smart Preview Cloud Editing

This workflow lets you edit from anywhere using Smart Previews, while keeping full RAW files safe in S3.

How It Works

  1. Import RAW files to local drive
  2. Generate Smart Previews in Lightroom
  3. Sync RAW files to S3 (background process)
  4. Edit using Smart Previews on laptop/travel
  5. RAW files available when you need full resolution

Setup

Step 1: Configure Your Folder Structure

# Recommended folder structure
/Pictures/
├── Lightroom/
│   ├── Lightroom Catalog.lrcat          # Keep local (fast SSD)
│   ├── Lightroom Catalog.lrcat-data/    # Keep local
│   └── Lightroom Catalog Previews.lrdata/  # Keep local (regenerable)
│
├── RAW/                                  # Synced to S3
│   └── 2025/
│       ├── 01-15-Johnson-Wedding/
│       ├── 01-22-Corporate-Headshots/
│       └── 02-01-Landscape-Trip/
│
└── Exports/                              # Synced to S3
    └── 2025/
        └── Johnson-Wedding-Delivery/

Step 2: Enable Smart Previews on Import

  1. Open Lightroom Classic
  2. Start an import (File → Import Photos and Video)
  3. In the File Handling panel of the Import dialog, check "Build Smart Previews"

Or manually build for existing photos:

  1. Select photos in Library
  2. Go to Library → Previews → Build Smart Previews

Step 3: Set Up S3 Sync for RAW Folder

#!/bin/bash
# smart-preview-workflow.sh
# Syncs RAW files to S3 while keeping Smart Previews local

RAW_FOLDER="$HOME/Pictures/RAW"
BUCKET="photography-archive"
REMOTE="danubedata"

echo "Syncing RAW files to S3..."
rclone sync "$RAW_FOLDER" "$REMOTE:$BUCKET/raw" \
    --progress \
    --transfers 8 \
    --exclude ".DS_Store" \
    --exclude "*.lrprev"

echo "Sync complete. Local Smart Previews remain for offline editing."

Step 4: Edit with Smart Previews

When traveling or on a laptop without your RAW files:

  1. Open your Lightroom catalog (synced via Dropbox/iCloud or on laptop)
  2. Lightroom shows "Photo is missing" but Smart Preview is available
  3. Click "Use Smart Preview" or just start editing
  4. All edits are saved to the catalog
  5. When back at your main workstation with RAW files, edits apply to full resolution
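The catalog hand-off in step 1 can also go through the same S3 bucket instead of Dropbox/iCloud. A minimal sketch, assuming the folder layout from Step 1 and the "danubedata" remote used throughout this guide (the catalog/ bucket path is hypothetical). Only one machine should have the catalog open at a time, since the last sync wins:

```shell
# sync-catalog.sh — move the catalog between workstation and laptop via S3.
# Hypothetical bucket path. Standard previews are excluded (regenerable);
# Smart Previews are kept so the laptop can edit offline.

CATALOG_DIR="$HOME/Pictures/Lightroom"
CATALOG_REMOTE="danubedata:photography-archive/catalog"

sync_catalog() {
    case "$1" in
        push)  # workstation → cloud, before you travel
            rclone sync "$CATALOG_DIR" "$CATALOG_REMOTE" \
                --exclude "Lightroom Catalog Previews.lrdata/**" \
                --progress
            ;;
        pull)  # cloud → laptop, before editing on the road
            rclone sync "$CATALOG_REMOTE" "$CATALOG_DIR" --progress
            ;;
        *)
            echo "Usage: sync_catalog push|pull"
            return 1
            ;;
    esac
}
```

Run `sync_catalog push` before leaving the workstation, then `sync_catalog pull` on the laptop; edits made against Smart Previews travel back the same way.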

Restoring RAW Files When Needed

# Download specific shoot for high-res export
rclone sync "danubedata:photography-archive/raw/2025/01-15-Johnson-Wedding" \
    "$HOME/Pictures/RAW/2025/01-15-Johnson-Wedding" \
    --progress

# Lightroom will automatically reconnect the files
# Then export at full resolution

Workflow 2: Network Drive with S3 Backend

For photographers with fast internet, mount S3 as a network drive and edit directly from the cloud.

Tools Required

  • macOS: Mountain Duck or rclone mount
  • Windows: Mountain Duck, Air Live Drive, or rclone mount
  • Linux: s3fs or rclone mount

Option A: Mountain Duck (GUI, Paid)

Mountain Duck ($39) creates a native drive from S3:

  1. Install Mountain Duck
  2. Click + and select the S3 (HTTPS) connection profile
  3. Configure:
    Server: s3.danubedata.com
    Port: 443
    Access Key ID: YOUR_KEY
    Secret Access Key: YOUR_SECRET
    Path: /your-bucket
    
  4. Click Connect
  5. Your S3 bucket appears as a drive letter/mount point

Important: Enable "Smart Synchronization" for best performance with large RAW files.

Option B: Rclone Mount (Free)

# macOS/Linux: Mount S3 as a folder
mkdir -p ~/S3-Photos
rclone mount danubedata:photography-archive ~/S3-Photos \
    --vfs-cache-mode full \
    --vfs-cache-max-size 50G \
    --vfs-read-ahead 128M \
    --buffer-size 64M \
    --dir-cache-time 24h \
    --allow-other &

# Windows: Mount as drive letter
rclone mount danubedata:photography-archive X: ^
    --vfs-cache-mode full ^
    --vfs-cache-max-size 50G

Using with Lightroom

  1. Import photos from the mounted S3 drive
  2. Keep catalog on local SSD (critical for performance)
  3. Lightroom reads RAW files from S3 as needed
  4. Local VFS cache speeds up repeated access

Performance Note: This works well with good internet (100+ Mbps). For slower connections, use the Smart Preview workflow instead.

Workflow 3: Automatic Post-Session Backup

For photographers who want the simplest setup: edit normally, backup automatically.

Setup Folder Watching

Use fswatch (macOS) or similar to detect new photos and trigger backup:

#!/bin/bash
# auto-backup-watcher.sh
# Watches for new photos and backs them up automatically

WATCH_DIR="$HOME/Pictures/RAW"
BUCKET="photography-archive"
LOG="$HOME/Library/Logs/photo-backup.log"

# Install fswatch if needed: brew install fswatch

fswatch -0 -r "$WATCH_DIR" | while IFS= read -r -d '' event; do
    # Debounce: wait for copying to complete
    sleep 5

    # Get folder of changed file
    folder=$(dirname "$event")
    folder_name=$(echo "$folder" | sed "s|$WATCH_DIR/||")

    echo "[$(date)] Change detected in: $folder_name" >> "$LOG"

    # Sync the changed folder
    rclone sync "$folder" "danubedata:$BUCKET/raw/$folder_name" \
        --exclude ".DS_Store" \
        >> "$LOG" 2>&1

    echo "[$(date)] Backup complete: $folder_name" >> "$LOG"

    # Notification
    osascript -e "display notification \"$folder_name backed up to cloud\" with title \"Photo Backup\""
done

Make it Run at Startup

# Create Launch Agent
cat > ~/Library/LaunchAgents/com.photo.watcher.plist << 'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.photo.watcher</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>/Users/YOUR_USER/auto-backup-watcher.sh</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
EOF

# Load it
launchctl load ~/Library/LaunchAgents/com.photo.watcher.plist

Workflow 4: Multi-Editor Collaboration

For teams with multiple editors (wedding photographers, studios), S3 enables collaboration.

Architecture

         ┌─────────────────┐
         │   S3 Storage    │
         │  (RAW + Exports)│
         └────────┬────────┘
                  │
     ┌────────────┼────────────┐
     │            │            │
     ▼            ▼            ▼
┌─────────┐  ┌─────────┐  ┌─────────┐
│Editor 1 │  │Editor 2 │  │Editor 3 │
│(Culling)│  │(Color)  │  │(Retouch)│
└─────────┘  └─────────┘  └─────────┘
     │            │            │
     ▼            ▼            ▼
  Local LR     Local LR     Local LR
  Catalog      Catalog      Catalog

Setup

1. Shared S3 Bucket Structure

/studio-photos/
├── raw/                    # Original RAW files (read for all)
│   └── 2025/
│       └── johnson-wedding/
│
├── working/                # Editors download sections here
│   ├── editor1/
│   ├── editor2/
│   └── editor3/
│
├── xmp/                    # Shared edit settings
│   └── johnson-wedding/
│       ├── curated.xmp     # After culling
│       └── color-graded.xmp # After color
│
└── exports/                # Final deliverables
    └── johnson-wedding/

2. Editor Download Scripts

#!/bin/bash
# editor-download.sh - Download shoot for editing

SHOOT="$1"  # e.g., "johnson-wedding"
EDITOR="$2" # e.g., "editor1"
YEAR=$(date +%Y)

if [ -z "$SHOOT" ] || [ -z "$EDITOR" ]; then
    echo "Usage: ./editor-download.sh shoot-name editor-name"
    exit 1
fi

echo "Downloading $SHOOT for $EDITOR..."

# Download RAW files
rclone sync "danubedata:studio-photos/raw/$YEAR/$SHOOT" \
    "$HOME/Pictures/Working/$SHOOT/RAW" \
    --progress

# Download any existing XMP sidecars
rclone sync "danubedata:studio-photos/xmp/$SHOOT" \
    "$HOME/Pictures/Working/$SHOOT/XMP" \
    --progress 2>/dev/null || echo "No existing XMP files"

echo "Ready to import into Lightroom!"
echo "Folder: $HOME/Pictures/Working/$SHOOT"

3. Export XMP Sidecars for Sharing

After editing in Lightroom:

  1. Select edited photos
  2. Metadata → Save Metadata to Files (Ctrl/Cmd + S)
  3. This creates .xmp sidecar files with your edits

#!/bin/bash
# share-edits.sh - Upload XMP sidecars for other editors

SHOOT="$1"
STAGE="$2"  # e.g., "culled", "color-graded", "final"

# Upload XMP files
# Sidecars sit next to the RAW files downloaded by editor-download.sh
rclone sync "$HOME/Pictures/Working/$SHOOT/RAW" \
    "danubedata:studio-photos/xmp/$SHOOT/$STAGE" \
    --include "*.xmp" \
    --progress

echo "Edits shared! Other editors can download the $STAGE XMP files."

4. Import Another Editor's Work

#!/bin/bash
# import-edits.sh - Download another editor's XMP sidecars

SHOOT="$1"
STAGE="$2"

# Download XMP files
rclone sync "danubedata:studio-photos/xmp/$SHOOT/$STAGE" \
    "$HOME/Pictures/Working/$SHOOT/RAW" \
    --include "*.xmp" \
    --progress

echo "XMP files downloaded. In Lightroom:"
echo "1. Select the photos"
echo "2. Metadata → Read Metadata from Files"
echo "Edits will be applied!"

Backing Up Your Lightroom Catalog

Your catalog contains all your work. Back it up properly.

Automated Catalog Backup Script

#!/bin/bash
# backup-lightroom-catalog.sh

CATALOG_DIR="$HOME/Pictures/Lightroom"
CATALOG_NAME="Lightroom Catalog"
BUCKET="photography-backup"

# Ensure Lightroom is closed
if pgrep -x "Adobe Lightroom Classic" > /dev/null; then
    echo "ERROR: Please close Lightroom before backing up the catalog."
    osascript -e 'display alert "Lightroom Backup" message "Please close Lightroom first!"'
    exit 1
fi

echo "Backing up Lightroom catalog..."

# Backup catalog file (the important one!)
rclone copy "$CATALOG_DIR/$CATALOG_NAME.lrcat" \
    "danubedata:$BUCKET/lightroom-catalog/$(date +%Y-%m-%d)" \
    --progress

# Backup catalog data folder (Lightroom 11+)
if [ -d "$CATALOG_DIR/$CATALOG_NAME.lrcat-data" ]; then
    rclone sync "$CATALOG_DIR/$CATALOG_NAME.lrcat-data" \
        "danubedata:$BUCKET/lightroom-catalog/$(date +%Y-%m-%d)/lrcat-data" \
        --progress
fi

# Keep last 30 days of catalog backups
rclone delete "danubedata:$BUCKET/lightroom-catalog" \
    --min-age 30d

echo "Catalog backup complete!"

# Notification
osascript -e 'display notification "Lightroom catalog backed up to cloud" with title "Backup Complete"'

Schedule Weekly Catalog Backups

# Run every Sunday at 2 AM
cat > ~/Library/LaunchAgents/com.lightroom.catalog.backup.plist << 'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.lightroom.catalog.backup</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>/Users/YOUR_USER/backup-lightroom-catalog.sh</string>
    </array>
    <key>StartCalendarInterval</key>
    <dict>
        <key>Weekday</key>
        <integer>0</integer>
        <key>Hour</key>
        <integer>2</integer>
    </dict>
</dict>
</plist>
EOF

launchctl load ~/Library/LaunchAgents/com.lightroom.catalog.backup.plist

Syncing Presets and Profiles

Your presets and profiles are valuable. Keep them safe and synced across machines.

#!/bin/bash
# sync-presets.sh - Backup and sync Lightroom presets

BUCKET="photography-backup"

# macOS preset locations
DEVELOP_PRESETS="$HOME/Library/Application Support/Adobe/CameraRaw/Settings"
LR_PRESETS="$HOME/Library/Application Support/Adobe/Lightroom/Develop Presets"
PROFILES="$HOME/Library/Application Support/Adobe/CameraRaw/CameraProfiles"

echo "Syncing Lightroom presets..."

# Backup presets
[ -d "$DEVELOP_PRESETS" ] && rclone sync "$DEVELOP_PRESETS" "danubedata:$BUCKET/presets/camera-raw-settings"
[ -d "$LR_PRESETS" ] && rclone sync "$LR_PRESETS" "danubedata:$BUCKET/presets/develop-presets"
[ -d "$PROFILES" ] && rclone sync "$PROFILES" "danubedata:$BUCKET/presets/camera-profiles"

echo "Presets synced!"

Restore Presets to New Machine

# Download presets to new machine
rclone sync "danubedata:photography-backup/presets/camera-raw-settings" \
    "$HOME/Library/Application Support/Adobe/CameraRaw/Settings"

rclone sync "danubedata:photography-backup/presets/develop-presets" \
    "$HOME/Library/Application Support/Adobe/Lightroom/Develop Presets"

# Restart Lightroom to load presets

Performance Optimization

Lightroom Settings for Cloud Workflows

  1. Generate Smart Previews: Always (for offline editing)
  2. Standard Preview Size: Match your monitor resolution
  3. Preview Quality: Medium (saves space)
  4. Automatically discard 1:1 Previews: After 30 days

Rclone Performance Tuning

# For large RAW file transfers
rclone sync /source /dest \
    --transfers 16 \
    --checkers 32 \
    --buffer-size 64M \
    --s3-chunk-size 64M \
    --fast-list \
    --progress
# --transfers 16:      parallel file transfers
# --checkers 32:       parallel checksum/listing workers
# --buffer-size 64M:   in-memory buffer per transfer
# --s3-chunk-size 64M: multipart upload chunk size (S3 backend flag)
# --fast-list:         fewer list API calls on large buckets

Cost Analysis: Adobe vs. S3 Workflow

Let's compare costs for a professional photographer with 10TB of photos:

Scenario        Adobe Photography Plan          S3 Workflow
Base Software   €11.89/mo (includes LR + PS)    €11.89/mo (same plan)
10TB Storage    Not available (1TB max)         €39.90/mo
Total Monthly   €11.89 (limited to 1TB)         €51.79 (10TB)
Per TB Cost     €11.89/TB                       €3.99/TB

Key insight: Adobe's cloud maxes out at 1TB. For professional photographers who need 5-50TB, S3 storage is the only scalable option.
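The totals in the table are simple arithmetic: the software plan plus €3.99 per TB of storage. A small helper projects other archive sizes (prices as quoted in this article; adjust if they change):

```shell
# Monthly cost in EUR for an S3-backed archive of a given size:
# €11.89 software (Photography Plan) + €3.99 per TB of S3 storage.
s3_monthly_cost() {
    awk -v tb="$1" 'BEGIN { printf "%.2f\n", 11.89 + 3.99 * tb }'
}

s3_monthly_cost 10   # 10 TB → 51.79
s3_monthly_cost 50   # 50 TB → 211.39
```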

Disaster Recovery

Your workflow is only as good as your ability to recover.

Complete Recovery Procedure

#!/bin/bash
# disaster-recovery.sh - Restore everything from S3

NEW_DRIVE="/Volumes/NewDrive"

echo "Starting disaster recovery..."

# 1. Restore Lightroom catalog (most recent)
LATEST_CATALOG=$(rclone lsf "danubedata:photography-backup/lightroom-catalog" | sort -r | head -1)
rclone sync "danubedata:photography-backup/lightroom-catalog/$LATEST_CATALOG" \
    "$NEW_DRIVE/Lightroom" \
    --progress

# 2. Restore presets (paths must match the backup layout from sync-presets.sh)
rclone copy "danubedata:photography-backup/presets/camera-raw-settings" \
    "$HOME/Library/Application Support/Adobe/CameraRaw/Settings" \
    --progress
rclone copy "danubedata:photography-backup/presets/develop-presets" \
    "$HOME/Library/Application Support/Adobe/Lightroom/Develop Presets" \
    --progress

# 3. Restore RAW files (this will take a while)
rclone sync "danubedata:photography-archive/raw" \
    "$NEW_DRIVE/RAW" \
    --progress \
    --transfers 16

echo "Recovery complete!"
echo "1. Open Lightroom"
echo "2. File → Open Catalog → $NEW_DRIVE/Lightroom/Lightroom Catalog.lrcat"
echo "3. Update folder locations if needed"

Get Started

Ready to build a professional Lightroom workflow with unlimited cloud storage?

  1. Create a DanubeData account
  2. Create a storage bucket for your photography archive
  3. Install rclone and configure your connection
  4. Choose your workflow (Smart Previews, Network Drive, or Auto-Backup)
  5. Start with your most recent shoots and work backwards
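Step 3 can also be scripted instead of walking through the interactive `rclone config` wizard. A sketch, assuming the s3.danubedata.com endpoint shown in the Mountain Duck section; the key values are placeholders for your own credentials:

```shell
# Create the "danubedata" remote non-interactively (credentials are placeholders)
rclone config create danubedata s3 \
    provider=Other \
    access_key_id=YOUR_KEY \
    secret_access_key=YOUR_SECRET \
    endpoint=s3.danubedata.com

# Create the archive bucket and verify the connection
rclone mkdir danubedata:photography-archive
rclone lsd danubedata:
```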

DanubeData S3 Storage:

  • €3.99/month includes 1TB storage + 1TB traffic
  • Additional storage €3.99/TB/month
  • No egress fees for normal usage
  • GDPR compliant (German data centers)
  • 99.9% uptime SLA

👉 Create Your Photography Storage Bucket

Need help setting up your Lightroom cloud workflow? Contact our team—we've helped dozens of photographers build bulletproof workflows.

