
DigitalOcean Spaces Alternatives: European S3-Compatible Storage (2026)

Adrian Silaghi
April 20, 2026
14 min read
#digitalocean #digitalocean-spaces #spaces-alternatives #object-storage #s3 #europe #gdpr

DigitalOcean Spaces has been the default "good enough" S3-compatible storage for a lot of indie developers and small teams since its 2017 launch. $5/month, 250 GB included, 1 TB egress, built-in CDN, and the same SDK you already use for AWS S3. For a blog with a few hundred uploads or a side project serving static assets, it is hard to argue with.

But it is 2026, and the "good enough" calculus has shifted. The $5 base is now $5 for only 250 GB — and the moment your Postgres backups, user-uploaded images, or video assets cross that line, you pay $0.02/GB/month for storage and $0.01/GB for egress on top. The bundled CDN is a convenience tax: many workloads do not need one, and when you do, Bunny.net in Europe is usually faster and cheaper anyway. And for European companies, the parent-company-in-New-York CLOUD Act exposure is a persistent "maybe we should revisit that" on every security review.

This post benchmarks DigitalOcean Spaces against seven alternatives — five of them European — on real-world workloads: 1 TB of backups, 5 TB of user uploads, and 20 TB of media assets. We walk through an rclone-based migration playbook that takes most small projects from "Spaces bucket" to "new endpoint" in under an afternoon, and we flag the two SDK gotchas that will bite you if you are not expecting them.

What DigitalOcean Spaces Actually Gives You

Before we talk alternatives, let us be fair about what Spaces gets right. It is a deliberately simple product, and that is its main feature.

  • $5/month base fee. Includes 250 GB of storage and 1 TB of outbound transfer.
  • Overage pricing. $0.02/GB/month for storage above 250 GB, $0.01/GB for egress above the included TB.
  • S3-compatible API. Works with boto3, aws-sdk-js, Minio client, rclone, s3cmd, and basically anything that speaks S3.
  • Multiple regions. NYC3, SFO2/SFO3, AMS3 (Amsterdam), FRA1 (Frankfurt), BLR1, SYD1.
  • Built-in CDN. Powered by Cloudflare, one-click enable per bucket, custom subdomain support.
  • Spaces Access Keys. Per-account credentials, no fine-grained IAM (this will matter later).
  • Built-in static-site hosting. Plus a simple CORS editor in the control panel.

If you are running one bucket with 100 GB of assets and your users never scroll past the hero image, Spaces is fine. The problems show up when your storage grows, your egress grows, or your compliance posture matures.

Where DigitalOcean Spaces Gets Expensive (and Uncomfortable)

The 250 GB wall

This is the one that surprises people. The $5 base sounds great until you realize that 250 GB is nothing in 2026. A single WordPress site with a few hundred uploads can eat 50 GB. A Postgres cluster with point-in-time recovery snapshots stored in object storage can easily cross 500 GB in a year. A video platform with user-uploaded content hits 1 TB in weeks.

Once you cross 250 GB, you are paying $0.02/GB/month on the overage (figures below use 1 TB = 1,024 GB). That is:

  • 1 TB total → $5 base + $15.48 overage (774 GB over) = $20.48/month for storage alone
  • 5 TB total → $5 base + $97.40 overage = $102.40/month
  • 20 TB total → $5 base + $404.60 overage = $409.60/month

And that is before egress. If your app is serving user uploads (images, video) and traffic grows, you cross the included 1 TB of egress quickly, and you are paying roughly $10 for each extra TB on top.
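
The overage arithmetic is easy to sanity-check in a few lines of Python. This is a sketch using only the rates quoted in this post ($5 base, 250 GB storage and 1 TB egress included, $0.02/GB and $0.01/GB overages, 1 TB = 1,024 GB); verify current rates against DigitalOcean's pricing page before relying on it.

```python
def spaces_monthly_cost(storage_gb: float, egress_gb: float) -> float:
    """Estimate a DigitalOcean Spaces bill from the rates quoted above."""
    base = 5.00                 # $5/month subscription
    included_storage = 250      # GB
    included_egress = 1024      # GB (1 TB)
    storage_overage = 0.02      # $/GB/month above the included storage
    egress_overage = 0.01       # $/GB above the included egress

    cost = base
    cost += max(0.0, storage_gb - included_storage) * storage_overage
    cost += max(0.0, egress_gb - included_egress) * egress_overage
    return round(cost, 2)

print(spaces_monthly_cost(1024, 1024))            # 1 TB stored → 20.48
print(spaces_monthly_cost(5 * 1024, 2 * 1024))    # 5 TB / 2 TB → 112.64
print(spaces_monthly_cost(20 * 1024, 5 * 1024))   # 20 TB / 5 TB → 450.56
```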

The CDN bundle you may not want

The built-in CDN is marketed as a feature, but for a lot of workloads it is the wrong tool:

  • Backups do not need a CDN. You are writing sequentially from one server and reading sequentially on restore.
  • Internal analytics pipelines (Parquet files, NDJSON dumps) do not need a CDN — queries come from your own compute.
  • Image pipelines often want a dedicated image CDN (Bunny Optimizer, imgix, Cloudinary) with on-the-fly resizing — a pass-through CDN is not enough.

You can disable the CDN, but it does not lower your base bill. With plain S3-compatible storage plus a dedicated CDN like Bunny.net (Europe-based, ~€0.01/GB), you usually end up paying less and getting better edge performance.

No fine-grained IAM

Spaces credentials are account-level. There is no AWS-style IAM policy attached to a specific bucket or prefix. If your app has two customers and you want per-customer bucket scoping, you either have to trust your application-level code or use multiple DigitalOcean teams. For production SaaS with multi-tenant workloads, this becomes a real limitation.
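
In practice that means every read and write path in your app has to enforce tenant boundaries itself. A minimal sketch of the kind of guard code this forces you to write — the prefix convention and helper names are illustrative, not from any SDK:

```python
TENANT_PREFIX = "tenants"

def tenant_object_key(tenant_id: str, filename: str) -> str:
    """Build an object key confined to a per-tenant prefix.

    With account-wide credentials, checks like these are the only thing
    standing between tenant A and tenant B's objects.
    """
    if not tenant_id.isalnum():
        raise ValueError(f"invalid tenant id: {tenant_id!r}")
    # Reject absolute paths and traversal attempts in user-supplied names
    if filename.startswith("/") or ".." in filename.split("/"):
        raise ValueError(f"unsafe filename: {filename!r}")
    return f"{TENANT_PREFIX}/{tenant_id}/{filename}"

def assert_tenant_owns(tenant_id: str, key: str) -> None:
    """Refuse to serve a stored key that lies outside the tenant's prefix."""
    if not key.startswith(f"{TENANT_PREFIX}/{tenant_id}/"):
        raise PermissionError(f"{key!r} is outside tenant {tenant_id!r}")
```

With bucket- or prefix-scoped IAM policies, the storage layer would reject cross-tenant access even if this code had a bug; with account-level keys, it will not.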

CLOUD Act and data residency

DigitalOcean is a Delaware corporation headquartered in New York. Under the US CLOUD Act (2018), US-based cloud providers can be compelled to produce customer data regardless of where that data is physically stored. Hosting your bucket in the AMS3 region does not change that — the subpoena goes to the parent company, not the data center.

For a lot of teams this is a theoretical concern. For European SaaS companies with finance, healthcare, or government customers, it is a contractual dealbreaker: their DPO will not sign off on a US-parent processor for primary storage.

The European S3 Landscape (2026)

Here is the headline comparison. All prices in USD or EUR as published; we have normalized egress pricing to dollars per GB for readability.

| Provider | Base price | Storage included | Egress included | Storage overage | Egress overage | EU-only data |
|---|---|---|---|---|---|---|
| DigitalOcean Spaces | $5/mo | 250 GB | 1 TB | $0.02/GB | $0.01/GB | EU region, US parent |
| DanubeData Object Storage | €3.99/mo | 1 TB | 1 TB | €0.01/GB | €0.01/GB | Yes (DE) |
| Hetzner Object Storage | €5.99/mo | 1 TB | 1 TB | €5.99/TB | €1/TB | Yes (DE/FI) |
| Scaleway Object Storage | Free tier | 75 GB | 75 GB | €0.0146/GB | €0.01/GB | Yes (FR/NL/PL) |
| OVHcloud Object Storage | Pay-as-you-go | None | None | €0.0075/GB | €0.011/GB | Yes (FR/DE/UK) |
| Exoscale SOS | Pay-as-you-go | None | None | €0.02/GB | €0.02/GB | Yes (CH/DE/AT/BG) |
| Cloudflare R2 | Pay-as-you-go | 10 GB free | Unlimited (free) | $0.015/GB | $0/GB | Global, US parent |
| Backblaze B2 | Pay-as-you-go | 10 GB free | 3x storage (free) | $0.006/GB | $0.01/GB | EU region, US parent |

A few callouts that are not obvious from the table:

  • DanubeData is the only one where the base plan includes a full 1 TB of both storage and egress for under €5. Spaces has you at 250 GB for $5; DanubeData has you at 1,024 GB for €3.99.
  • Hetzner is close on price, but overages are billed in whole-TB increments (€5.99/TB), which can be unfriendly for small projects that bounce just over 1 TB.
  • Cloudflare R2 has the killer egress story — zero egress fees — but you are back to the US-parent issue and R2 does not have EU-specific regions, just global edge.
  • Backblaze B2 has the cheapest raw storage ($0.006/GB) but adds up at scale when you need redundancy, and EU region is available but behind a US parent.
  • OVH and Exoscale are enterprise-friendly but require pay-as-you-go setup — no simple "pick a plan" UX.

Real Cost at 1 TB, 5 TB, and 20 TB

Base pricing tables are abstract. Let us do the real math for three common workloads, using 1 TB = 1,024 GB throughout to match how these providers meter. Each workload pairs a storage total with a realistic monthly egress figure.

Workload A: 1 TB total storage, 1 TB monthly egress

A mid-sized WordPress site or small SaaS with user uploads and a healthy audience.

| Provider | Storage cost | Egress cost | Monthly total |
|---|---|---|---|
| DigitalOcean Spaces | $5 + $15.48 (774 GB over) | $0 (within 1 TB) | $20.48 |
| DanubeData | €3.99 (within 1 TB) | €0 (within 1 TB) | €3.99 (~$4.30) |
| Hetzner | €5.99 (within 1 TB) | €0 (within 1 TB) | €5.99 (~$6.45) |
| Scaleway | €13.86 (949 GB billable) | €9.49 (949 GB billable) | €23.35 (~$25.15) |
| OVH | €7.68 | €11.26 | €18.94 (~$20.40) |
| Cloudflare R2 | $15.36 | $0 | $15.36 |

Workload B: 5 TB total storage, 2 TB monthly egress

A media startup or a SaaS serving PDFs, exports, and document storage to paying customers.

| Provider | Storage cost | Egress cost | Monthly total |
|---|---|---|---|
| DigitalOcean Spaces | $5 + $97.40 (4,870 GB over) | $10.24 (1 TB over) | $112.64 |
| DanubeData | €3.99 + €40.96 (4 TB over) | €10.24 (1 TB over) | €55.19 (~$59.50) |
| Hetzner | €5.99 + €23.96 (4 TB extra) | €1 (1 TB over) | €30.95 (~$33.35) |
| Scaleway | €73.66 (5,045 GB billable) | €19.73 | €93.39 (~$100.60) |
| OVH | €38.40 | €22.53 | €60.93 (~$65.60) |
| Cloudflare R2 | $76.80 | $0 | $76.80 |

Workload C: 20 TB total storage, 5 TB monthly egress

A growing video platform, a SaaS with heavy data archival, or a company running long-retention backup archives.

| Provider | Storage cost | Egress cost | Monthly total |
|---|---|---|---|
| DigitalOcean Spaces | $5 + $404.60 (20,230 GB over) | $40.96 (4 TB over) | $450.56 |
| DanubeData | €3.99 + €194.56 (19 TB over) | €40.96 (4 TB over) | €239.51 (~$258) |
| Hetzner | €5.99 + €113.81 (19 TB extra) | €4 (4 TB over) | €123.80 (~$133) |
| Scaleway | €297.91 (20,405 GB billable) | €50.45 (5,045 GB billable) | €348.36 (~$375) |
| OVH | €153.60 | €56.32 | €209.92 (~$226) |
| Cloudflare R2 | $307.20 | $0 | $307.20 |

Takeaways:

  • For small workloads under 1 TB, DanubeData is the clearest value — €3.99 flat vs. $20+ on Spaces.
  • For mid-size workloads (1–10 TB), Hetzner and DanubeData trade places depending on your egress-to-storage ratio; both roughly halve the Spaces bill.
  • For very large storage with low egress, Hetzner and Backblaze B2 are the cheapest pure-storage options.
  • For very high egress relative to storage (think video streaming, software distribution), Cloudflare R2 is hard to beat — but you trade EU residency and US-parent-company status for it.

When Each Alternative Makes Sense

DanubeData (€3.99/mo)

The best Spaces replacement if you are already running app servers or managed databases on DanubeData, if you care about a predictable flat-rate base, and if you want a German data center with no US parent company. The 1 TB included traffic and storage is roughly 4× what Spaces gives you at a lower base price. Endpoint: s3.danubedata.ro.

Hetzner Object Storage (€5.99/mo)

Great for cold storage and archival. If you are already a Hetzner customer and your workload is storage-heavy with relatively low egress, this is the cheapest EU-native option at scale. The TB-increment billing is the main friction for small projects.

Scaleway Object Storage

Good fit for French-regulated workloads (SecNumCloud-adjacent needs), or if you are already on Scaleway compute. Free tier (75 GB) is useful for small staging buckets. Pricing is reasonable but not competitive at scale compared to Hetzner/DanubeData.

OVHcloud Object Storage

Pay-as-you-go model suits unpredictable workloads. Strong French sovereignty story. UI is more complex than DanubeData or Hetzner, and pricing is granular — good for finance teams that want to forecast costs line-by-line.

Exoscale SOS

Swiss-based alternative if you need Swiss-only data residency (finance, insurance, cross-border privacy concerns). Small, stable team, strong SLAs, but prices are higher than the German/French options.

Cloudflare R2

Unbeatable for high-egress workloads because egress is free. Zero regional data-residency guarantees (your data lives globally across Cloudflare edge), and the parent company is US-based. Best for: software distribution, video CDNs, assets served to a global audience.

Backblaze B2

Cheapest raw-storage price point ($0.006/GB), and the Bandwidth Alliance with Cloudflare gives you free egress if you front with Cloudflare. EU region (Amsterdam) exists but parent is US. Good for backup-only workloads where egress rarely happens.

Self-hosted MinIO (on a DanubeData VPS)

If you need full control, versioning policies you define yourself, or your compliance audit requires isolated single-tenant storage, running MinIO on a €4.49 DanubeData VPS with NVMe storage costs less than $6/month for ~80 GB and scales up with bigger plans. You are responsible for erasure coding, backups, and uptime, but the API surface is a drop-in for S3.
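
A single-node MinIO is a short Docker Compose file away. The sketch below is a starting point only — credentials, ports, and the volume path are placeholders you must change, and single-node mode gives you no erasure coding, so back the data directory up separately:

```yaml
services:
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: CHANGE_ME_ACCESS_KEY
      MINIO_ROOT_PASSWORD: CHANGE_ME_LONG_RANDOM_SECRET
    ports:
      - "9000:9000"   # S3 API
      - "9001:9001"   # web console
    volumes:
      - /srv/minio/data:/data
    restart: unless-stopped
```

Point any S3 SDK at http://your-vps:9000 with those credentials, and put a TLS-terminating reverse proxy in front before exposing it publicly.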

Migration Playbook: DigitalOcean Spaces → DanubeData (or any S3)

The good news: because Spaces is S3-compatible and so is every alternative in this list, migrations are mostly about copying bytes and swapping endpoints. Here is the playbook we walk customers through — it applies almost verbatim to any destination.

Step 1: Inventory your Spaces usage

# List all your Spaces buckets
s3cmd ls --access_key=YOUR_KEY --secret_key=YOUR_SECRET --host=nyc3.digitaloceanspaces.com --host-bucket='%(bucket)s.nyc3.digitaloceanspaces.com'

# Get total size per bucket
s3cmd du s3://your-bucket --access_key=YOUR_KEY --secret_key=YOUR_SECRET --host=nyc3.digitaloceanspaces.com --host-bucket='%(bucket)s.nyc3.digitaloceanspaces.com'

You want to know: total object count, total bytes, largest files, and whether any buckets use the built-in CDN (you will need to replace that).

Step 2: Create the destination bucket on DanubeData

  1. Sign up at danubedata.ro (€50 signup credit).
  2. Create a Storage Bucket in your dashboard — naming is scoped to your team: dd-{team_id}-{name}.
  3. Create an access key pair for this bucket.
  4. Note your endpoint: https://s3.danubedata.ro.

Step 3: Configure rclone for both endpoints

rclone is the tool of choice for S3 bucket migrations — it handles resumable multipart uploads, parallel transfers, and integrity checking.

# ~/.config/rclone/rclone.conf

[spaces]
type = s3
provider = DigitalOcean
access_key_id = YOUR_DO_ACCESS_KEY
secret_access_key = YOUR_DO_SECRET_KEY
endpoint = nyc3.digitaloceanspaces.com
acl = private

[danube]
type = s3
provider = Other
access_key_id = YOUR_DANUBE_ACCESS_KEY
secret_access_key = YOUR_DANUBE_SECRET_KEY
endpoint = https://s3.danubedata.ro
acl = private
region = fsn1

Step 4: Do a dry-run sync

# Verify connection to both sides first
rclone lsd spaces:
rclone lsd danube:

# Dry run (no data movement yet)
rclone sync spaces:your-bucket danube:dd-YOUR-TEAM-your-bucket \
  --progress \
  --dry-run \
  --checksum

Inspect the output. If rclone shows the full object list and a reasonable estimate, you are good.

Step 5: Run the real sync

# Real sync with parallelism tuned for network
rclone sync spaces:your-bucket danube:dd-YOUR-TEAM-your-bucket \
  --progress \
  --checksum \
  --transfers 16 \
  --checkers 32 \
  --fast-list \
  --log-file=/var/log/spaces-migration.log \
  --log-level INFO

Expect throughput of 50–200 MB/s depending on your source and destination networks. A 1 TB bucket typically completes in 2–6 hours.

Step 6: Verify object parity

# Count objects on both sides
rclone size spaces:your-bucket
rclone size danube:dd-YOUR-TEAM-your-bucket

# If numbers match, do a checksum verification
rclone check spaces:your-bucket danube:dd-YOUR-TEAM-your-bucket --one-way

Step 7: Swap your SDK endpoints

This is the tricky part. Here is what changes in your code:

Before (DigitalOcean Spaces in Python boto3):

import boto3

s3 = boto3.client(
    's3',
    endpoint_url='https://nyc3.digitaloceanspaces.com',
    aws_access_key_id='YOUR_KEY',
    aws_secret_access_key='YOUR_SECRET',
    region_name='nyc3',
)

After (DanubeData):

import boto3

s3 = boto3.client(
    's3',
    endpoint_url='https://s3.danubedata.ro',
    aws_access_key_id='YOUR_DANUBE_KEY',
    aws_secret_access_key='YOUR_DANUBE_SECRET',
    region_name='fsn1',
)

Same pattern for Node, Go, PHP — three things change: endpoint URL, credentials, region name.

Step 8: Redirect CDN and DNS (if applicable)

If you were using the built-in Spaces CDN at something like assets.yourdomain.com:

  1. Set up a CDN in front of DanubeData. Our go-to recommendation is Bunny.net (Europe-based, ~€0.01/GB egress) because it has EU-specific PoPs and does not carry the US-parent-company baggage. Cloudflare is also fine.
  2. In Bunny.net, create a pull-zone pointing to https://s3.danubedata.ro/dd-YOUR-TEAM-your-bucket.
  3. Update your CNAME: assets.yourdomain.com → Bunny.net edge hostname.
  4. TTL-wait. Watch your CDN hit rate stabilize over 24 hours.

Step 9: Incremental sync during cutover

For a live application, you usually cannot pause writes during the migration. Run an incremental sync right before cutover:

# Final incremental sync — only transfers objects that changed
rclone sync spaces:your-bucket danube:dd-YOUR-TEAM-your-bucket \
  --progress \
  --checksum \
  --transfers 16 \
  --update

Then switch your application endpoint. Keep Spaces as a read-only fallback for a week, then delete.

Step 10: Presigned URLs — the gotcha

Presigned URLs generated against the old Spaces endpoint will not work on DanubeData. Same for any saved URLs stored in your database (e.g., in an images table with public_url columns pointing at xxx.nyc3.digitaloceanspaces.com).

Before cutover, run a SQL migration to rewrite stored URLs:

-- Example for a Postgres table
UPDATE uploads
SET public_url = REPLACE(
  public_url,
  'https://yourbucket.nyc3.digitaloceanspaces.com/',
  'https://s3.danubedata.ro/dd-YOUR-TEAM-your-bucket/'
)
WHERE public_url LIKE '%nyc3.digitaloceanspaces.com%';

Presigned URLs must be regenerated against the new endpoint going forward. If your app is generating them on-demand for each request, changing the endpoint_url in your SDK is enough. If you cache presigned URLs, bust the cache.

Use Case Notes

Apps with user uploads

This is where Spaces alternatives shine. 1 TB of user-uploaded images/PDFs/documents on DanubeData is €3.99/month; on Spaces it is north of $20/month. Plus you get real per-bucket access keys (via DanubeData access key scoping) that you can rotate without affecting other buckets.

Static assets and front-ends

If you are doing JAMstack or static-site hosting, point Bunny.net or Cloudflare at your DanubeData bucket and you have a production-grade CDN that beats most bundled options on latency within Europe. For really static sites, consider DanubeData serverless containers or static site hosting instead of object storage.

Backups

Object storage is the "correct" destination for Postgres WAL archive, Redis RDB snapshots, Velero Kubernetes backups, and restic repositories. For backups you rarely egress, so DanubeData or Hetzner are both extremely cheap. A 500 GB Postgres backup set with daily incrementals costs €3.99/month on DanubeData vs. $10+ on Spaces.

Media hosting (images, video)

Bandwidth matters a lot here. If most of your egress is European, Bunny.net + DanubeData is a great combo. If it is global and egress-heavy, Cloudflare R2 is the egress-cost winner — but you lose EU-only residency.

DanubeData vs. Spaces — The TL;DR Pitch

  • 4x the storage for a lower base. €3.99/month for 1 TB storage + 1 TB egress vs. $5/month for 250 GB + 1 TB egress.
  • No US parent. Hetzner dedicated servers in Falkenstein, Germany. GDPR-first, no CLOUD Act exposure.
  • S3-compatible. s3.danubedata.ro endpoint works with every S3 SDK you already use.
  • Bundled with the rest of your stack. If you also need VPS, Postgres, Redis, or serverless — all on the same dashboard, same billing, €50 signup credit covers ~12 months of a base bucket.
  • Real multi-tenancy. Bucket isolation per team, per-team access keys, proper resource quotas.

FAQ

What replaces the built-in CDN from DigitalOcean Spaces?

Bunny.net is the top pick for EU-hosted buckets — ~€0.01/GB, dozens of EU PoPs, no parent-company baggage, simple pull-zone setup. Cloudflare is also fine but not strictly EU. For image-heavy workloads, Bunny Optimizer (their image CDN product) gives you on-the-fly resizing, WebP conversion, and quality tuning, all off your origin.

Does DanubeData support lifecycle rules?

Yes. You can configure lifecycle rules on your bucket to expire or transition objects by age or prefix — same semantics as S3 lifecycle policies. Useful for automatic cleanup of old backups or temporary uploads.

Does DanubeData support object versioning?

Yes. Enable versioning at the bucket level and every overwrite or delete preserves the prior object version. This is a must-have for backup buckets — it protects against ransomware or an accidental rclone sync that deletes objects. Spaces has this too, but the DanubeData UI exposes it more directly.

Can I have public and private buckets?

Yes. Buckets are private by default. You can flip public-read ACLs per object or per bucket, or mix them — private for backups, public-read for static assets. Exactly the same model as S3 and Spaces.

How does pricing actually compare for a 500 GB WordPress site with 300 GB monthly egress?

On Spaces: $5 base + $5 overage (250 GB over) = $10/month. On DanubeData: €3.99/month flat — the 500 GB storage and 300 GB egress are both within the included 1 TB. That is a ~60% saving, and the margin grows as your storage grows.

Is DanubeData actually GDPR-compliant?

Yes. All infrastructure is in Falkenstein, Germany, on Hetzner dedicated servers. DanubeData is an EU company with no US parent, so no CLOUD Act exposure. We sign Data Processing Agreements (DPAs) for business customers, and our sub-processor list is limited to EU-based operators. Many EU-regulated customers (finance, legal, healthcare-adjacent) use DanubeData specifically to avoid US-parent processors.

Do presigned URLs work the same way?

Yes — the S3 SigV4 presigning algorithm is identical. The only difference is the endpoint and credentials. Any library that generates presigned URLs (boto3's generate_presigned_url, aws-sdk-js getSignedUrl, PHP createPresignedRequest) works verbatim once you swap the endpoint URL. The URL lifetime, HTTP method, and query-string parameters are all standard S3.

What about multipart uploads for large files?

Multipart uploads are fully supported at DanubeData, with a 10,000-part limit and part sizes from 5 MiB to 5 GiB — same as S3 and Spaces. rclone, aws-cli, and every major S3 SDK switch to multipart automatically above a configurable size threshold, so you get resumability and parallel chunk uploads out of the box.

Get Started

The move off Spaces is usually a 2–4 hour afternoon project for sub-TB buckets, and a weekend for anything up to 10 TB. The bigger win is that you stop paying the 250-GB wall every month and you get a storage layer bundled with the rest of your European infrastructure.

  1. Sign up at danubedata.ro (€50 signup credit — covers ~12 months of a base bucket).
  2. Create a Storage Bucket in the dashboard.
  3. Configure rclone with both endpoints and run the sync.
  4. Swap your SDK endpoint and redirect your CDN/CNAME.
  5. Watch your bill drop by 60–80%.

If you are migrating a larger or more complex workload and want a hand, reach out — we have done enough Spaces migrations that we can usually tell you exactly where the rough edges are before you hit them.
