If you run a website in the EU in 2026, you already know the drill. You open Google Analytics, try to figure out where your traffic came from, get nagged about consent mode v2, stare at GA4's "engaged sessions" metric wondering what it actually means, and then remember that your legal team has been asking for six months whether this is even lawful under GDPR.
There is a simpler path: run your own analytics. Two open-source tools dominate the conversation outside of Plausible (covered in a separate guide): Matomo, the mature, feature-rich platform that is essentially the WordPress of analytics, and Umami, the modern, minimal, cookieless newcomer built on Node.js and Postgres.
This guide walks you through choosing between them, deploying either (or both) on a European VPS using Docker, wiring up managed databases for painless backups, hardening for GDPR, and migrating away from GA4. By the end, you will own your data, need zero cookie banners for analytics, and spend less than €35/month on the whole stack.
The GA4 Problem (Why You Are Here)
Google Analytics has real problems in 2026 that are not going away:
- GDPR grey zone. Multiple European data protection authorities (Austria, France, Italy, Denmark, Norway, Finland) have ruled that standard GA configurations violate GDPR because user data is transferred to the US. Google's mitigations (Consent Mode, EU data centers, server-side tagging) reduce but do not eliminate the risk.
- Cookie banner fatigue. Every visit starts with a consent popup. Rejection rates of 40-60% are normal, which means GA4 is missing huge chunks of your actual traffic even when it is legal.
- Sampled, delayed data. Free-tier GA4 samples data above certain thresholds and batches events. "Real-time" is not really real-time.
- The GA4 UI. If you loved Universal Analytics, you probably hate GA4. The new event-based model is powerful but overwhelming for the 90% use case of "how many people visited the blog this week."
- You are the product. Your visitor data feeds Google Ads, Google Search, and the broader ad ecosystem. For a lot of businesses — legal, medical, financial, B2B SaaS — that alone is disqualifying.
Self-hosting flips all of this. Your visitors' data stays on your server, in your jurisdiction. You can turn off cookies entirely. Data is unsampled, immediate, and you keep it forever.
Matomo vs Umami: The Short Version
Both are strong choices. Pick based on what you actually need.
Choose Matomo if: you want a full analytics platform with heatmaps, session recordings, funnels, A/B testing, e-commerce tracking, goal tracking, SEO keyword imports, custom reports, multi-user teams with granular permissions, and a migration path from Universal Analytics/GA4. Matomo is the closest feature-for-feature replacement for "big" analytics tools.
Choose Umami if: you want a beautiful, fast, minimal dashboard that answers "who visited, from where, on what page" in three clicks. Umami is cookieless by default, under 2KB of tracking script, and feels like a modern product designed in 2024 rather than something that grew organically since 2007.
Not sure? Umami is the easier first step — smaller, cheaper, faster to deploy. You can always add Matomo later for a specific site that needs advanced features. Several DanubeData customers run both: Umami as their daily dashboard, Matomo on their e-commerce store.
Full Comparison Table
| Feature | GA4 | Matomo Cloud | Matomo Self-Hosted | Umami Cloud | Umami Self-Hosted |
|---|---|---|---|---|---|
| Monthly cost (100k pageviews) | Free* | €23/mo | ~€32/mo (VPS + DB) | $9/mo | ~€24/mo (VPS + DB) |
| Data ownership | Google | InnoCraft (NZ/EU) | You | Umami Software | You |
| Cookieless option | Limited | Yes | Yes | Default | Default |
| Cookie banner needed | Yes | Configurable | No (if configured) | No | No |
| Tracking script size | ~50KB | ~22KB | ~22KB | ~2KB | ~2KB |
| Heatmaps / session recording | No | Yes (paid) | Yes (plugin) | No | No |
| E-commerce tracking | Yes | Yes | Yes (built-in) | Custom events | Custom events |
| Funnels | Yes | Yes (paid) | Yes (plugin) | Basic | Basic |
| A/B testing | No | Yes (paid) | Yes (plugin) | No | No |
| Multi-user teams | Yes | Yes | Yes (fine-grained) | Yes | Yes |
| Unsampled data | Sampled >10M | Yes | Yes | Yes | Yes |
| Data retention | 14 months | Unlimited | Unlimited | Unlimited | Unlimited |
| GA4 import tool | - | Yes | Yes | No | No |
| Minimum RAM | - | - | 1-2 GB + MySQL | - | ~200 MB + Postgres |
| Database | - | Managed | MySQL / MariaDB | Managed | Postgres / MySQL |
| Tech stack | - | - | PHP 8 + MySQL | - | Node.js + Postgres |
*GA4 is "free" — you pay by giving Google every visitor interaction on your site.
Hardware Requirements
This is where the two products diverge dramatically. Plan your VPS accordingly.
Umami
- RAM: ~200 MB idle, ~400 MB under load. Node.js is lean.
- CPU: A single shared vCPU handles millions of pageviews/month easily.
- Storage: Tiny. Events are simple rows in Postgres. 1 million pageviews ≈ 200 MB DB size.
- Recommended: DanubeData DD Nano at €4.49/mo (1 vCPU, 1 GB RAM, 20 GB NVMe) is more than enough for most Umami deployments.
Matomo
- RAM: 1-2 GB for the PHP app itself, plus 512 MB-1 GB for MySQL, plus headroom for the archive cron job that aggregates reports.
- CPU: Shared vCPU is fine for under ~500k pageviews/month. Above that, consider dedicated CPU so archiving does not lag.
- Storage: Heavier. Matomo stores raw visit logs by default. Budget 1-2 GB per million pageviews including archives.
- Recommended: DanubeData DD Small at €12.49/mo (2 vCPU, 4 GB RAM, 40 GB NVMe) is the sweet spot for up to a few million pageviews/month across multiple sites.
Managed Database or Self-Managed?
You have two options for the database:
- Run it in Docker alongside the app. Simpler setup, cheaper (just the VPS cost), but you own backups, tuning, and upgrades.
- Use DanubeData managed databases. Managed Postgres or MariaDB at €19.99/mo includes daily snapshots, point-in-time recovery, automated upgrades, TLS, and a replica option. Worth it if you do not want analytics to be another thing to babysit.
For this guide we show both: a quick Docker-only setup for hobbyists, and a production setup that plugs into a managed Postgres/MariaDB instance.
Part 1: Deploy Umami
Umami is the fastest thing you can deploy today — it will be running in about ten minutes.
1.1 Provision the VPS
- Create a DD Nano VPS (€4.49/mo) at DanubeData Falkenstein.
- Choose Ubuntu 24.04 LTS.
- Note the public IPv4 and IPv6.
- Create an A (and optionally AAAA) DNS record:
umami.yourdomain.com → YOUR_IP.
1.2 Base setup
ssh root@YOUR_IP
apt update && apt upgrade -y
apt install -y curl wget git ufw fail2ban
ufw allow OpenSSH
ufw allow 80/tcp
ufw allow 443/tcp
ufw --force enable
curl -fsSL https://get.docker.com | sh
apt install -y docker-compose-plugin
systemctl enable --now docker
mkdir -p /opt/umami && cd /opt/umami
1.3 docker-compose.yml for Umami
cat > docker-compose.yml << 'EOF'
services:
  umami:
    image: ghcr.io/umami-software/umami:postgresql-latest
    restart: always
    environment:
      DATABASE_URL: postgresql://umami:${POSTGRES_PASSWORD}@db:5432/umami
      DATABASE_TYPE: postgresql
      APP_SECRET: ${APP_SECRET}
      TRACKER_SCRIPT_NAME: "metrics" # obscure against adblockers
    depends_on:
      db:
        condition: service_healthy
    expose:
      - "3000"
  db:
    image: postgres:16-alpine
    restart: always
    environment:
      POSTGRES_DB: umami
      POSTGRES_USER: umami
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - umami-db:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U umami -d umami"]
      interval: 5s
      timeout: 5s
      retries: 5
  caddy:
    image: caddy:2-alpine
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - caddy-data:/data
      - caddy-config:/config
    depends_on:
      - umami
volumes:
  umami-db:
  caddy-data:
  caddy-config:
EOF
1.4 Secrets and Caddy config
cat > .env << EOF
POSTGRES_PASSWORD=$(openssl rand -hex 24)
APP_SECRET=$(openssl rand -base64 48)
EOF
chmod 600 .env
cat > Caddyfile << 'EOF'
umami.yourdomain.com {
    reverse_proxy umami:3000
    encode gzip zstd
    header {
        Strict-Transport-Security "max-age=31536000; includeSubDomains"
        X-Content-Type-Options nosniff
        Referrer-Policy strict-origin-when-cross-origin
    }
}
EOF
1.5 Launch
docker compose up -d
docker compose logs -f umami
# wait for "ready - started server on..."
Open https://umami.yourdomain.com and log in with the default credentials admin / umami. Change the password immediately from Settings → Profile.
1.6 Using DanubeData managed Postgres instead
If you prefer not to run the database yourself, provision a managed Postgres instance (€19.99/mo) and drop the db: service from the compose file. Then set:
DATABASE_URL=postgresql://umami:PASSWORD@HOSTNAME:5432/umami?sslmode=require
Create the umami database and user via the DanubeData dashboard or psql before first launch. Daily snapshots, automated point-in-time recovery, and TLS are included — one fewer thing to think about.
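If you go the psql route, the one-time setup is two statements. This is a sketch with placeholder credentials; the DanubeData dashboard may create the role and database for you, in which case you can skip it:

```sql
-- One-time setup against the managed instance, run via psql as the admin user.
-- Role name and password are placeholders.
CREATE ROLE umami WITH LOGIN PASSWORD 'REPLACE_WITH_STRONG_PASSWORD';
CREATE DATABASE umami OWNER umami;
```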
1.7 Add the tracking snippet
In the Umami dashboard, add your website. Copy the snippet and paste it into your HTML <head>:
<script defer
src="https://umami.yourdomain.com/metrics.js"
data-website-id="YOUR-WEBSITE-ID"></script>
Note we renamed the tracker to metrics.js via TRACKER_SCRIPT_NAME above — this reduces the adblock-block rate from ~30% to under 5% because most blocklists target the default filename.
1.8 Custom events in Umami
// Fire a custom event
umami.track('signup-clicked');
umami.track('newsletter-subscribe', { plan: 'pro' });
// With dynamic data
umami.track(props => ({ ...props, name: 'page-scroll', data: { depth: 75 } }));
Part 2: Deploy Matomo
Matomo is a heavier lift — PHP-FPM, MySQL/MariaDB, cron-driven archives — but it gives you a lot more once it is running.
2.1 Provision the VPS and database
- Create a DD Small VPS (€12.49/mo): 2 vCPU, 4 GB RAM, 40 GB NVMe.
- Provision a managed MariaDB (€19.99/mo). Matomo officially supports MySQL 8 and MariaDB 10.11+ — both work, and managed saves you hours of DBA time.
- In the managed DB dashboard, create a database matomo and a user matomo with a strong password. Copy the connection hostname.
- DNS: point analytics.yourdomain.com at the VPS.
2.2 Base setup
ssh root@YOUR_IP
apt update && apt upgrade -y
apt install -y curl wget git ufw fail2ban
ufw allow OpenSSH
ufw allow 80/tcp
ufw allow 443/tcp
ufw --force enable
curl -fsSL https://get.docker.com | sh
apt install -y docker-compose-plugin
systemctl enable --now docker
mkdir -p /opt/matomo && cd /opt/matomo
2.3 docker-compose.yml for Matomo (managed DB)
cat > docker-compose.yml << 'EOF'
services:
  matomo:
    image: matomo:5-apache
    restart: always
    environment:
      MATOMO_DATABASE_HOST: ${DB_HOST}
      MATOMO_DATABASE_ADAPTER: mysql
      MATOMO_DATABASE_TABLES_PREFIX: matomo_
      MATOMO_DATABASE_USERNAME: matomo
      MATOMO_DATABASE_PASSWORD: ${DB_PASSWORD}
      MATOMO_DATABASE_DBNAME: matomo
      PHP_MEMORY_LIMIT: 512M
    volumes:
      - matomo-data:/var/www/html
    expose:
      - "80"
  # Archive cron keeps reports fresh; default web-triggered archive is fine for tiny sites
  matomo-cron:
    image: matomo:5-apache
    restart: always
    entrypoint: /bin/bash
    command: -c "while true; do php /var/www/html/console core:archive --url=https://analytics.yourdomain.com; sleep 3600; done"
    environment:
      MATOMO_DATABASE_HOST: ${DB_HOST}
      MATOMO_DATABASE_USERNAME: matomo
      MATOMO_DATABASE_PASSWORD: ${DB_PASSWORD}
      MATOMO_DATABASE_DBNAME: matomo
    volumes:
      - matomo-data:/var/www/html
    depends_on:
      - matomo
  caddy:
    image: caddy:2-alpine
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - caddy-data:/data
      - caddy-config:/config
    depends_on:
      - matomo
volumes:
  matomo-data:
  caddy-data:
  caddy-config:
EOF
2.4 Secrets and Caddy config
cat > .env << EOF
DB_HOST=YOUR_MANAGED_DB_HOSTNAME
DB_PASSWORD=YOUR_MANAGED_DB_PASSWORD
EOF
chmod 600 .env
cat > Caddyfile << 'EOF'
analytics.yourdomain.com {
    reverse_proxy matomo:80
    encode gzip zstd
    # Matomo has its own login, but extra headers never hurt
    header {
        Strict-Transport-Security "max-age=31536000; includeSubDomains"
        X-Content-Type-Options nosniff
        X-Frame-Options SAMEORIGIN
        Referrer-Policy strict-origin-when-cross-origin
    }
    # Block access to sensitive config files
    @blocked path /config/config.ini.php /tmp/* /plugins/*/config/*
    respond @blocked 403
}
EOF
2.5 Launch and install
docker compose up -d
docker compose logs -f matomo
Open https://analytics.yourdomain.com. Matomo will walk you through the installer:
- System check — all green if you followed the steps.
- Database setup — it pre-fills from your env vars.
- Super user — create your admin account. Use a strong password and enable 2FA right after install.
- First website — add your domain and timezone.
- Tracking code — copy the snippet.
2.6 Add the tracking snippet
<!-- Matomo -->
<script>
  var _paq = window._paq = window._paq || [];
  _paq.push(['disableCookies']);      // cookieless mode
  _paq.push(['setDoNotTrack', true]); // honour DNT
  _paq.push(['trackPageView']);
  _paq.push(['enableLinkTracking']);
  (function() {
    var u = "https://analytics.yourdomain.com/";
    _paq.push(['setTrackerUrl', u + 'matomo.php']);
    _paq.push(['setSiteId', '1']);
    var d = document, g = d.createElement('script'), s = d.getElementsByTagName('script')[0];
    g.async = true; g.src = u + 'matomo.js'; s.parentNode.insertBefore(g, s);
  })();
</script>
<!-- End Matomo Code -->
The two key lines for GDPR compliance are disableCookies and setDoNotTrack. Combined with Matomo's built-in IP anonymization (see below), you can run without a consent banner in most EU jurisdictions.
2.7 Harden Matomo for GDPR
In Matomo admin → Privacy:
- Anonymize Visitors' IP addresses: set to 2 bytes (e.g. 192.168.x.x).
- Anonymize previously tracked data: run the retro-anonymizer once after install if you imported historical data.
- Delete old visitor logs: automatically after 90-180 days. Archives remain — you lose individual hit data but keep aggregate reports.
- Respect DoNotTrack: on.
- Opt-out iframe: embed on your privacy page so visitors can opt out explicitly.
- Disable geolocation at city level if you only need country-level data — lower risk, still useful.
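The opt-out embed is a single iframe served by Matomo itself. A sketch for your privacy page follows (swap in your analytics domain). One caveat: the X-Frame-Options SAMEORIGIN header set in the Caddyfile earlier will block this iframe when your privacy page lives on a different origin, so either exempt the opt-out path at the proxy or use the JavaScript-based opt-out that newer Matomo versions offer in the Privacy settings:

```html
<!-- Matomo's built-in opt-out form, embedded on the privacy page -->
<iframe
  src="https://analytics.yourdomain.com/index.php?module=CoreAdminHome&action=optOut&language=en"
  style="border: 0; width: 100%; height: 200px;"
  title="Analytics opt-out"></iframe>
```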
Part 3: GDPR — What Actually Matters
Switching analytics tools does not automatically make you GDPR-compliant. Here is the checklist your DPO will actually ask about.
3.1 Lawful basis
For analytics you will typically rely on one of:
- Legitimate interest — viable for cookieless, IP-anonymized, first-party analytics where processing is minimal. This is the strongest argument for self-hosted Matomo/Umami.
- Consent — required if you use cookies, full IPs, or cross-site tracking. A cookie banner is mandatory.
With disableCookies + IP anonymization + data retention limits, both Matomo and Umami can run under legitimate interest in most EU member states. Always document your Data Protection Impact Assessment (DPIA).
3.2 Data minimization
- Do not collect user IDs unless you need them (e.g. logged-in analytics).
- Truncate IPs (Umami does not store them at all; Matomo anonymizes by config).
- Set retention windows. Umami: 365 days via cron. Matomo: Privacy → Anonymize data.
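Umami has no built-in retention setting, so the 365-day window is enforced with a scheduled job. A sketch of the nightly deletion, assuming Umami v2's website_event and session tables (verify against the schema your version actually ships before scheduling this):

```sql
-- Nightly prune of raw analytics data older than 365 days.
-- Table and column names assume the Umami v2 schema.
DELETE FROM website_event WHERE created_at < now() - interval '365 days';
DELETE FROM session WHERE created_at < now() - interval '365 days';
```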
3.3 Data residency
DanubeData runs in Falkenstein, Germany. Your data never leaves the EU. No Standard Contractual Clauses, no Schrems II problem, no transfer impact assessment needed. This is the single biggest reason EU customers switch from GA4.
3.4 Data subject rights
- Right to erasure: Matomo has a built-in GDPR tools page. Umami requires a SQL DELETE by website_id + session_id or IP hash.
- Right of access: Matomo generates a per-visitor report. For Umami, query the website_event table.
- Right to opt out: both tools respect DNT; Matomo ships an official opt-out iframe.
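For the Umami erasure path, the SQL DELETE looks roughly like this once you have identified the visitor's session. Table names assume the Umami v2 schema, and the session ID is a placeholder you would obtain from your access-request lookup:

```sql
-- Remove one visitor's events and session row (Umami v2 schema assumed).
DELETE FROM website_event WHERE session_id = '00000000-0000-0000-0000-000000000000';
DELETE FROM session WHERE session_id = '00000000-0000-0000-0000-000000000000';
```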
3.5 Your privacy policy
Update your privacy policy to mention:
- The analytics tool you use (Matomo or Umami), self-hosted.
- Server location (Falkenstein, Germany — DanubeData).
- Whether cookies are used (usually: no).
- IP anonymization (yes).
- Retention period.
- A link to your opt-out.
Part 4: Migrating from Google Analytics 4
4.1 To Matomo
Matomo ships an official Google Analytics 4 Importer plugin (free for on-prem):
- In Matomo → Marketplace, install "Google Analytics Importer".
- Authenticate with Google OAuth (read-only on your GA4 property).
- Select date range. GA4 exports via the Reporting API — expect hours to days for large properties, and note the known limitation that some custom dimensions do not map cleanly.
- Imported data lands in a separate site ID so you can compare side-by-side.
Realistic expectation: user counts will not match GA4 exactly. GA4's modeled data (from Consent Mode) inflates numbers; Matomo shows only what actually happened. Treat the gap as a learning moment, not a bug.
4.2 To Umami
Umami has no importer. Options:
- Parallel-run for a month. Install both GA4 and Umami. Compare. Most teams find Umami "close enough" for day-to-day use.
- Export GA4 to BigQuery → CSV → Postgres. Only worthwhile if you have multi-year historical data that matters.
- Accept the reset. Most sites honestly do not need historical data past a year. Starting clean is liberating.
4.3 Parallel-run period
Regardless of which tool you pick, run GA4 and your new analytics in parallel for at least 30 days. Do not turn off GA4 until you are confident the new setup captures what you need and your executive dashboards have been rebuilt.
Part 5: Backups to S3
Neither tool is useful if you lose the data. Ship nightly backups to object storage.
5.1 Umami Postgres backup
cat > /opt/umami/backup.sh << 'EOF'
#!/bin/bash
set -e
DATE=$(date +%Y-%m-%d_%H-%M-%S)
BACKUP_DIR=/opt/umami/backups
mkdir -p $BACKUP_DIR
docker compose -f /opt/umami/docker-compose.yml exec -T db \
  pg_dump -U umami umami | gzip > $BACKUP_DIR/umami_$DATE.sql.gz
# Ship to DanubeData S3 (or any S3-compatible)
rclone copy $BACKUP_DIR/umami_$DATE.sql.gz danubedata:umami-backups/
find $BACKUP_DIR -name "*.sql.gz" -mtime +7 -delete
EOF
chmod +x /opt/umami/backup.sh
(crontab -l 2>/dev/null; echo "0 3 * * * /opt/umami/backup.sh >> /var/log/umami-backup.log 2>&1") | crontab -
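Before trusting the prune line in that script, it is worth dry-running the find expression against throwaway files. This sketch simulates a ten-day-old dump in a scratch directory (not your real backup dir) and checks that only the fresh one survives:

```shell
# Dry-run of backup.sh's retention rule in a scratch directory.
set -e
DIR=$(mktemp -d)
printf 'dump' | gzip > "$DIR/umami_old.sql.gz"
printf 'dump' | gzip > "$DIR/umami_new.sql.gz"
touch -d '10 days ago' "$DIR/umami_old.sql.gz"   # simulate last week's dump
gzip -t "$DIR"/*.sql.gz                          # integrity check: both files are valid gzip
find "$DIR" -name "*.sql.gz" -mtime +7 -delete   # same rule as the cron script
REMAINING=$(ls "$DIR")
echo "$REMAINING"                                # only umami_new.sql.gz survives
rm -rf "$DIR"
```

The same dry-run works for the Matomo script below; only the filename pattern differs.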
5.2 Matomo backup (managed MariaDB)
If you use DanubeData managed MariaDB, daily snapshots are automatic — you can skip the dump. For extra safety:
cat > /opt/matomo/backup.sh << 'EOF'
#!/bin/bash
set -e
DATE=$(date +%Y-%m-%d_%H-%M-%S)
BACKUP_DIR=/opt/matomo/backups
mkdir -p $BACKUP_DIR
# mysqldump against managed DB
docker run --rm --env-file /opt/matomo/.env mariadb:11 \
  sh -c 'mariadb-dump -h "$DB_HOST" -u matomo -p"$DB_PASSWORD" matomo' \
  | gzip > $BACKUP_DIR/matomo_$DATE.sql.gz
# Also back up the matomo-data volume (plugins, config, uploads)
docker run --rm -v matomo_matomo-data:/data -v $BACKUP_DIR:/backup alpine \
  tar czf /backup/matomo-files_$DATE.tar.gz -C /data .
rclone copy $BACKUP_DIR/ danubedata:matomo-backups/ --include "*_$DATE.*"
find $BACKUP_DIR -mtime +7 -delete
EOF
chmod +x /opt/matomo/backup.sh
(crontab -l 2>/dev/null; echo "0 3 * * * /opt/matomo/backup.sh >> /var/log/matomo-backup.log 2>&1") | crontab -
Configure rclone with a DanubeData S3 access key (one-click in the dashboard). The €3.99/mo storage plan includes 1 TB — more than enough for years of analytics backups.
Part 6: Scaling and Operations
6.1 When to upgrade your VPS
Signs you have outgrown a plan:
- Umami: dashboard latency > 1 s, or Postgres CPU > 50% sustained. Upgrade to DD Small (€12.49) or move DB to managed Postgres.
- Matomo: archive cron falls behind (reports are more than 1 hour stale), or PHP workers 502 under concurrent load. Upgrade to DD Medium or split the archive cron onto a second VPS.
6.2 Horizontal scaling
Both tools scale horizontally once the database is off the app server:
- Umami: run N Umami containers behind a load balancer, all pointing at the same Postgres. Stateless by design.
- Matomo: add more PHP-FPM workers, use Redis for sessions/cache (Matomo supports it out of the box), and run the archive cron on a dedicated node so it does not starve web workers.
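A sketch of the Umami half, assuming the database is already external: Docker Compose v2 honours deploy.replicas on a plain docker compose up, and Caddy balances across the service name via Docker's DNS. The replica count and environment variables here are illustrative:

```yaml
# Stateless Umami replicas behind one Caddy; DATABASE_URL points at managed Postgres.
services:
  umami:
    image: ghcr.io/umami-software/umami:postgresql-latest
    deploy:
      replicas: 3                    # compose v2 starts three identical containers
    environment:
      DATABASE_URL: ${DATABASE_URL}  # shared external Postgres
      APP_SECRET: ${APP_SECRET}
  caddy:
    image: caddy:2-alpine
    ports: ["80:80", "443:443"]
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile  # reverse_proxy umami:3000 round-robins via DNS
```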
6.3 Monitoring your analytics
Ironic but important: you need monitoring on the thing that monitors your site. Basic approach:
- Uptime check (UptimeRobot, Better Stack, your own) hitting the dashboard URL every minute.
- Alert on 5xx rate from Caddy logs.
- Alert on disk > 80% full (Matomo archives can grow fast).
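The disk alert is easy to self-host too. A minimal cron-able sketch follows; the threshold and mount point are examples, and the echo is a stand-in for whatever pager or webhook you actually use:

```shell
# Minimal disk-usage check for cron; alerts when / crosses the threshold.
set -e
THRESHOLD=80
USED=$(df --output=pcent / | tail -1 | tr -dc '0-9')  # current usage as a bare number
if [ "$USED" -gt "$THRESHOLD" ]; then
  echo "ALERT: / is ${USED}% full"   # replace with mail/Slack/webhook of your choice
fi
echo "$USED"
```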
Real-World Cost Scenarios
| Scenario | Stack | Monthly Cost | Annual vs SaaS |
|---|---|---|---|
| Solo dev / indie blog | Umami on DD Nano (Docker Postgres) | €4.49 | Saves ~$55 vs Umami Cloud |
| Small agency, 5 client sites | Umami on DD Nano + managed Postgres | €24.48 | Saves €250+ vs billable setup |
| SMB with e-commerce | Matomo on DD Small + managed MariaDB | €32.48 | Saves €150+/mo vs Matomo Cloud |
| Agency with 20 clients | Matomo (DD Medium) + managed MariaDB + Umami (DD Nano) | ~€55/mo | Saves €500+/mo vs per-client SaaS |
| High-traffic news site | Matomo cluster: 2× DD Medium + managed MariaDB with replica | ~€120/mo | Saves €1,500+/mo vs GA360 alternatives |
Every DanubeData plan includes 20 TB of traffic, which is orders of magnitude more than analytics will ever use, and your first €50 is on us — enough to run an Umami install for nearly a year, free.
FAQ
Matomo, Umami, or Plausible — how do I decide?
Quick decision tree:
- You want the slickest, simplest experience: Plausible (see our Plausible guide).
- You want free, open-source, minimal, and run Postgres already: Umami.
- You need feature parity with GA4 or more — funnels, heatmaps, e-commerce, A/B tests, SEO: Matomo.
- You manage analytics for multiple clients with different needs: run both Umami and Matomo, each on their own subdomain.
Are self-hosted Matomo and Umami actually GDPR-compliant out of the box?
Umami: yes, by default. No cookies, no personal data stored, IPs are hashed, not retained. You still need a privacy policy entry, but you can skip the cookie banner for analytics.
Matomo: yes, with a two-minute config change. Enable disableCookies in the tracking code, set IP anonymization to 2 bytes in Privacy settings, and enable "respect DoNotTrack". CNIL (France) explicitly published a Matomo configuration guide that grants exemption from cookie consent.
Will I lose historical data if I switch from GA4?
For Matomo, not if you use the GA4 importer — though it is imperfect and slow. For Umami, yes, you start fresh. Most teams find historical data past 12-18 months is rarely consulted; the forward-looking data is what drives decisions.
Can I do cookieless tracking with these tools?
Umami is cookieless by default — there is no configuration needed. Matomo requires _paq.push(['disableCookies']); in your tracking snippet. Both identify unique visitors via a hashed signature (IP + User-Agent + daily salt) that cannot reconstruct a persistent identity across days or devices.
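To make the daily-salt idea concrete, here is an illustrative computation of such a signature. This is the spirit of the approach, not either tool's exact algorithm; the IP and User-Agent are example inputs:

```shell
# Illustration only: a daily-rotating visitor hash in the spirit of the
# cookieless fingerprint (not Matomo's or Umami's exact algorithm).
set -e
SALT=$(date -u +%F)   # salt rotates at midnight UTC, so hashes cannot link days
SIG=$(printf '%s|%s|%s' '203.0.113.7' 'Mozilla/5.0' "$SALT" | sha256sum | cut -d' ' -f1)
echo "$SIG"           # 64 hex characters; stable within one day, unlinkable across days
```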
How much traffic can a €4.49 DanubeData VPS handle for Umami?
In practice: 5-10 million pageviews/month comfortably with Docker Postgres, limited more by DB disk I/O than CPU. Above that, move to managed Postgres (€19.99/mo) and keep the same tiny VPS for the app. We have customers processing 40M+ pageviews/month on a DD Small + managed Postgres.
How much traffic can Matomo handle on a €12.49 DD Small?
Around 2-5 million pageviews/month depending on how aggressive your custom reports and segmentation are. The bottleneck is almost always the archive cron, not request-handling. If archives fall behind, move to DD Medium or split the archive worker onto a second small VPS pointed at the same managed MariaDB.
Do I need a cookie banner with self-hosted Matomo/Umami?
In most EU member states: no, provided you (a) disable cookies, (b) anonymize IPs, (c) respect DNT, and (d) offer an opt-out mechanism. The ePrivacy Directive (the "cookie law") targets storage on end-user devices — if you do not store anything, it does not apply. The UK ICO, French CNIL, and German DPAs have published guidance confirming this for properly-configured Matomo, and Umami meets the criteria by default. Always verify with your own legal counsel for your jurisdiction and DPIA.
Can I run Matomo and Umami on the same VPS?
Technically yes — they use different ports and databases — but it is easier to keep them separate. If budget is tight, DD Small (€12.49) has enough RAM for both if you use Docker Postgres for Umami and a managed MariaDB for Matomo. For anything serious, put each on its own VPS.
What about real-time data?
Both tools show real-time visitor dashboards. Umami updates about every 5 seconds; Matomo updates every 10-15 seconds and has a dedicated "Visitors in real-time" widget. Neither samples data, unlike GA4's free tier.
How do I handle traffic spikes (HN/Reddit frontpage)?
Umami is effectively stateless — put Caddy in front, scale horizontally, point everything at managed Postgres. Matomo handles traffic bursts well because tracking requests are fast writes; the load hits hardest during archiving. Pre-scale before known events (product launch, press coverage) by bumping the VPS temporarily — DanubeData lets you resize with minimal downtime.
Get Started
You can be running self-hosted analytics in Falkenstein, Germany in under 30 minutes.
Umami quick-start
- Create a DD Nano VPS (€4.49/mo).
- Optional: add a managed Postgres (€19.99/mo) if you want daily snapshots without thinking about them.
- Follow Part 1 above.
Matomo quick-start
- Create a DD Small VPS (€12.49/mo).
- Provision a managed MariaDB (€19.99/mo).
- Follow Part 2 above.
Every DanubeData account starts with a €50 promotional credit — enough to run an Umami instance for nearly a year, or test-drive the full Matomo + managed DB stack for a month before paying a euro.
If you want a hand deploying either, our team is EU-based and replies within hours. Get in touch and we will help you pick the right stack and get it running.
Read next: Self-hosting Plausible Analytics on a VPS — the third option in the privacy-first analytics trio.