
FastAPI Production Deployment: The Complete 2025 Guide

Adrian Silaghi
December 28, 2025
14 min read
#fastapi #python #deployment #docker #uvicorn #postgresql #redis #celery

FastAPI has become the go-to framework for building high-performance Python APIs. Its async support, automatic OpenAPI documentation, and type safety make it perfect for modern applications. But deploying FastAPI to production requires careful consideration of async workers, database connections, and performance optimization.

This guide shows you how to deploy a production-ready FastAPI application with everything you need: Docker, Uvicorn workers, PostgreSQL, Redis, background tasks, and monitoring.

What You'll Build

By the end of this guide, you'll have:

  • Production-ready FastAPI deployment with Docker
  • Uvicorn with Gunicorn for optimal performance
  • Managed PostgreSQL database with async connections
  • Redis for caching and background task queues
  • Celery or ARQ for background task processing
  • Prometheus metrics and health checks
  • SSL/TLS with automatic renewal
  • Zero-downtime deployments with GitHub Actions

Prerequisites

  • A FastAPI application ready to deploy
  • Basic familiarity with Docker and Python
  • A DanubeData account (or any VPS provider)

FastAPI vs Django vs Flask: When to Choose FastAPI

Feature             FastAPI               Django            Flask
Performance         Excellent (async)     Good              Good
API Development     Excellent             Good (DRF)        Manual
Auto Documentation  Built-in              With DRF          Manual
Type Safety         Native Pydantic       Optional          Optional
Learning Curve      Moderate              Steeper           Easy
Best For            APIs, microservices   Full-stack apps   Simple apps

Choose FastAPI when: You're building APIs, need high performance, want auto-generated docs, or are working with async I/O.
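
To see why, here is roughly the smallest FastAPI app that exercises all three selling points: async handlers, Pydantic validation, and auto-generated docs. This snippet is purely illustrative and not part of the application we deploy below.

# minimal_example.py -- illustrative only
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()  # /docs and /redoc are generated automatically

class Echo(BaseModel):
    message: str

@app.post("/echo")
async def echo(payload: Echo) -> Echo:
    # The request body is validated against the Echo schema before this runs
    return payload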

Step 1: Prepare Your FastAPI Application

Project Structure

my-fastapi-app/
├── app/
│   ├── __init__.py
│   ├── main.py           # FastAPI application
│   ├── config.py         # Settings management
│   ├── database.py       # Database connection
│   ├── models/           # SQLAlchemy models
│   ├── schemas/          # Pydantic schemas
│   ├── routers/          # API routes
│   ├── services/         # Business logic
│   └── tasks/            # Background tasks
├── tests/
├── alembic/              # Database migrations
├── Dockerfile
├── docker-compose.yml
├── requirements.txt
└── .env

Configuration with Pydantic Settings

# app/config.py
from pydantic_settings import BaseSettings
from functools import lru_cache

class Settings(BaseSettings):
    # Application
    app_name: str = "My FastAPI App"
    debug: bool = False

    # Database
    database_url: str
    database_pool_size: int = 5
    database_max_overflow: int = 10

    # Redis
    redis_url: str

    # Security
    secret_key: str
    algorithm: str = "HS256"
    access_token_expire_minutes: int = 30

    class Config:
        env_file = ".env"

@lru_cache()
def get_settings():
    return Settings()
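
Thanks to lru_cache, the .env file is read once per process. The same function doubles as a FastAPI dependency, so you can inject settings per-request. A small illustrative endpoint (it assumes the app instance from app/main.py):

# Example: injecting settings into a route
from fastapi import Depends
from app.config import Settings, get_settings

@app.get("/info")
async def info(settings: Settings = Depends(get_settings)):
    return {"app_name": settings.app_name, "debug": settings.debug}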

Async Database Connection with SQLAlchemy 2.0

# app/database.py
from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker
from sqlalchemy.orm import declarative_base
from app.config import get_settings

settings = get_settings()

# Use asyncpg for PostgreSQL
engine = create_async_engine(
    settings.database_url.replace("postgresql://", "postgresql+asyncpg://"),
    pool_size=settings.database_pool_size,
    max_overflow=settings.database_max_overflow,
    pool_pre_ping=True,
    echo=settings.debug,
)

AsyncSessionLocal = async_sessionmaker(
    engine,
    expire_on_commit=False
)

Base = declarative_base()

async def get_db():
    # The async context manager closes the session for us
    async with AsyncSessionLocal() as session:
        try:
            yield session
            await session.commit()
        except Exception:
            await session.rollback()
            raise
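
A quick sketch of how a model and a route use this session. The User model here is a stand-in, not something defined elsewhere in this guide:

# Example: a model on Base plus an async query through get_db
from fastapi import APIRouter, Depends
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base, get_db

router = APIRouter()

class User(Base):
    __tablename__ = "users"
    id: Mapped[int] = mapped_column(primary_key=True)
    email: Mapped[str] = mapped_column(unique=True)

@router.get("/{user_id}")
async def read_user(user_id: int, db: AsyncSession = Depends(get_db)):
    result = await db.execute(select(User).where(User.id == user_id))
    return result.scalar_one_or_none()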

Main Application with Lifespan Events

# app/main.py
from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
import redis.asyncio as redis

from app.config import get_settings
from app.database import engine, Base
from app.routers import users, items, health

settings = get_settings()

# Redis connection pool
redis_pool = None

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup
    global redis_pool
    redis_pool = redis.ConnectionPool.from_url(
        settings.redis_url,
        max_connections=10,
        decode_responses=True
    )

    # Create tables (use Alembic in production)
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)

    yield

    # Shutdown
    await redis_pool.disconnect()
    await engine.dispose()

app = FastAPI(
    title=settings.app_name,
    lifespan=lifespan,
    docs_url="/api/docs" if settings.debug else None,
    redoc_url="/api/redoc" if settings.debug else None,
)

# CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://yourdomain.com"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Routers
app.include_router(health.router, tags=["health"])
app.include_router(users.router, prefix="/api/v1/users", tags=["users"])
app.include_router(items.router, prefix="/api/v1/items", tags=["items"])

# Redis dependency
async def get_redis():
    return redis.Redis(connection_pool=redis_pool)

Health Check Endpoint

# app/routers/health.py
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import text
import redis.asyncio as redis

from app.database import get_db

router = APIRouter()

async def get_redis() -> redis.Redis:
    # Imported lazily: app.main imports this router at startup, so a
    # module-level "from app.main import get_redis" would be circular
    from app import main
    return redis.Redis(connection_pool=main.redis_pool)

@router.get("/health")
async def health_check():
    return {"status": "healthy"}

@router.get("/health/ready")
async def readiness_check(
    db: AsyncSession = Depends(get_db),
    redis_client: redis.Redis = Depends(get_redis)
):
    # Check database
    try:
        await db.execute(text("SELECT 1"))
        db_status = "connected"
    except Exception as e:
        db_status = f"error: {str(e)}"

    # Check Redis
    try:
        await redis_client.ping()
        redis_status = "connected"
    except Exception as e:
        redis_status = f"error: {str(e)}"

    return {
        "status": "ready" if db_status == "connected" and redis_status == "connected" else "degraded",
        "database": db_status,
        "redis": redis_status
    }
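
A minimal smoke test for both probes, using httpx from the testing dependencies. The file name and base URL are assumptions; the test expects a locally running app:

# tests/test_health.py -- minimal smoke test for the probes
import httpx

def test_liveness_and_readiness():
    base = "http://localhost:8000"  # assumes the app is running locally
    assert httpx.get(f"{base}/health").json() == {"status": "healthy"}
    ready = httpx.get(f"{base}/health/ready").json()
    assert ready["status"] in ("ready", "degraded")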

Step 2: Create Docker Configuration

Dockerfile (Production)

# Dockerfile
FROM python:3.12-slim AS base

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1 \
    PIP_DISABLE_PIP_VERSION_CHECK=1

WORKDIR /app

# Install system dependencies (curl is needed by the container healthcheck)
RUN apt-get update && apt-get install -y --no-install-recommends \
    gcc \
    libpq-dev \
    curl \
    && rm -rf /var/lib/apt/lists/*

# =====================
# Builder stage
# =====================
FROM base AS builder

# Install Python dependencies
COPY requirements.txt .
RUN pip install --user -r requirements.txt

# =====================
# Production stage
# =====================
FROM base AS production

# Copy Python packages from builder
COPY --from=builder /root/.local /root/.local
ENV PATH=/root/.local/bin:$PATH

# Copy application code
COPY app/ ./app/
COPY alembic/ ./alembic/
COPY alembic.ini .

# Create non-root user
RUN useradd --create-home appuser && chown -R appuser:appuser /app
USER appuser

EXPOSE 8000

# Use Gunicorn with Uvicorn workers
CMD ["gunicorn", "app.main:app", 
     "--bind", "0.0.0.0:8000", 
     "--workers", "4", 
     "--worker-class", "uvicorn.workers.UvicornWorker", 
     "--access-logfile", "-", 
     "--error-logfile", "-", 
     "--capture-output", 
     "--timeout", "120"]

Requirements File

# requirements.txt
fastapi==0.109.0
uvicorn[standard]==0.27.0
gunicorn==21.2.0
pydantic-settings==2.1.0

# Database
sqlalchemy[asyncio]==2.0.25
asyncpg==0.29.0
alembic==1.13.1

# Redis
redis[hiredis]==5.0.1

# Security
python-jose[cryptography]==3.3.0
passlib[bcrypt]==1.7.4

# Background tasks (choose one)
celery[redis]==5.3.6
# or
arq==0.25.0

# Monitoring
prometheus-fastapi-instrumentator==6.1.0
python-json-logger==2.0.7  # used by app/logging.py below

# Testing
pytest==7.4.4
pytest-asyncio==0.23.3
httpx==0.26.0

Docker Compose Stack

# docker-compose.yml
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=${REDIS_URL}
      - SECRET_KEY=${SECRET_KEY}
      - DEBUG=false
    ports:
      - "8000:8000"
    depends_on:
      - redis
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    networks:
      - app-network

  celery-worker:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    command: celery -A app.tasks.celery worker --loglevel=info --concurrency=4
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=${REDIS_URL}
      - SECRET_KEY=${SECRET_KEY}
    depends_on:
      - redis
    networks:
      - app-network

  celery-beat:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    command: celery -A app.tasks.celery beat --loglevel=info
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=${REDIS_URL}
      - SECRET_KEY=${SECRET_KEY}
    depends_on:
      - redis
    networks:
      - app-network

  redis:
    image: redis:7-alpine
    restart: always
    volumes:
      - redis-data:/data
    networks:
      - app-network

  caddy:
    image: caddy:2-alpine
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - caddy-data:/data
      - caddy-config:/config
    depends_on:
      - app
    networks:
      - app-network

volumes:
  redis-data:
  caddy-data:
  caddy-config:

networks:
  app-network:
    driver: bridge

Caddyfile for Automatic HTTPS

# Caddyfile
yourdomain.com {
    reverse_proxy app:8000

    encode gzip

    header {
        X-Content-Type-Options nosniff
        X-Frame-Options DENY
        Referrer-Policy strict-origin-when-cross-origin
        -Server
    }

    log {
        output stdout
        format console
    }
}

Step 3: Set Up Managed Services

For production reliability, use managed PostgreSQL and Redis instead of self-hosting:

Create PostgreSQL Database

  1. Log into DanubeData dashboard
  2. Navigate to Databases → Create Database
  3. Select PostgreSQL 16, choose your plan (Small: €19.99/mo)
  4. Copy the connection string

Create Redis Cache

  1. Navigate to Cache → Create Cache Instance
  2. Select Redis or Valkey (Small: €9.99/mo)
  3. Copy the connection credentials

Environment Configuration

# .env.production
DEBUG=false
SECRET_KEY=your-super-secret-key-here

# DanubeData PostgreSQL
DATABASE_URL=postgresql://user:password@your-db.danubedata.com:5432/database?sslmode=require

# DanubeData Redis
REDIS_URL=rediss://:password@your-cache.danubedata.com:6379/0

# Celery
CELERY_BROKER_URL=rediss://:password@your-cache.danubedata.com:6379/1

Step 4: Background Tasks with Celery

Celery Configuration

# app/tasks/celery.py
from celery import Celery
from celery.schedules import crontab
from app.config import get_settings

settings = get_settings()

celery = Celery(
    "tasks",
    broker=settings.redis_url.replace("/0", "/1"),  # Use DB 1 for Celery
    backend=settings.redis_url.replace("/0", "/2"),  # Use DB 2 for results
    include=["app.tasks.email", "app.tasks.reports"]
)

celery.conf.update(
    task_serializer="json",
    accept_content=["json"],
    result_serializer="json",
    timezone="UTC",
    enable_utc=True,
    task_track_started=True,
    task_time_limit=300,
    worker_prefetch_multiplier=1,
)

# Scheduled tasks
celery.conf.beat_schedule = {
    "cleanup-expired-tokens": {
        "task": "app.tasks.cleanup.cleanup_expired_tokens",
        "schedule": 3600.0,  # Every hour
    },
    "generate-daily-report": {
        "task": "app.tasks.reports.generate_daily_report",
        "schedule": crontab(hour=6, minute=0),  # 6 AM daily
    },
}

Example Task

# app/tasks/email.py
from app.tasks.celery import celery

@celery.task(bind=True, max_retries=3)
def send_email(self, to_email: str, subject: str, body: str):
    try:
        # Your email sending logic
        print(f"Sending email to {to_email}")
        # ... send email ...
        return {"status": "sent", "to": to_email}
    except Exception as e:
        raise self.retry(exc=e, countdown=60)

# Usage in FastAPI
@router.post("/users/")
async def create_user(user: UserCreate, db: AsyncSession = Depends(get_db)):
    # Create user...

    # Queue welcome email (non-blocking)
    send_email.delay(
        to_email=user.email,
        subject="Welcome!",
        body="Thanks for signing up..."
    )

    return user
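
send_email.delay() returns an AsyncResult; handing its id back to the client lets them poll for completion. A small status route could look like this (the app/routers/tasks.py module is hypothetical, not part of the structure above):

# app/routers/tasks.py (hypothetical)
from celery.result import AsyncResult
from fastapi import APIRouter

from app.tasks.celery import celery

router = APIRouter()

@router.get("/tasks/{task_id}")
async def task_status(task_id: str):
    result = AsyncResult(task_id, app=celery)
    return {
        "task_id": task_id,
        "status": result.status,  # PENDING, STARTED, SUCCESS, FAILURE, ...
        "result": result.result if result.ready() else None,
    }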

Step 5: Performance Optimization

Uvicorn Workers Calculation

Gunicorn's classic sizing formula is a good starting point:

# workers = (2 * CPU cores) + 1
# For a 4-core VPS: workers = 9

# Async Uvicorn workers each handle many concurrent requests,
# so I/O-bound FastAPI apps often need fewer. Start with:
# workers = CPU cores * 2

# In Dockerfile or docker-compose (adjust --workers to your VPS):
CMD ["gunicorn", "app.main:app", \
     "--workers", "4", \
     "--worker-class", "uvicorn.workers.UvicornWorker"]

Connection Pooling Best Practices

# Database pool sizing
# Rule of thumb: pool_size = workers * 2, max_overflow = pool_size
# Note: each Gunicorn worker has its own pool, so the total connections
# your database sees is workers * (pool_size + max_overflow).

# For 4 workers:
engine = create_async_engine(
    database_url,
    pool_size=8,         # Base connections per worker
    max_overflow=8,      # Extra connections when busy
    pool_pre_ping=True,  # Check connection health
    pool_recycle=3600,   # Recycle connections after 1 hour
)

Redis Caching Decorator

# app/utils/cache.py
import hashlib
import json
from functools import wraps
import redis.asyncio as redis

def cached(ttl: int = 300, key_prefix: str = ""):
    def decorator(func):
        @wraps(func)
        async def wrapper(*args, redis_client: redis.Redis, **kwargs):
            # Build a stable cache key. Python's built-in hash() is
            # randomized per process, which would give every Gunicorn
            # worker its own keys -- use a real digest instead.
            raw = f"{args}:{sorted(kwargs.items())}"
            digest = hashlib.sha256(raw.encode()).hexdigest()
            cache_key = f"{key_prefix}:{func.__name__}:{digest}"

            # Try cache
            cached_value = await redis_client.get(cache_key)
            if cached_value is not None:
                return json.loads(cached_value)

            # Execute function
            result = await func(*args, **kwargs)

            # Store in cache
            await redis_client.setex(cache_key, ttl, json.dumps(result))

            return result
        return wrapper
    return decorator

# Usage (redis_client is passed as a keyword argument at call time)
@cached(ttl=600, key_prefix="users")
async def get_user_stats(user_id: int) -> dict:
    # Expensive query
    return {"user_id": user_id, "stats": ...}

Step 6: Monitoring and Observability

Prometheus Metrics

# app/main.py
from prometheus_fastapi_instrumentator import Instrumentator

app = FastAPI(...)

# Add Prometheus metrics
Instrumentator().instrument(app).expose(app, endpoint="/metrics")
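
The instrumentator gives you request counts and latencies out of the box; domain metrics go through prometheus_client directly (pulled in as a dependency of the instrumentator). The counter name here is our own, not a library convention:

# Example: a custom business metric alongside the default HTTP metrics
from prometheus_client import Counter

emails_queued = Counter("app_emails_queued_total", "Emails queued for sending")

# Increment wherever the event happens, e.g. right after send_email.delay():
# emails_queued.inc()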

Structured Logging

# app/logging.py
import logging
import sys
from pythonjsonlogger import jsonlogger

def setup_logging():
    handler = logging.StreamHandler(sys.stdout)
    handler.setFormatter(jsonlogger.JsonFormatter(
        "%(asctime)s %(levelname)s %(name)s %(message)s"
    ))

    logging.root.handlers = [handler]
    logging.root.setLevel(logging.INFO)

    # Reduce noise from libraries
    logging.getLogger("uvicorn.access").setLevel(logging.WARNING)
    logging.getLogger("sqlalchemy.engine").setLevel(logging.WARNING)

Step 7: Deploy to VPS

Initial Server Setup

# SSH into your VPS
ssh root@your-vps-ip

# Install Docker
curl -fsSL https://get.docker.com | sh
apt install docker-compose-plugin -y

# Create app directory
mkdir -p /var/www/fastapi
cd /var/www/fastapi

# Clone repository
git clone https://github.com/your-org/your-app.git .

# Create .env file
cp .env.example .env
nano .env  # Add production values

# Build and start
docker compose up -d --build

# Run migrations
docker compose exec app alembic upgrade head

# Check status
docker compose ps
docker compose logs -f app

GitHub Actions CI/CD

# .github/workflows/deploy.yml
name: Deploy to Production

on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_PASSWORD: test
          POSTGRES_DB: test
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 5432:5432

      redis:
        image: redis:7
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 6379:6379

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Install dependencies
        run: pip install -r requirements.txt

      - name: Run tests
        run: pytest tests/ -v
        env:
          DATABASE_URL: postgresql://postgres:test@localhost:5432/test
          REDIS_URL: redis://localhost:6379/0
          SECRET_KEY: test-secret-key

  deploy:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - name: Deploy to VPS
        uses: appleboy/ssh-action@v1.0.3
        with:
          host: ${{ secrets.VPS_HOST }}
          username: ${{ secrets.VPS_USER }}
          key: ${{ secrets.VPS_SSH_KEY }}
          script: |
            cd /var/www/fastapi
            git pull origin main
            docker compose build app
            # Run migrations with the freshly built image, not the old container
            docker compose run --rm app alembic upgrade head
            docker compose up -d --no-deps app celery-worker celery-beat
            docker compose exec app curl -f http://localhost:8000/health || exit 1

Architecture Overview

┌─────────────────────────────────────────────────────────────┐
│                         Internet                             │
└─────────────────────────────┬───────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────┐
│                    Caddy (Reverse Proxy)                     │
│                    Auto SSL, Load Balance                    │
└─────────────────────────────┬───────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────┐
│                    FastAPI Application                       │
│  ┌───────────────────────────────────────────────────────┐  │
│  │  Gunicorn (Process Manager)                           │  │
│  │  ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐     │  │
│  │  │Uvicorn 1│ │Uvicorn 2│ │Uvicorn 3│ │Uvicorn 4│     │  │
│  │  └─────────┘ └─────────┘ └─────────┘ └─────────┘     │  │
│  └───────────────────────────────────────────────────────┘  │
└───────────────────────────────┬─────────────────────────────┘
                                │
                ┌───────────────┴───────────────┐
                │                               │
                ▼                               ▼
┌───────────────────────────┐   ┌───────────────────────────┐
│      PostgreSQL           │   │         Redis             │
│     (DanubeData)          │   │      (DanubeData)         │
│      €19.99/mo            │   │       €9.99/mo            │
│                           │   │                           │
│  • Async connections      │   │  • Caching                │
│  • Automated backups      │   │  • Celery broker          │
│  • Connection pooling     │   │  • Session storage        │
└───────────────────────────┘   └───────────────────────────┘
                                          │
                                          ▼
                                ┌───────────────────────────┐
                                │    Celery Workers         │
                                │    Background Tasks       │
                                └───────────────────────────┘

Cost Summary

Component           Service                    Monthly Cost
Application Server  VPS Medium (4 vCPU, 8GB)   €17.99
Database            PostgreSQL Small           €19.99
Cache/Queue         Redis Small                €9.99
Total                                          €47.97/month

Common Pitfalls and Solutions

1. Connection Pool Exhaustion

# Problem: "too many connections" error
# Solution: Properly size connection pool

# Bad: No pool limits
engine = create_async_engine(database_url)

# Good: Explicit pool configuration
engine = create_async_engine(
    database_url,
    pool_size=10,
    max_overflow=20,
    pool_timeout=30,
    pool_recycle=1800,
)

2. Blocking the Event Loop

# Bad: Blocking call in async function
@app.get("/users/{user_id}")
async def get_user(user_id: int):
    user = db.query(User).filter(User.id == user_id).first()  # Blocking!
    return user

# Good: Use async database calls
@app.get("/users/{user_id}")
async def get_user(user_id: int, db: AsyncSession = Depends(get_db)):
    result = await db.execute(select(User).where(User.id == user_id))
    return result.scalar_one_or_none()
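
When a blocking call is unavoidable (a sync-only SDK, heavy file I/O), push it off the event loop instead of rewriting it. Starlette's threadpool helper, re-exported by FastAPI, handles this; generate_report_sync here is a hypothetical blocking function:

# Offloading an unavoidable blocking call to a worker thread
from fastapi.concurrency import run_in_threadpool

@app.get("/reports/{report_id}")
async def get_report(report_id: int):
    # Runs in a thread, so the event loop keeps serving other requests
    return await run_in_threadpool(generate_report_sync, report_id)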

3. Missing Request Validation

# Bad: Trust user input
@app.post("/items/")
async def create_item(item: dict):
    return await db.execute(...)

# Good: Pydantic validation
class ItemCreate(BaseModel):
    name: str = Field(..., min_length=1, max_length=100)
    price: float = Field(..., gt=0)
    quantity: int = Field(default=0, ge=0)

@app.post("/items/")
async def create_item(item: ItemCreate):
    return await db.execute(...)

Get Started

Ready to deploy your FastAPI application?

👉 Create your DanubeData account

Deploy your complete FastAPI stack in under 15 minutes:

  • VPS with Docker pre-installed
  • Managed PostgreSQL with async support
  • Redis for caching and Celery

All for under €50/month with European data residency and automatic backups included.

Need help deploying FastAPI? Contact our team—we're Python developers too.
