
Deploy a Python FastAPI App to Serverless Containers in 5 Minutes

Adrian Silaghi
February 24, 2026
8 min read
#fastapi #python #serverless #deployment #knative #postgresql #api #buildpacks

What if you could deploy a FastAPI application without provisioning servers, writing Dockerfiles, or configuring reverse proxies? With DanubeData Rapids, you push your code and get a live, auto-scaling API endpoint in minutes.

This tutorial walks you through deploying a FastAPI app to serverless containers from scratch. We will create a simple API, connect it to a PostgreSQL database, and deploy it with a single git push — all in under 5 minutes.

Why Serverless for FastAPI?

Traditional FastAPI deployments require you to manage VPS instances, configure Gunicorn workers, set up reverse proxies, and handle SSL certificates. Serverless containers eliminate all of that:

  • Scale to zero — pay nothing when your API has no traffic
  • Auto-scale — handle traffic spikes without manual intervention
  • No infrastructure — no servers to patch, no Docker to configure
  • Built-in HTTPS — automatic TLS on every endpoint
  • Git-based deploys — push code, get a live URL

What You'll Build

By the end of this tutorial, you will have:

  • A FastAPI application deployed to DanubeData Rapids
  • Automatic builds from your Git repository using Buildpacks
  • A PostgreSQL database connected via internal cluster DNS
  • Environment variables for configuration
  • A custom domain with automatic TLS
  • A live HTTPS endpoint at https://your-app.danubedata.run

Prerequisites

  • A DanubeData account (free tier available)
  • A GitHub, GitLab, or Bitbucket account
  • Python 3.10+ installed locally
  • Basic familiarity with FastAPI

Step 1: Create Your FastAPI Application

Let's build a simple task management API. Create a new directory and set up the project:

mkdir fastapi-serverless && cd fastapi-serverless
python -m venv venv
source venv/bin/activate

Install Dependencies

pip install fastapi uvicorn sqlalchemy psycopg2-binary pydantic-settings

Create requirements.txt

# requirements.txt
fastapi==0.115.0
uvicorn[standard]==0.32.0
sqlalchemy==2.0.36
psycopg2-binary==2.9.10
pydantic-settings==2.6.0

DanubeData Buildpacks automatically detect requirements.txt and install your dependencies — no Dockerfile needed.

Create the Application

# app.py
import os
from contextlib import asynccontextmanager
from fastapi import FastAPI, Depends, HTTPException
from pydantic import BaseModel
from pydantic_settings import BaseSettings
from sqlalchemy import create_engine, Column, Integer, String, Boolean
from sqlalchemy.orm import sessionmaker, Session, declarative_base


class Settings(BaseSettings):
    database_url: str = "sqlite:///./local.db"
    app_name: str = "Task API"


settings = Settings()

engine = create_engine(settings.database_url)
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()


class Task(Base):
    __tablename__ = "tasks"
    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(200), nullable=False)
    description = Column(String(1000), default="")
    completed = Column(Boolean, default=False)


@asynccontextmanager
async def lifespan(app: FastAPI):
    Base.metadata.create_all(bind=engine)
    yield


app = FastAPI(title=settings.app_name, lifespan=lifespan)


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


class TaskCreate(BaseModel):
    title: str
    description: str = ""


class TaskResponse(BaseModel):
    id: int
    title: str
    description: str
    completed: bool

    class Config:
        from_attributes = True


@app.get("/")
def root():
    return {"service": "Task API", "status": "running"}


@app.get("/tasks", response_model=list[TaskResponse])
def list_tasks(db: Session = Depends(get_db)):
    return db.query(Task).all()


@app.post("/tasks", response_model=TaskResponse, status_code=201)
def create_task(task: TaskCreate, db: Session = Depends(get_db)):
    db_task = Task(title=task.title, description=task.description)
    db.add(db_task)
    db.commit()
    db.refresh(db_task)
    return db_task


@app.patch("/tasks/{task_id}", response_model=TaskResponse)
def toggle_task(task_id: int, db: Session = Depends(get_db)):
    task = db.query(Task).filter(Task.id == task_id).first()
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")
    task.completed = not task.completed
    db.commit()
    db.refresh(task)
    return task


@app.delete("/tasks/{task_id}", status_code=204)
def delete_task(task_id: int, db: Session = Depends(get_db)):
    task = db.query(Task).filter(Task.id == task_id).first()
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")
    db.delete(task)
    db.commit()

Create the Procfile

The Procfile tells DanubeData how to start your application. The server must bind to the port the platform provides in the PORT environment variable:

# Procfile
web: uvicorn app:app --host 0.0.0.0 --port $PORT

That's it for the application code. Three files: app.py, requirements.txt, and Procfile.
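In production the platform sets PORT for you; in a local shell it is usually unset. One common pattern for a local start script is a default via shell parameter expansion (the 8000 fallback here is a local convention, not platform behavior):

```shell
unset PORT                    # simulate a fresh local shell where PORT is unset
PORT="${PORT:-8000}"          # the platform sets PORT in production; default locally
echo "binding uvicorn to port $PORT"
```

With this in place, `uvicorn app:app --host 0.0.0.0 --port "$PORT"` works both locally and on the platform.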

Step 2: Push to Git

Initialize a Git repository and push to GitHub (or GitLab/Bitbucket):

git init
git add app.py requirements.txt Procfile
git commit -m "Initial FastAPI app"
git remote add origin https://github.com/your-username/fastapi-serverless.git
git push -u origin main

Step 3: Deploy to DanubeData Rapids

  1. Log into your DanubeData Dashboard
  2. Navigate to Rapids (Serverless Containers)
  3. Click Create Container
  4. Select Git Repository as the deployment source
  5. Connect your GitHub account and select the fastapi-serverless repository
  6. Choose the main branch
  7. Select the Micro resource profile to start (0.25 vCPU, 256MB RAM — plenty for an API)
  8. Click Deploy

DanubeData uses Paketo Buildpacks to automatically detect your Python project. It finds requirements.txt, installs your dependencies, reads your Procfile, and builds a production container — all without a Dockerfile.

Within a minute or two, your API is live at:

https://fastapi-serverless-your-team.danubedata.run

Test it:

curl https://fastapi-serverless-your-team.danubedata.run/
# {"service": "Task API", "status": "running"}

curl https://fastapi-serverless-your-team.danubedata.run/docs
# Interactive Swagger UI

FastAPI's auto-generated documentation is immediately available at /docs — perfect for sharing with your team.
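To exercise the write endpoints from a script rather than curl, you can build the requests with the standard library alone. A minimal sketch (the base URL is the tutorial's placeholder; the `urlopen` call is left commented because it only works once your app is deployed):

```python
import json
import urllib.request

BASE = "https://fastapi-serverless-your-team.danubedata.run"  # your deployed URL

def build_create_task(title: str, description: str = "") -> urllib.request.Request:
    # POST /tasks with a JSON body matching the TaskCreate model
    body = json.dumps({"title": title, "description": description}).encode()
    return urllib.request.Request(
        f"{BASE}/tasks",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_create_task("Write deployment docs")
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req) would send it against the live endpoint
```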

Step 4: Add a PostgreSQL Database

SQLite works for testing, but production APIs need a real database. DanubeData PostgreSQL instances are accessible from Rapids containers via internal cluster DNS — no public internet roundtrip, low latency.

Create a PostgreSQL Instance

  1. Go to Databases in the dashboard
  2. Click Create Database
  3. Select PostgreSQL 16
  4. Choose the Small plan (1 vCPU, 2GB RAM — starts at €19.99/mo)
  5. Note your connection credentials: host, port, username, password, and database name

Your database is accessible from Rapids containers via the internal hostname, something like:

postgresql-xxxxx.tenant-123.svc.cluster.local
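When you assemble DATABASE_URL from these credentials, remember that special characters in the password must be percent-encoded or the URL will not parse. A sketch with hypothetical credentials:

```python
from urllib.parse import quote_plus

# Hypothetical values; substitute the credentials from your dashboard
user = "user"
password = "p@ss/word"   # contains URL-unsafe characters
host = "postgresql-xxxxx.tenant-123.svc.cluster.local"
db = "mydb"

# quote_plus percent-encodes '@', '/', etc. so the URL parses correctly
database_url = f"postgresql://{user}:{quote_plus(password)}@{host}:5432/{db}"
print(database_url)
```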

Set Environment Variables

In the Rapids dashboard, navigate to your container and click Environment Variables. Add:

Variable        Value
DATABASE_URL    postgresql://user:password@postgresql-xxxxx.tenant-123.svc.cluster.local:5432/mydb
APP_NAME        Task API Production

The container automatically redeploys with the new environment variables. Your FastAPI app picks them up via pydantic-settings — no code changes needed. The Settings class reads DATABASE_URL from the environment and connects to PostgreSQL instead of SQLite.
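The precedence is straightforward: the environment wins over the field default. A stdlib sketch of what pydantic-settings does for the database_url field (this mimics the library's behavior for illustration; it is not its actual implementation):

```python
import os

os.environ.pop("DATABASE_URL", None)  # simulate a fresh local environment

def resolve_database_url(default: str = "sqlite:///./local.db") -> str:
    # pydantic-settings matches field names to env vars case-insensitively,
    # so the database_url field is filled from DATABASE_URL when present
    return os.environ.get("DATABASE_URL", default)

print(resolve_database_url())  # no variable set: the SQLite default applies

os.environ["DATABASE_URL"] = "postgresql://user:pw@host:5432/mydb"
print(resolve_database_url())  # the platform-provided value wins
```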

Step 5: Set Up a Custom Domain

Your container runs at https://fastapi-serverless-your-team.danubedata.run by default. To use your own domain:

  1. In the Rapids container settings, click Custom Domains
  2. Enter your domain, for example api.yourdomain.com
  3. Add a CNAME record in your DNS provider:
    api.yourdomain.com  CNAME  fastapi-serverless-your-team.danubedata.run
    
  4. DanubeData automatically provisions and renews a TLS certificate for your domain

Within a few minutes, your API is accessible at https://api.yourdomain.com with a valid SSL certificate.

Step 6: Enable Auto-Deploy on Push

DanubeData can automatically rebuild and deploy your container whenever you push to your Git repository.

  1. In your container settings, enable Auto-deploy on push
  2. DanubeData adds a webhook to your repository
  3. Every git push to your configured branch triggers a new build

Your workflow becomes:

# Make changes locally
git add .
git commit -m "Add new endpoint"
git push origin main

# That's it — DanubeData builds and deploys automatically

No CI/CD pipeline to configure. No deployment scripts. Push code, get a new version live.

Step 7: Monitor Your Application

DanubeData provides built-in monitoring for Rapids containers:

  • Request metrics — requests per second, latency percentiles, error rates
  • Resource usage — CPU and memory consumption per instance
  • Scaling events — when instances scale up or down
  • Build logs — full output from Buildpack builds
  • Application logs — stdout/stderr from your running container

You can also add a health check endpoint to your FastAPI app for the platform to monitor:

from sqlalchemy import text  # add this to app.py's imports for the raw SQL ping

@app.get("/health")
def health_check(db: Session = Depends(get_db)):
    try:
        db.execute(text("SELECT 1"))
        return {"status": "healthy", "database": "connected"}
    except Exception:
        raise HTTPException(status_code=503, detail="Database unavailable")

Complete Project Structure

Here is the final project — just three files:

fastapi-serverless/
├── app.py              # FastAPI application
├── requirements.txt    # Python dependencies
└── Procfile            # Start command

Compare this to a traditional deployment that requires a Dockerfile, docker-compose.yml, Caddyfile, gunicorn.conf.py, systemd service files, and CI/CD workflows. Serverless containers remove all of that operational overhead.

Pricing: Pay for What You Use

DanubeData Rapids uses a pay-per-use model with a generous free tier:

Resource   Price                       Free Tier
vCPU       €0.000012 / vCPU-second     250K vCPU-seconds/month
Memory     €0.000002 / GiB-second      500K GiB-seconds/month
Requests   €0.12 / million requests    2M requests/month
Egress     Standard rates              5GB/month

For a typical low-traffic API (a few thousand requests per day), you can stay entirely within the free tier. When traffic grows, costs scale linearly — no surprise bills from idle servers.
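To make "stay within the free tier" concrete, here is a rough cost sketch using the rates above for a hypothetical workload: a Micro-sized instance (0.25 vCPU, 256 MB = 0.25 GiB) active two hours a day, serving 100K requests a month. The workload numbers are assumptions for illustration:

```python
# Pay-per-use rates from the pricing table above
VCPU_RATE = 0.000012   # € per vCPU-second
MEM_RATE = 0.000002    # € per GiB-second
REQ_RATE = 0.12        # € per million requests

# Hypothetical workload: active 2 h/day for 30 days, 100K requests/month
active_seconds = 2 * 3600 * 30          # 216,000 s of active instance time
vcpu_seconds = 0.25 * active_seconds    # 54,000 vCPU-s (under the 250K free tier)
gib_seconds = 0.25 * active_seconds     # 54,000 GiB-s (under the 500K free tier)
requests = 100_000                      # under the 2M free tier

cost = (vcpu_seconds * VCPU_RATE
        + gib_seconds * MEM_RATE
        + requests / 1_000_000 * REQ_RATE)
print(f"€{cost:.2f}/month before the free tier")  # and €0 after it
```

Even before the free tier is applied, this workload costs well under a euro a month; since each dimension sits inside its free allowance, the actual bill is zero.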

If you prefer predictable pricing, DanubeData also offers fixed resource profiles:

Profile   vCPU   Memory   Price
Micro     0.25   256 MB   €5/month
Small     0.5    512 MB   €10/month
Medium    1      1 GB     €20/month
Large     2      2 GB     €40/month

Serverless vs VPS: When to Use What

Criteria                     Rapids (Serverless)            VPS
Traffic pattern              Variable / bursty              Steady / predictable
Deployment speed             Minutes (git push)             30-60 minutes (setup)
Infrastructure management    None                           Full control
Scale to zero                Yes                            No (always running)
Cost at zero traffic         €0                             €4.49+/month
Background workers           Not ideal                      Full support (Celery, etc.)
Best for                     APIs, webhooks, microservices  Full-stack apps, workers

FastAPI's lightweight nature and fast cold starts make it an excellent fit for serverless containers. If your API handles bursty traffic or you want zero operational overhead, Rapids is the way to go.

Tips for Production FastAPI on Serverless

1. Keep Cold Starts Fast

Serverless containers start fresh instances on demand. Keep your startup time low:

# Good: keep startup work minimal (one lightweight schema check in the lifespan)
@asynccontextmanager
async def lifespan(app: FastAPI):
    Base.metadata.create_all(bind=engine)
    yield

# Avoid: heavy imports or pre-loading at module level.
# Don't import ML models or large datasets at startup.

2. Use Pydantic Settings for Configuration

Never hardcode connection strings. Use environment variables via pydantic-settings:

class Settings(BaseSettings):
    database_url: str = "sqlite:///./local.db"
    redis_url: str = ""
    api_key: str = ""
    debug: bool = False

This lets you use SQLite locally and PostgreSQL in production without changing code.
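One detail worth knowing: environment variables are always strings, and pydantic-settings coerces them to each field's declared type, so `DEBUG=true` fills `debug: bool` correctly. A stdlib sketch of that coercion (mimicking, not reproducing, the library's parsing):

```python
import os

def env_bool(name: str, default: bool = False) -> bool:
    # pydantic-settings accepts the usual truthy spellings for bool fields
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}

os.environ["DEBUG"] = "true"
print(env_bool("DEBUG"))                 # the string "true" becomes bool True
print(env_bool("SOME_UNSET_VAR_XYZ"))    # missing variable: the default applies
```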

3. Structure for Multiple Endpoints

As your API grows, use FastAPI's router system to keep things organized:

# routers/tasks.py
from fastapi import APIRouter

router = APIRouter(prefix="/tasks", tags=["tasks"])

@router.get("/")
def list_tasks():
    ...

# app.py
from routers import tasks, users
app.include_router(tasks.router)
app.include_router(users.router)

4. Handle Database Connections Properly

In serverless environments, connections can be recycled between requests. Use connection pooling:

engine = create_engine(
    settings.database_url,
    pool_size=5,
    max_overflow=10,
    pool_pre_ping=True,    # Verify connections before use
    pool_recycle=300,       # Recycle connections every 5 minutes
)

Get Started

You have seen how to go from zero to a deployed FastAPI service in under 5 minutes:

  1. Write your FastAPI app
  2. Add requirements.txt and Procfile
  3. Push to Git
  4. Deploy on DanubeData Rapids
  5. Connect your database and custom domain

No Dockerfile. No reverse proxy. No server management. Just your Python code and a git push.

Create your free DanubeData account and deploy your first FastAPI app today. The free tier gives you 2 million requests per month — enough to build and launch without spending a cent.

Already running FastAPI on a VPS? Check out our complete FastAPI production deployment guide for optimizing your existing setup.

Questions about deploying FastAPI? Reach out to our team — we are Python developers too.

