Tuesday, November 22, 2022

FastAPI on Azure Functions with Azure API Management

My very first job was on the Google Maps API, which was the world's most popular API at the time. We were still figuring out what made a good API back then, so we experimented with things like API keys, versioning, and auto-generated docs. We even created Google's first API quota system for rate-limiting keys, which was later used by all of Google's APIs. A lot of work went into creating a fully featured API!

That's why I love services like Azure API Management, since it makes it possible for any developer to create a fully featured API. (Other clouds have similar features, like AWS API Gateway and Google API Gateway). You can choose whether your API requires keys, then add API policies like rate-limiting, IP blocking, caching, and many more. At the non-free tier, you can also create a developer portal to receive API sign-ups.

That's also why I love the FastAPI framework, as it takes care of auto-generated interactive documentation and parameter validation.

I finally figured out how to combine the two, with Azure Functions as the backend, and that's what I want to dive into today.

This diagram shows the overall architecture:

Diagram of Azure API Management + Azure Functions + FastAPI architecture

Public vs Protected URLs

One of my goals was to have the documentation be publicly viewable (with no key) but the FastAPI API calls themselves require a subscription key. That split was the trickiest part of this whole architecture, and it started at the API Management level.

The API Management service consists of two "APIs" (as it calls them):

  • "simple-fastapi-api": This API is configured with subscriptionRequired: true and path: 'api'. All calls that come into the service prefixed with "api/" will get handled by this API.
  • "public-docs" This API isn't really an API, it's the gateway to the documentation and OpenAPI schema. It's configured with subscriptionRequired: false and path: 'public'.

The API Gateway uses the path prefixes to figure out how to handle the calls, but then it calls the API backend with whatever comes after the path.

Each API Gateway URL maps to a FastAPI URL:

  • "/api/generate_name" → "/generate_name"
  • "/public/openapi.json" → "/openapi.json"
  • "/public/docs" → "/docs"

But that leads to a dreaded problem that you'll find all over the FastAPI issue tracker: when loading the docs page, FastAPI tries to load "/openapi.json" when it really needs to load "/public/openapi.json".

Fortunately, we can improve that by specifying root_path in the FastAPI constructor:

app = fastapi.FastAPI(root_path="/public")

Now, the docs will load successfully, but they'll claim the API calls are located at "/public/..." when they should be at "/api/...". That's fixed by another change to the FastAPI constructor, the addition of servers and root_path_in_servers:

app = fastapi.FastAPI(
    root_path="/public",
    servers=[{"url": "/api", "description": "API"}],
    root_path_in_servers=False,
)

The servers option changes the OpenAPI schema so that all API calls are prefixed with "/api", whereas root_path_in_servers removes "/public" as a possible prefix for API calls. If that argument wasn't there, the FastAPI docs would present a dropdown with both "/api" and "/public" as options.
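
A quick way to see the effect is to inspect the generated schema directly. This is just a standalone sanity check (not code from the sample repo), using the same arguments as above:

import fastapi

app = fastapi.FastAPI(
    root_path="/public",
    servers=[{"url": "/api", "description": "API"}],
    root_path_in_servers=False,
)

# With these arguments, the schema lists only "/api" as a server:
print(app.openapi()["servers"])  # [{'url': '/api', 'description': 'API'}]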

Since I only need this configuration when the API is running in production behind the API Management service, I set up my FastAPI app conditionally based on the current environment:

if os.getenv("FUNCTIONS_WORKER_RUNTIME"):
        app = fastapi.FastAPI(
            servers=[{"url": "/api", "description": "API"}],
            root_path="/public",
            root_path_in_servers=False,
        )
    else:
        app = fastapi.FastAPI()

It would probably also be possible to run some sort of local proxy that would mimic the API Management service, though I don't believe Azure offers an official local APIM emulator.
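
For example, here's a minimal sketch of that idea: an outer FastAPI app that mounts the API app under both prefixes to roughly imitate the gateway's routing. This is purely illustrative (the route below is a stand-in for the real app's routes):

import fastapi

api = fastapi.FastAPI()

@api.get("/generate_name")
async def generate_name():
    return {"name": "example"}

gateway = fastapi.FastAPI()
gateway.mount("/api", api)     # mimics the "simple-fastapi-api" prefix
gateway.mount("/public", api)  # mimics the "public-docs" prefix

# Run locally with: uvicorn local_gateway:gateway --reload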



Securing the function

My next goal was to be able to do all of this with a secured Azure function, i.e. a function with authLevel: 'function'. A secured function requires an "x-functions-key" header on every request, with a value matching one of the keys in the function's configuration.
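
As a quick illustration, here's how you might call the secured function directly (bypassing API Management) to verify the key requirement; the environment variable names are just placeholders:

import os
import requests

response = requests.get(
    os.environ["FUNCTION_URL"],  # full URL of one of the function's routes
    headers={"x-functions-key": os.environ["FUNCTION_KEY"]},
)
print(response.status_code)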

Fortunately, API Management makes it easy to always send a particular header along to an API's backend. Both APIs share the same function backend, which I configured in Bicep like this:

resource apimBackend 'Microsoft.ApiManagement/service/backends@2021-12-01-preview' = {
  parent: apimService
  name: functionApp.name
  properties: {
    description: functionApp.name
    url: 'https://${functionApp.properties.hostNames[0]}'
    protocol: 'http'
    resourceId: '${environment().resourceManager}${functionApp.id}'
    credentials: {
      header: {
        'x-functions-key': [
          '{{function-app-key}}'
        ]
      }
    }
  }
}

But where does {{function-app-key}} come from? It refers to a "Named Value", a feature of API Management, which I configured like so:


resource apimNamedValuesKey 'Microsoft.ApiManagement/service/namedValues@2021-12-01-preview' = {
  parent: apimService
  name: 'function-app-key'
  properties: {
    displayName: 'function-app-key'
    value: listKeys('${functionApp.id}/host/default', '2019-08-01').functionKeys.default
    tags: ['key', 'function', 'auto']
    secret: true
  }
}

I could have also set it directly in the backend, but it's nice to make it a named value so that we can denote it as a value that should be kept secret.

The final step is to connect the APIs to the backend. Every API has a policy document written in XML (I know, old school!). One of the available policies is set-backend-service, whose backend-id can point at the backend defined above. I added that policy XML to both APIs:

<set-backend-service id="apim-generated-policy" backend-id="${functionApp.name}" />

And that does it! See all the Bicep for the API Management in apimanagement.bicep.



All together now

The two trickiest parts were the API/docs distinction and passing on the function key, but there were a few other interesting aspects as well, like testing all the code (100% coverage!) and setting up the project to work with the Azure Developer CLI.

Here's a breakdown of the API code:

  • __init__.py: Called by Azure Functions; defines a main function that uses AsgiMiddleware to hand requests off to the FastAPI app (see the sketch after this list).
  • fastapi_app.py: Defines a function that returns a FastAPI app. I used a function for better testability.
  • fastapi_routes.py: Defines a function to handle an API call and uses fastapi.APIRouter to attach it to the "generate_name" route.
  • test_azurefunction.py: Uses Pytest to mock the Azure Functions context, mock environment variables, and check all the routes respond as expected.
  • test_fastapi.py: Uses Pytest to check the FastAPI API calls.
  • functions.json: Configuration for Azure Functions, declares that this function responds to an HTTP trigger and has a wildcard route.
  • local.settings.json: Used by the local Azure Functions emulator to determine the function runtime.
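
As a rough sketch of how the first three files fit together (function and route names here are illustrative; see the repo for the real code):

import azure.functions as func
import fastapi

# fastapi_routes.py: a router that handles the API call
router = fastapi.APIRouter()

@router.get("/generate_name")
async def generate_name():
    return {"name": "example"}

# fastapi_app.py: a factory that builds the app and attaches the router
def create_app():
    app = fastapi.FastAPI()
    app.include_router(router)
    return app

# __init__.py: the Azure Functions entry point
app = create_app()

def main(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    return func.AsgiMiddleware(app).handle(req, context)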

There are also a lot of interesting components in the Bicep files.

I hope that helps anyone else who's trying to deploy FastAPI to this sort of architecture. Let me know if you come up with a different approach or have any questions!

Friday, November 18, 2022

Running PostgreSQL in a Dev Container with Flask/Django

Not familiar with Dev Containers? Read my first post about Dev Containers.

I recently added Dev Container support to a Flask+PostgreSQL sample and a Django+PostgreSQL sample. I really wanted to run PostgreSQL entirely inside the Dev Container, since 1) I've had a hard time setting up PostgreSQL on my laptop in the past, and 2) I'd like to access the database when running the Dev Container inside GitHub Codespaces on the web. Fortunately, thanks to some Docker magic, I figured it out! 🐳

The first step is to create a docker-compose.yml file inside the .devcontainer folder:

version: "3"

services:
  app:
    build:
      context: ..
      dockerfile: .devcontainer/Dockerfile
      args:
        VARIANT: 3.9
        USER_UID: 1000
        USER_GID: 1000

    volumes:
      - ..:/workspace:cached

    # Overrides default so things don't shut down after the process ends
    command: sleep infinity

    # Runs app on the same network as the database container,
    # allows "forwardPorts" in devcontainer.json function
    network_mode: service:db

  db:
    image: postgres:latest
    restart: unless-stopped
    volumes:
      - postgres-data:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: app_user
      POSTGRES_DB: app
      POSTGRES_PASSWORD: app_password

volumes:
  postgres-data:

That file declares an app service based off a Dockerfile (which we'll see next) as well as a db service based off the official postgres image that stores data in a mounted volume. The db service declares environment variables for the database name, username, and password, which must be used by any app that connects to that database.
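
Once both containers are up, a quick way to confirm the app container can reach the database is a snippet like this (assuming the psycopg2 driver is installed):

import psycopg2

# "localhost" works because the app container shares the db container's network
conn = psycopg2.connect(
    host="localhost",
    dbname="app",
    user="app_user",
    password="app_password",
)
print(conn.server_version)
conn.close()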

The next step is the Dockerfile, also inside the .devcontainer folder:

FROM mcr.microsoft.com/vscode/devcontainers/python:0-3

RUN curl -fsSL https://aka.ms/install-azd.sh | bash

ENV PYTHONUNBUFFERED 1

RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
    && apt-get -y install --no-install-recommends postgresql-client

This file is tailored to the needs of my sample, so it builds on a pre-built Python-optimized Dev Container image from Microsoft and installs the azd tool. Your own file could start from any Docker image that includes Python. However, you will want to keep the final command, which installs a PostgreSQL client.

The final file inside the .devcontainer folder is devcontainer.json, which is the entry point for GitHub Codespaces and the VS Code Dev Containers extension, and describes all the customizations. Here's a simplified version of mine:

{
    "name": "Python 3 & PostgreSQL",
    "dockerComposeFile": "docker-compose.yml",
    "service": "app",
    "workspaceFolder": "/workspace",
    "forwardPorts": [8000, 5432],
    "extensions": [
        "ms-python.python",
        "mtxr.sqltools",
        "mtxr.sqltools-driver-pg"
    ],
    "settings": {
        "sqltools.connections": [{
            "name": "Container database",
            "driver": "PostgreSQL",
            "previewLimit": 50,
            "server": "localhost",
            "port": 5432,
            "database": "app",
            "username": "app_user",
            "password": "app_password"
        }],
        "python.pythonPath": "/usr/local/bin/python"
    }
}

Let's break that down:

  • dockerComposeFile points at the docker-compose.yml from earlier.
  • service matches the name of the non-postgres service from that file.
  • workspaceFolder matches the location of the volume from that file.
  • forwardPorts instructs the Dev Container to expose port 8000 (for the Django app) and port 5432 (for the PostgreSQL DB).
  • extensions includes a really neat extension, SQLTools, which provides a graphical UI for the database tables and allows you to run queries against the tables.
  • Inside settings, sqltools.connections specifies the same database name, username, and password that were declared in docker-compose.yml.

The final step is to make sure that the app knows how to access the PostgreSQL database. In my sample apps, the DB details are set via environment variables, so I just use a .env file that looks like this:

DBNAME=app
DBHOST=localhost
DBUSER=app_user
DBPASS=app_password
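
In the Django sample, for instance, the settings would read those variables along these lines (a sketch, not the sample's exact code):

import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ["DBNAME"],
        "HOST": os.environ["DBHOST"],
        "USER": os.environ["DBUSER"],
        "PASSWORD": os.environ["DBPASS"],
    }
}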

...And that's it! If you'd like, here's a video where I run a Flask+PostgreSQL Dev Container inside GitHub Codespaces and also highlight the changes described above:

Check out the sample repos for full code, and let me know if you run into issues enabling a PostgreSQL DB in your own Dev Containers.

Monday, November 7, 2022

Deploying a Django app to Azure with the Azure Developer CLI

Azure recently announced a new developer tool, the Azure Developer CLI, and after using it for the last month, I'm in love. ❤️

AZD makes it really easy to deploy applications that require multiple Azure services (which is most non-trivial applications!). Instead of running multiple commands or visiting multiple places in the Azure portal, I can run a single command and have it take care of everything. The azd commands rely on configuration files written in Bicep that describe all the resources, plus an azure.yaml file that describes what code needs to be deployed to which server. The azd tool also helps with CI/CD pipelines and app monitoring, so its helpfulness goes beyond just the initial deployment.

The team is working on providing lots of templates that are already "azd-ified" so that you can get started quickly with whatever infrastructure stack you prefer. Since I'm on the Python team, I've been working on templates for the top Python frameworks: Django, Flask, and FastAPI.

My first finished template is for a Django app with a PostgreSQL database that gets deployed to Azure App Service with a PostgreSQL flexible server. The App Service app communicates with the server inside a private VNet, so the database can't easily be accessed from the outside.

Check out the sample here:
github.com/pamelafox/msdocs-django-postgresql-sample-app-azd/

The readme has instructions for local development, deployment, monitoring, and pipeline setup. If you run into any issues following the steps, let me know in the project discussions tab. The `azd` tool is still in preview, so this is a great time to get feedback about what is and isn't working.

I also put together this video showing me running azd up on the repo and setting up Django admin: