The .serve method allows you to easily elevate a flow to a deployment, listening for scheduled work to execute as a local process.
However, this “local” process does not need to be on your local machine. In this example, we show how to run a flow in a Docker container on your local machine, but you could run the same container on any machine that has Docker installed.
Overview
In this example, you will set up:
- a simple flow that retrieves the number of stars for some GitHub repositories
- a Dockerfile that packages up your flow code and dependencies into a container image
Writing the flow
Say we have a flow that retrieves the number of stars for a GitHub repository, defined in serve_retrieve_github_stars.py:
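The original file is not reproduced here; the sketch below shows one way such a flow could look, assuming httpx for the HTTP call and an illustrative repository list and deployment name.

```python
# serve_retrieve_github_stars.py -- a minimal sketch; the repositories and
# deployment name are illustrative, not taken from the original example.
import httpx
from prefect import flow, task


@task
def get_stars(repo: str) -> int:
    """Fetch the star count for an "owner/name" repository from the GitHub API."""
    response = httpx.get(f"https://api.github.com/repos/{repo}")
    response.raise_for_status()
    return response.json()["stargazers_count"]


@flow(log_prints=True)
def retrieve_github_stars(repos: list[str]):
    for repo in repos:
        print(f"{repo} has {get_stars(repo)} stars")


if __name__ == "__main__":
    # Serve the flow as a long-lived process that listens for scheduled work.
    retrieve_github_stars.serve(
        name="retrieve-github-stars",
        parameters={"repos": ["PrefectHQ/prefect", "pydantic/pydantic"]},
    )
```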
Writing the Dockerfile
Assuming we have our Python requirements defined in a requirements.txt file:
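As an assumption, the only dependencies this example needs are Prefect itself and httpx; a minimal requirements.txt might look like this, with versions pinned as appropriate for your project.

```text
# Minimal, unpinned requirements for this example; pin versions in real projects.
prefect
httpx
```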
We can then write a Dockerfile that copies the flow code into an image and installs these requirements:
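The exact Dockerfile is not reproduced here; the sketch below is one reasonable layout, assuming Astral’s published uv base image (consistent with the timing note below) and the file names used above.

```dockerfile
# A sketch of one possible Dockerfile; the base image tag and file names are
# assumptions, not the exact file from this guide.
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim

WORKDIR /app

# Install dependencies first so this layer is cached across code-only changes.
COPY requirements.txt .
RUN uv pip install --system -r requirements.txt

# Copy in the flow code.
COPY serve_retrieve_github_stars.py .

# Start serving the flow when the container starts.
CMD ["python", "serve_retrieve_github_stars.py"]
```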
Using pip, the image is built in about 20 seconds; using uv, it is built in about 3 seconds. You can learn more about uv in the Astral documentation.
Build and run the container
Now that we have a flow and a Dockerfile, we can build the image from the Dockerfile and run a container from this image.
Build (and push) the image
We can build the image with the docker build command and the -t flag to specify a name for the image.
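For example, using serve-github-stars as an arbitrary image name:

```bash
# Build the image from the Dockerfile in the current directory.
docker build -t serve-github-stars .

# Optionally tag and push the image to a registry so other machines can pull it:
# docker tag serve-github-stars <registry>/<namespace>/serve-github-stars:latest
# docker push <registry>/<namespace>/serve-github-stars:latest
```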
Run the container
You’ll likely want to inject some environment variables into your container, so let’s define a .env file:
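What goes in the .env file depends on where your Prefect API lives; a common pattern (assumed here) is to point the container at your Prefect API using the PREFECT_API_URL and PREFECT_API_KEY settings, with placeholder values shown below.

```bash
# .env -- placeholder values; substitute your own API URL and key.
PREFECT_API_URL=https://api.prefect.cloud/api/accounts/<account-id>/workspaces/<workspace-id>
PREFECT_API_KEY=<your-api-key>
```

We can then pass this file to docker run with --env-file and start the container in detached mode:

```bash
# Run the container in the background, injecting variables from .env.
docker run -d --env-file .env serve-github-stars
```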
Verify the container is running
Check that the container is running with docker ps, and take note of its CONTAINER ID, as we’ll need it to view logs.
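For example:

```bash
# List running containers; note the CONTAINER ID column.
docker ps
```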
View logs
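Using the CONTAINER ID noted above (shown as a placeholder here), follow the container’s logs to confirm the flow is being served:

```bash
# Stream logs from the container; replace <container-id> with the actual ID.
docker logs -f <container-id>
```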
Stop the container
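When you’re finished, stop the container by its CONTAINER ID (again a placeholder below):

```bash
# Stop the running container gracefully.
docker stop <container-id>
```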
Next steps
Congratulations! You have packaged and served a flow in a long-lived Docker container. You may now easily deploy this container to other infrastructure, such as:
- Modal
- Google Cloud Run
- AWS Fargate / ECS
- Managed Kubernetes (for example, GKE, EKS, or AKS)