Status
State: Draft
Discussion thread: https://lists.apache.org/thread.html/7af1a4faa4baa119a124cec0920c2d6e4b7b6c91d7fa5b7ce0d1c1d6@%3Cdev.airflow.apache.org%3E
JIRA: AIRFLOW-3718
Motivation
The current official Airflow image is rebuilt from scratch every time a new commit is pushed to the repo. It is a "mono-layered" image and uses neither Docker's multi-layer caching nor multi-stage builds.
With a mono-layered image, builds after even small changes take as long as a full build, rather than utilising caching to rebuild only what's needed.
With a multi-layered approach and caching enabled in Docker Hub, we can optimise the image so that only the layers that changed need to be downloaded. This lets users of the images download only incremental changes, and opens up a number of options for how such an incremental build/download process can be utilised:
- Multi-layered images can be used as the base for AIP-7 Simplified development workflow - where locally downloaded images are used during development and are quickly updated incrementally with newly added dependencies.
- Multi-layered images being part of the "airflow" project can be used to run Travis CI integration tests (simplifying the idea described in Optimizing Docker Image Workflow). With incremental builds, the DockerHub registry can be used as a source of base images (pulled before the build) to build the final image used for test execution locally and incrementally.
- While initially the images are not meant to be used in production, multi-staging, variable arguments and multiple layers can be used to produce a production-ready Airflow image that pre-bakes DAGs into the image - thus bringing Airflow closer to being Kubernetes-native. This has been discussed as a potential future improvement in AIP-12 Persist DAG into DB.
- Ideally both Airflow and CI images should be maintained in a single place - a "source of truth" - to ease maintenance and development. Currently they are maintained in separate repositories and have potentially different dependencies and build processes. This also makes it difficult to add your own dependencies during development, as there is no regular/development-friendly process to update the CI image with new dependencies.
Considerations
In the PR https://github.com/apache/airflow/pull/4543 the current mono-layered Dockerfile has been reimplemented as a multi-layered one. The PR uses a "hooks/build" hook that is executed by the DockerHub build process to control caching and building. Thanks to that we can build different variants of the images (the main - slim - Airflow image, a CI image with more dependencies, and a wheel-cache image for efficient caching of pip dependencies).
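The general shape of such a "hooks/build" script can be sketched as below. This is a simplified illustration, not the script from the PR: the fallback image name is made up, and only the IMAGE_NAME variable (which DockerHub exports to build hooks) is assumed.

```bash
#!/usr/bin/env bash
# hooks/build -- executed by DockerHub instead of a plain "docker build".
# Pull the previously pushed image so its layers can serve as a cache,
# then build with --cache-from pointing at it.
set -euo pipefail

# DockerHub exports IMAGE_NAME (the repo:tag being built);
# the fallback here is a hypothetical tag for running the script locally.
IMAGE_NAME="${IMAGE_NAME:-airflow:latest-3.6-v2.0.0.dev0}"

# Ignore pull failures -- on the very first build there is nothing to pull yet.
docker pull "${IMAGE_NAME}" || true

docker build \
    --cache-from "${IMAGE_NAME}" \
    --tag "${IMAGE_NAME}" \
    .
```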
Basic assumptions
- There are two images:
- "Airflow" image - slim image with only necessary Airflow dependencies
- "CI" image - fat image with additional dependencies necessary for CI tests
- there are separate images for each python version (currently 2.7, 3.5, 3.6)
- each image uses python-x.y-slim as a base
- all stages are defined in single multi-stage Dockerfile
- it's possible to build the main airflow image by issuing a plain "docker build ." command. This is not optimised for DockerHub cache reuse, but it builds locally.
- the hooks/build script can build the image utilising the DockerHub cache - pulling the images from the registry and using them as cache
- binary/apt dependencies are built as separate stages - so that whole cached images with main/CI dependencies can be used as a cache source
- the builds are versioned - airflow 2.0.0.dev0 images are different than airflow 2.0.1.dev0 images
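The assumptions above can be sketched as a single multi-stage Dockerfile along these lines. The stage names match the stage table further down; the specific apt packages and paths are placeholders, not the actual dependency lists from the PR.

```Dockerfile
# A build arg lets one Dockerfile produce images for every python version.
ARG PYTHON_VERSION=3.6

# Stage: vital apt dependencies for the slim "Airflow" image.
FROM python:${PYTHON_VERSION}-slim AS airflow-apt-deps
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential libpq-dev \
    && rm -rf /var/lib/apt/lists/*

# Stage: extra, heavier dependencies needed only by the "CI" image.
FROM airflow-apt-deps AS airflow-ci-apt-deps
RUN apt-get update && apt-get install -y --no-install-recommends \
        nodejs \
    && rm -rf /var/lib/apt/lists/*

# Stage: main sources build, shared by Airflow and CI variants.
# Sources are added last so that source-only changes invalidate
# as few layers as possible.
FROM airflow-ci-apt-deps AS main
COPY . /opt/airflow
RUN pip install -e /opt/airflow
```

A plain `docker build .` builds the final stage (`main`) locally, while `docker build --target airflow-apt-deps .` would build only the slim dependency stage.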
Changes that trigger rebuilds
The changes below are described starting from the most frequent ones - i.e. starting from the end of the Dockerfile and going up towards the beginning.
- apt and pip dependencies: they are "upgraded" as the last part of the build (after sources are added) - thus an upgrade to the latest available versions is triggered every time sources change (utilising the cache from previous installations).
- source changes do not invalidate previously installed apt/pip/npm packages. They only trigger the pip/apt upgrades explained above.
- changes to the www sources trigger pre-compiling the web page for production (npm run prod) and everything above.
- changing package.json or package-lock.json triggers reinstallation of all npm packages (npm ci) and everything above.
- changing any of the setup.py-related files triggers reinstallation of all pip packages and everything above. In the case of a CI build, previously compiled wheel packages from the wheel image are used to install the dependencies (saving the time needed to download and compile packages).
- changing the wheel cache triggers everything above
- for a CI build, changing CI apt dependencies triggers reinstallation of those dependencies and everything above
- changing Airflow apt dependencies triggers reinstallation of those dependencies and everything above
- the whole build process can be triggered by changing one line in the Dockerfile (FORCE_REINSTALL_ALL_DEPENDENCIES)
- a new python stable image triggers a rebuild of the whole image
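The "one line change" trigger can be implemented with a build ARG whose value is written into an early layer - bumping the value invalidates that layer and every layer after it. The variable name comes from this proposal; the rest of the snippet is an illustrative sketch.

```Dockerfile
ARG FORCE_REINSTALL_ALL_DEPENDENCIES=1
# Using the ARG value in a RUN makes the cache key depend on it:
# changing "1" to "2" (either here or via
# "docker build --build-arg FORCE_REINSTALL_ALL_DEPENDENCIES=2 .")
# busts the cache for this RUN and for everything below it.
RUN echo "Reinstall marker: ${FORCE_REINSTALL_ALL_DEPENDENCIES}"
```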
Stages of the image
These are the stages of the image that we have defined in the Dockerfile (X.Y and VERSION are placeholders in the table below):
- X.Y - python version (currently 2.7, 3.5 or 3.6)
- VERSION - airflow version (e.g. v2.0.0.dev0)
No. | Stage | Description | Labels in DockerHub | Airflow build deps | CI build deps |
---|---|---|---|---|---|
1 | python | Base python image | python-X.Y-slim | - | - |
2 | airflow-apt-deps | Vital Airflow apt dependencies | latest-X.Y-apt-deps-VERSION | 1 | 1 |
3 | airflow-ci-apt-deps | Additional CI image dependencies | latest-X.Y-ci-apt-deps-VERSION | [Not used] | 2 |
4 | wheel-cache-previous | Previously built wheel cache for faster pip installs | latest-X.Y-wheelcache-VERSION | [Not used] | 3 |
5 | wheel-cache | Currently built wheel cache (for future builds) | latest-X.Y-wheelcache-VERSION | [Not used] | 3 |
6 | main | Main Airflow sources build. Used for both Airflow and CI builds | Airflow builds: … CI builds: … | 2 | 2, plus /cache folder with wheels from image 4 |
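The wheel-cache mechanism (stages 4 and 5 above) can be sketched like this: wheels are built once into a /cache folder, and pip's --find-links then installs from that folder so most packages need no downloading or recompiling. Stage names follow the table; paths and the COPY layout are illustrative assumptions.

```Dockerfile
# Stage 5: build wheels for all dependencies into /cache for future builds.
FROM airflow-ci-apt-deps AS wheel-cache
COPY setup.py /opt/airflow/
RUN pip wheel --wheel-dir=/cache /opt/airflow

# Main stage: install dependencies, preferring pre-built wheels from
# the cache image over downloading and compiling from PyPI.
FROM airflow-ci-apt-deps AS main
COPY --from=wheel-cache /cache /cache
COPY . /opt/airflow
RUN pip install --find-links=/cache /opt/airflow
```

In the CI build, the previously pushed wheel-cache image (stage 4) is pulled first so that its /cache layer serves as the cache source for stage 5.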
Dependencies between stages
Effectively, the images we create have the following dependencies:
TODO:
- Layers in main build
- Different build types
Conclusions
- The multi-layered image is only slightly bigger than the mono-layered one (976 MB mono-layered vs. 1007 MB multi-layered - around 3% more in total). The full download time is also slightly longer, by 1 s (33.7 s vs. 32.7 s), which is 3% longer.
- Downloading the image regularly is far cheaper with the multi-layered image. For a simulated user downloading the airflow image twice a week, it is 4950 MB (multi-layered) vs. 13546 MB (mono-layered) downloaded over the course of 8 weeks - around 63% less data to download.
- Multi-layered image seems to be much better for users regularly downloading the image.
- TODO: