Status
State: Draft
Discussion thread: https://lists.apache.org/thread.html/7af1a4faa4baa119a124cec0920c2d6e4b7b6c91d7fa5b7ce0d1c1d6@%3Cdev.airflow.apache.org%3E
JIRA: AIRFLOW-3718
Motivation
The current official Airflow image is rebuilt from scratch every time a new commit is pushed to the repo. It is a "mono-layered" image and uses neither Docker's multi-layer caching nor multi-stage builds.
A mono-layered image means that builds after even small changes take as long as a full build, rather than utilising the cache and rebuilding only what is needed.
With a multi-layered approach and caching enabled in Docker Hub we can optimise this so that only the layers that changed are downloaded. Users of the images then download only incremental changes, which opens up a number of options for how such an incremental build/download process can be utilised:
- Multi-layered images can be used as the base for AIP-7 Simplified development workflow - where locally downloaded images are used during development and are quickly updated incrementally with newly added dependencies.
- Multi-layered images that are part of the "airflow" project can be used to run Travis CI integration tests (simplifying the idea described in Optimizing Docker Image Workflow). Incremental builds allow the DockerHub registry to be used as a source of base images (pulled before the build), from which the final image used for test execution is built locally in an incremental way.
- While initially the images are not meant to be used in production, multi-staging, build arguments and multiple layers can later be used to produce a production-ready Airflow image that can pre-bake DAGs into the image - thus bringing Airflow closer to being Kubernetes-native. This has been discussed as a potential future improvement in AIP-12 Persist DAG into DB.
- Ideally both the Airflow and CI images should be maintained in a single place - a "source of truth" - to ease maintenance and development. Currently they are maintained in separate repositories and have potentially different dependencies and build processes. This also makes it difficult to add your own dependencies during development, as there is no regular, development-friendly process to update the CI image with new dependencies.
Considerations
In the PR https://github.com/apache/airflow/pull/4543 the current mono-layered Docker image has been reimplemented as a multi-layered one. The PR uses the "hooks/build" script that is invoked by the DockerHub build process to control caching and the build process. Thanks to that we can build different variants of the images (the main - slim - Airflow image, a CI image with more dependencies, and a wheel-cache image for efficient caching of PIP dependencies).
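To illustrate the approach, a hooks/build-style script can pull previously pushed stage images and pass them via `--cache-from` so that DockerHub layers are reused. This is a hedged sketch, not the actual script from the PR; the repository name, the concrete stage/tag names for the final image, and the `DOCKER` dry-run switch are assumptions:

```shell
# Hypothetical sketch of a hooks/build-style caching script.
# Tag names follow the "latest-X.Y-<stage>-VERSION" convention described
# later in this document; the repository name is an assumption.
PYTHON_VERSION="${PYTHON_VERSION:-3.6}"
AIRFLOW_VERSION="${AIRFLOW_VERSION:-2.0.0.dev0}"
REPO="${DOCKER_REPO:-apache/airflow}"

tag_for_stage() {
  echo "${REPO}:latest-${PYTHON_VERSION}-$1-${AIRFLOW_VERSION}"
}

# DOCKER="echo docker" (the default here) gives a dry run that only prints
# the commands; set DOCKER=docker to actually pull and build.
DOCKER="${DOCKER:-echo docker}"
for stage in apt-deps ci-apt-deps wheelcache; do
  $DOCKER pull "$(tag_for_stage "${stage}")" || true
done
$DOCKER build \
  --cache-from "$(tag_for_stage apt-deps)" \
  --cache-from "$(tag_for_stage ci-apt-deps)" \
  --target main -t "${REPO}:latest-${PYTHON_VERSION}-${AIRFLOW_VERSION}" .
```

The key idea is that images pulled from the registry become valid cache sources for `docker build` via `--cache-from`, so DockerHub builds can reuse layers across build machines.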
Basic assumptions
- There are two images:
- "Airflow" image - slim image with only necessary Airflow dependencies
- "CI" image - fat image with additional dependencies necessary for CI tests
- there are separate images for each Python version (currently 2.7, 3.5 and 3.6)
- each image uses python:X.Y-slim as its base
- all stages are defined in a single multi-stage Dockerfile
- it's possible to build the main Airflow image by issuing a plain "docker build ." command. This is not optimised for DockerHub cache reuse, but it builds locally.
- the hooks/build script can build the image utilising the DockerHub cache - pulling images from the registry and using them as cache
- binary/apt dependencies are built as separate stages - so that whole cached images with main/CI dependencies can be used as cache sources
- the builds are versioned - Airflow 2.0.0.dev0 images are different than Airflow 2.0.1.dev0 images
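The assumptions above could translate into a multi-stage Dockerfile along these lines. This is only a hedged sketch: the stage names follow the table in "Stages of the image" below, but the concrete packages and paths are illustrative placeholders, not the actual Dockerfile from the PR.

```dockerfile
# Sketch only - package names and paths are placeholders, not the real file.
ARG PYTHON_VERSION=3.6
FROM python:${PYTHON_VERSION}-slim AS python-base

# Vital Airflow apt dependencies (the slim "Airflow" image builds on this)
FROM python-base AS airflow-apt-deps
RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential libssl-dev \
    && rm -rf /var/lib/apt/lists/*

# Additional dependencies needed only by the fat "CI" image
FROM airflow-apt-deps AS airflow-ci-apt-deps
RUN apt-get update \
    && apt-get install -y --no-install-recommends git openssh-client \
    && rm -rf /var/lib/apt/lists/*

# Main stage: install dependencies from setup.py first (cached layer),
# then copy the sources and reinstall - mirroring the layer table below.
FROM airflow-apt-deps AS main
WORKDIR /opt/airflow
COPY setup.py setup.cfg ./
COPY airflow/version.py airflow/version.py
RUN pip install --no-cache-dir -e .
COPY . .
RUN pip install --no-cache-dir -e .
```

Splitting `COPY setup.py` from `COPY . .` is what lets Docker reuse the expensive pip-install layer when only the sources change.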
Changes that trigger rebuilds
The changes below are described starting from the most frequent ones - i.e. starting backwards from the end of the Dockerfile and going up towards the beginning.
- apt and pip dependencies: they are "upgraded" as the last part of the build (after the sources are added) - thus an upgrade to the latest available versions is triggered every time the sources change (utilising the cache from previous installations).
- source changes do not invalidate previously installed packages from apt/pip/npm. They only trigger the pip/apt upgrades explained above.
- changes to the www sources trigger pre-compiling the web page for production (npm run prod) and everything above.
- changing package.json or package-lock.json triggers reinstallation of all npm packages (npm ci) and everything above.
- changing any of the setup.py-related files triggers reinstallation of all pip packages and everything above. In the case of a CI build, previously compiled wheel packages from the wheel image are used to install the dependencies (saving the time needed to download and compile packages).
- changing the wheel cache rebuilds everything above
- for CI builds, changing CI apt dependencies triggers reinstallation of those dependencies and everything above
- changing Airflow apt dependencies triggers reinstallation of those dependencies and everything above
- the whole build process can be forced by changing one line in the Dockerfile (FORCE_REINSTALL_ALL_DEPENDENCIES)
- a new Python stable base image triggers a rebuild of the whole image
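The FORCE_REINSTALL_ALL_DEPENDENCIES mechanism mentioned above can rely on Docker's build-argument cache invalidation: changing the argument's value invalidates the layer that consumes it and every layer after it. A minimal illustration (a sketch under assumed names, not the actual Dockerfile line):

```dockerfile
# Changing this value (e.g. bumping the date) invalidates the cache for the
# RUN below and for every later layer, forcing a full dependency reinstall.
ARG FORCE_REINSTALL_ALL_DEPENDENCIES=2019-01-01
RUN echo "Cache-bust marker: ${FORCE_REINSTALL_ALL_DEPENDENCIES}" \
    && apt-get update && apt-get upgrade -y
```

Because Docker hashes the build context and instructions per layer, editing a single ARG value is enough to make the cache miss without touching any other file.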
Stages of the image
These are the stages of the image that are defined in the Dockerfile. In the labels below:
- X.Y - python version (currently 2.7, 3.5 or 3.6)
- VERSION - airflow version (e.g. 2.0.0.dev0)
No. | Stage | Description | Labels in DockerHub | Airflow build dependencies | CI build dependencies |
---|---|---|---|---|---|
1 | python | Base python image | python:X.Y-slim | - | - |
2 | airflow-apt-deps | Vital Airflow apt dependencies | latest-X.Y-apt-deps-VERSION | 1 | 1 |
3 | airflow-ci-apt-deps | Additional CI image dependencies | latest-X.Y-ci-apt-deps-VERSION | [Not used] | 2 |
4 | wheel-cache-previousmaster | Previously built master wheel cache (built on DockerHub from the latest master) for faster PIP installs | latest-X.Y-wheelcache-VERSION | [Not used] | 3 |
5 | wheel-cache | Currently built wheel cache (for future builds) | latest-X.Y-wheelcache-VERSION | [Not used] | 3 |
6 | main | Main Airflow sources build, used for both Airflow and CI builds | Airflow builds: / CI builds: | 2 | 2 (image), 4 (/cache folder with wheels) |
Dependencies between stages
Effectively, the images we create have the following dependencies. In case of Dockerfile changes, Docker's multi-stage mechanism takes care of rebuilding only the stages that need to be rebuilt - changes in a stage trigger rebuilds only in the stages that depend on it.
(draw.io diagram: dependencies between the image stages)
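Because all stages live in one Dockerfile, any single stage (and only the stages it depends on) can be built with Docker's `--target` flag. A dry-run sketch, with an assumed local tag name:

```shell
# Build only the CI apt-deps stage; Docker rebuilds just this stage and the
# stages it depends on (the python base and airflow-apt-deps).
STAGE="airflow-ci-apt-deps"
CMD="docker build --target ${STAGE} -t airflow:${STAGE} ."
echo "${CMD}"   # printed as a dry run; drop the echo to actually build
```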
Layers in the main image
The main image consists of a number of layers that make image rebuilds incremental, depending on what changed in the repository since the previous build. Docker's build mechanisms (context/cache invalidation) are used to determine whether the subsequent layers should be invalidated and rebuilt.
No. | Layer | Description | Trigger for rebuild | Airflow build behaviour | CI build behaviour |
---|---|---|---|---|---|
1 | Wheel cache master | /cache folder with cached wheels from the previous build | Rebuild of the wheel cache source | Empty wheel cache used to minimise the size of the image | Wheel cache built in the latest DockerHub "master" image is used
2 | PIP configuration | Setup.py and related files (version.py etc.) | Updated dependencies for PIP | Copy setup.py related files to context | Copy setup.py related files to context |
3 | PIP install | PIP installation | Previous layer change | All PIP dependencies downloaded and installed | PIP dependencies installed from wheel cache - new dependencies downloaded and installed |
4 | NPM package configuration | package.json and package-lock.json | Updated dependencies for NPM | Copy package files to context | Copy package files to context
5 | npm ci | Installs locked dependencies from NPM | Previous layer change | All NPM dependencies downloaded and installed | All NPM dependencies downloaded and installed |
6 | www files | airflow/www all files | Updated any of the www files | Copy www files to context | Copy www files to context |
7 | npm run prod | Prepares production javascript packaging for webserver | Previous layer change | Javascript prepared | Packages prepared |
8 | airflow sources | Copy all sources to context | Any change in sources | Copy sources to context | Copy sources to context |
9 | apt-get upgrade | Upgrading apt dependencies | Previous layer change | All apt packages upgraded to latest stable versions | All apt packages upgraded to latest stable versions |
10 | pip install | Reinstalling PIP dependencies | Previous layer change | Pip packages are potentially upgraded | All PIP packages are upgraded |
The results of this layer structure are the following behaviours:
- if the wheel image changes: PIP packages + NPM packages + NPM compile + sources are reinstalled for the CI build (nothing changes for the Airflow build)
- if the PIP configuration changes: PIP packages + NPM packages + NPM compile + sources are reinstalled. For the Airflow build all PIP packages are downloaded and installed; for the CI build the wheel cache is used as the base for installation (faster)
- if the NPM configuration changes: NPM packages + NPM compile + sources are reinstalled
- if any of the WWW files changes: NPM compile + sources are reinstalled
- for any other source change: only the sources are reinstalled
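The invalidation rules above can be summarised as a simple lookup: which layer is the first to be rebuilt for a given changed file. This is only an illustration of the table, not code used by the actual build (the function name and layer labels are ours):

```shell
# Maps a changed path to the first layer it invalidates (per the table above);
# every layer after the returned one is rebuilt as well.
first_invalidated_layer() {
  case "$1" in
    setup.py|setup.cfg|airflow/version.py) echo "PIP install (layer 3)" ;;
    package.json|package-lock.json)        echo "npm ci (layer 5)" ;;
    airflow/www/*)                         echo "npm run prod (layer 7)" ;;
    *)                                     echo "airflow sources (layer 8)" ;;
  esac
}

first_invalidated_layer airflow/www/index.js   # → npm run prod (layer 7)
first_invalidated_layer airflow/models.py      # → airflow sources (layer 8)
```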
Different types of builds
The images for Airflow are built for several scenarios - and the "hooks/build" script, with accompanying environment variables, controls which images are built in each scenario:
Scenario | Trigger | Purpose | Cache | Frequency | Pull from DockerHub | Push to DockerHub | Images prepared during the build (controlled by environment variables) | |||||
---|---|---|---|---|---|---|---|---|---|---|---|---|
Apt deps | CI Apt deps | Master Wheelcache | Local wheelcache | Airflow | CI | |||||||
DockerHub build for master branch | A commit merged to "master" | Build and push reference images that are used as cache for subsequent builds | From master | Several times per day | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Local developer build | Triggered by the user | Build when developer adds dependencies or downloads new code and prepares development environment | From local images (pulled initially) unless cache is disabled | Once per day | First time or when requested | When requested and user logged in | Yes | Yes | Yes | Yes | ||
CI build | A commit is pushed to any branch | Builds image that is used to execute CI tests for commits pushed by developers. | From master | Several times an hour | Yes | No | Yes | Yes | Yes |
Timings for different scenarios
These timings were measured during tests. They include the image pull - a full pull for CI builds and incremental pulls for the other scenarios.
Where built | No source change | Sources changed | WWW sources changed | NPM packages changed | PIP Packages changed | Full rebuild | Docker build . (clean cache) |
---|---|---|---|---|---|---|---|
DockerHub (Airflow +CI) | - | ||||||
Travis CI (CI) | - | ||||||
Cloud Build * (CI) | - | ||||||
Google Compute Engine ** | |||||||
Local Machine *** (CI) - pull images | |||||||
Local Machine *** (CI) - images pulled |
* Cloud Build - M8 High CPU - 3 Python versions built in parallel on single instance
** Google Compute Engine: custom (8 vCPUs, 31 GB memory)
*** Local Machine: MacBook Pro (15-inch, 2017), 2.9 GHz Intel Core i7, 4 Cores
Appendixes
Results of the initial measurements of layer/image sizes are shown below. They show that the multi-layered image size is comparable to the mono-layered one and that there are significant download-traffic savings in case of incremental builds.
Details for the Mono-layered Docker image for Airflow
Implemented in https://github.com/apache/airflow/commit/e2c22fe70a488feea0cfecde890c20f8c984c09c
Available to pull with: docker pull potiuk/airflow-monodocker:latest
Only significant layers are shown:
Total: 976 MB. Example download time when tested (full download after removing the image and running docker system prune): 32.7 s (note: this measurement was not rigorous and can be influenced by external factors).
time docker pull potiuk/airflow-monodocker:latest
real 0m32.744s

Details for the Multi-layered Docker image for Airflow
POC implemented in https://github.com/apache/airflow/pull/4543
Available to pull with: docker pull potiuk/airflow-layereddocker:latest
Only significant layers are shown:
Total: 1007 MB. Example download time when tested (full download after removing the image and running docker system prune): 33.7 s (note: this measurement was not rigorous and can be influenced by external factors).
time docker pull potiuk/airflow-layereddocker:latest
real 0m33.761s

Note that the "airflow sources + reinstall" layer will grow between forced reinstalls of all dependencies, because upgrades of packages are added to it. However, this growth should not be significant, and if a full reinstall is done periodically the size of this layer is reset.
The multi-layered image turns out to be only slightly bigger than the mono-layered one - but size is not the only benefit. Taking usage patterns into account: users who download the image semi-frequently must download virtually the whole single layer every time, whereas with the multi-layered approach they only need to pull the incremental changes - whose size depends on whether setup.py dependencies were updated, or whether all dependencies were forced to be rebuilt from scratch.

Simulation of downloads for a user that pulls the image regularly
Here is a simulation showing how large the downloads are for a user pulling the Airflow image semi-frequently (twice a week). Assumptions:
Mono-layered downloads:
Multi-layered downloads:
User download size pattern:
Sources for calculation:
- Mono-layered image: docker history potiuk/airflow-monodocker:latest
- Multi-layered image: docker history potiuk/airflow-layereddocker:latest
Conclusions
- The multi-layered image is only slightly bigger than the mono-layered one (976 MB mono-layered vs. 1007 MB multi-layered, around 3% more in total). The download time is also slightly longer, by 1 s (33.7 s vs. 32.7 s), which is 3% longer.
- Downloading the image regularly is far cheaper with the multi-layered image - for the simulated user downloading the Airflow image twice a week it is 4950 MB (multi-layered) vs. 13546 MB (mono-layered) downloaded over the course of 8 weeks, yielding about 64% less data to download.
- Multi-layered image seems to be much better for users regularly downloading the image.
- TODO:
Appendixes
Details for the Mono-layered Docker image for Airflow
Implemented in https://github.com/apache/airflow/commit/e2c22fe70a488feea0cfecde890c20f8c984c09c
Available to pull with: docker pull potiuk/airflow-monodocker:latest
Only significant layers are shown:
Layer | Size | When rebuilt/downloaded
---|---|---
python:3.6-slim layers (there are 12 layers) | 138 MB | Only the first time it is built
Airflow sources | 73 MB | After every commit
Airflow installed binaries (all apt and pip installs together) | 765 MB | After every commit
Details for the Multi-layered Docker image for Airflow - only significant layers are shown:

Layer | Size | When rebuilt/downloaded
---|---|---
python:3.6-slim layers (there are 12 layers) | 138 MB | Only the first time it is built
apt-get install core build deps | 118 MB | Only when core dependencies change or when a fresh build is forced (extremely rare)
apt-get install extra deps | 155 MB | Only when extra deps change (extremely rare)
pip install deps (just setup, no airflow sources) | 523 MB | Only when setup.py changes (every few weeks usually)
copy airflow sources | 73 MB | After every commit
Install extra airflow deps just in case | 6 MB | After every commit
Simulation over 8 weeks (two downloads per week, 16 downloads in total; the sources change before every download, setup.py changes 4 times, and a full dependency rebuild is forced twice):

Image | Per-download sizes (MB) | Total downloaded over 8 weeks
---|---|---
Mono-layered | 976, then 838 for each of the remaining 15 downloads | 13546 MB
Multi-layered | 1007, 73, 73, 73, 757, 73, 73, 757, 869, 73, 73, 73, 757, 73, 73, 73 | 4950 MB (36% of mono-layered)
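The totals in the simulation can be checked with a few lines of shell arithmetic (the per-download sizes in MB are copied from the table above):

```shell
# Sum the per-download sizes from the simulation (16 pulls over 8 weeks).
mono="976 838 838 838 838 838 838 838 838 838 838 838 838 838 838 838"
multi="1007 73 73 73 757 73 73 757 869 73 73 73 757 73 73 73"
mono_total=0;  for s in $mono;  do mono_total=$((mono_total + s));  done
multi_total=0; for s in $multi; do multi_total=$((multi_total + s)); done
echo "mono-layered: ${mono_total} MB, multi-layered: ${multi_total} MB"
# → mono-layered: 13546 MB, multi-layered: 4950 MB
```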