
Status

State: Draft
Discussion Thread: AIP-7 Simplified development workflow
JIRA: AIRFLOW-3611 (ASF JIRA)
Created:

Draft PR (open for discussion) of proposed implementation is here: https://github.com/apache/airflow/pull/4932

TODO: Update description based on the discussion.




Motivation

Currently, the workflow for submitting new contributions is very cumbersome. Even running the unit tests is difficult, whether you run them locally or through your own TravisCI pipeline. Airflow tests depend on many external services and other custom setup, which makes it hard for contributors and committers to work on this codebase. CI builds have also been unreliable, and their failures are hard to reproduce. Asking contributors to emulate the build environment by hand every time invites "it works on my machine" situations.

The goal of this proposal is to outline the work needed to make local testing significantly easier and standardise the best practices to contribute to the Airflow project.


What problem does it solve?

  • It's difficult to reproduce unit and integration tests locally
  • Documentation on running the integration tests is not comprehensive, which makes the tests hard to run

Considerations

Requirements / Constraints

  • TravisCI unit tests should be reproducible locally
  • Integration tests local reproducibility is optional for now to keep this simple
  • Extensive documentation is required
  • Integration tests are difficult to run in a local environment, given that some of them are intrinsically coupled with cloud services. See AIP-4 Support for System Tests for external systems for an example of this.

  • The current environment setup on TravisCI (the setup before the tests are run) takes around 4 minutes. Some of this could be optimised by reducing the Docker image size, but right now the image size is hard to reduce, given that we pre-install Hadoop, Hive and MiniCluster.

Proposed changes to the workflow / infrastructure

  •  Creation of a separate incubator-airflow-ci repo, where a CI/dev base image with all dependencies is built. This has been done already in apache/incubator-airflow-ci. However, this setup has drawbacks:
    • Different Python versions and backend configurations are not easy to reproduce locally
    • The separate incubator-airflow-ci project hosts an image that is maintained independently from the main Docker image, requiring double maintenance; the two images are fairly different
    • The image needs to be rebuilt manually, periodically

Why is this change needed?

  • Being able to reproduce the Travis CI environment locally should help in solving stability issues and decrease the time and money spent on Travis CI builds
  • It will be easier to onboard new contributors and committers if the environment is easy to set up and use

Already completed

  • Docker image in https://github.com/apache/airflow-ci is set up. The Dockerised CI pipeline is used to run the tests
  • Setting up docker-compose for container orchestration and configuration.
    • This simplifies the setup of services like MySQL, PostgreSQL, OpenLDAP, krb5 and rabbitmq, which are needed both for running Airflow and for running some unit and integration tests.
    • The same setup should allow us to add further service dependencies as needs arise
    • The initial work has been submitted
  • MiniCluster should be moved to its own image and orchestrated through the docker-compose setup
  • Strip out Tox and fully rely on our Docker setup
  • Bake the build script into the CI Docker image
  • Create a developer guide
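The docker-compose orchestration described above might look roughly like the sketch below. This is an illustration only: the service names, images and versions are assumptions, not the actual Airflow CI configuration.

```yaml
# Hypothetical docker-compose sketch of the CI service dependencies.
# Image names and versions are illustrative assumptions.
version: "2.2"
services:
  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ALLOW_EMPTY_PASSWORD: "true"
      MYSQL_DATABASE: airflow
  postgres:
    image: postgres:9.6
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
  rabbitmq:
    image: rabbitmq:3.7
  openldap:
    image: osixia/openldap:1.2.0
  krb5:
    image: example-org/krb5-kdc-server   # assumption: a custom KDC image
```

With a layout like this, the test container joins the same compose network, so adding a new service dependency is a matter of adding another service entry rather than changing the test environment by hand.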

References

Jarek Potiuk has kindly shared a repo created by his company (polidea.com) with scripts that address a few of these pain points: https://github.com/PolideaInternal/airflow-breeze

Related work in progress

There is work in progress on AIP-10 Multi-layered and multi-stage official Airflow image to unify the official "main" Airflow Docker image and the CI Docker image. The image introduced there is multi-stage, multi-layered and optimised for different scenarios:

  • Building the image on DockerHub in an incremental way
  • Using the image in the CI environment, with the DockerHub registry as a cache to allow incremental rebuilds
  • Running a regular cron job with no cache (to test building from scratch)
  • Building and incrementally rebuilding the image for local development
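The registry-as-cache pattern above can be sketched as follows. This is a dry-run illustration that only prints the commands a CI step would run; the image name is an assumption, not the actual repository.

```shell
# Dry-run sketch: print the docker commands a CI step would run.
# The image name is an illustrative assumption.
IMAGE="apache/airflow-ci:latest"
# Pull the last pushed image so its layers can seed the local build cache.
echo "docker pull ${IMAGE} || true"
# Rebuild using the pulled image as cache; only layers whose inputs
# changed are rebuilt, making CI builds incremental.
echo "docker build --cache-from ${IMAGE} -t ${IMAGE} ."
```

Because unchanged layers are taken from the cached image, a typical PR build only rebuilds the layers affected by the change instead of the whole image.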

It provides these features:

  • Tox is removed 
  • Build script in CI docker image
  • Single source of truth for Airflow and CI airflow image
  • Ability to build/pull/push image automatically
  • Ability to use different DockerHub user for local development
  • Changes to sources/PIP packages/Apt packages can be atomic PRs in the same repository
  • Image sizes are optimised, similar to the approach described in the now-deprecated Optimizing Docker Image Workflow

Once that work is done, we can use the image to provide a fast local development environment and simplify the workflow. This change heavily depends on it.

Suggested implementation

The suggested implementation is based on work done by Polidea (led by Jarek Potiuk): scripts that address a few of these pain points: https://github.com/PolideaInternal/airflow-breeze

Draft PR (open for discussion) of the proposed implementation is here: https://github.com/apache/airflow/pull/4932. The features of the workflow/environment are described in detail in the documentation. Here is a summary of what's currently provided:

  • automated pulling of necessary images
  • automated rebuild of images when needed
  • mounting local sources into the container
  • easy start of all the dependencies and the airflow container
  • initialising a local virtualenv (for IDE work)
  • choosing which Python version and backend to use
  • easy running of commands and tests inside Docker
  • catchy name ("breeze") and logo
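As a purely illustrative sketch of the "choose Python version and backend" feature, a session might be invoked as below. The flag names here are assumptions, not the documented breeze CLI; the PR's documentation describes the real interface. The block is a dry run that only prints the command.

```shell
# Dry-run sketch of a hypothetical invocation; flag names are assumptions,
# not the documented breeze CLI. This only prints the command.
PYTHON_VERSION="3.6"
BACKEND="postgres"
echo "./breeze --python ${PYTHON_VERSION} --backend ${BACKEND}"
```

The point of such an entry script is that switching the test matrix cell (Python version, metadata database) becomes a command-line option rather than a manual environment rebuild.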

Possible future changes (deferred to a later PR):