Status

State: Completed
Discussion Thread: AIP-7 Simplified development workflow
JIRA: AIRFLOW-3611

Motivation

Currently, the workflow for submitting new contributions is very cumbersome. Even running the unit tests is difficult, whether you want to run them locally or by creating your own TravisCI pipeline. Airflow tests depend on many external services and other custom setup, which makes it hard for contributors and committers to work on this codebase. CI builds have also been unreliable, and the causes of failures are hard to reproduce. When every contributor has to emulate the build environment on their own, it is easy to end up in an "it works on my machine" situation.

The goal of this proposal is to outline the work needed to make local testing significantly easier and to standardise the best practices for contributing to the Airflow project.


What problem does it solve?

  • It's difficult to reproduce unit and integration tests locally
  • Documentation on running the integration tests is not comprehensive and does not make them easy to run

Considerations

Requirements / Constraints

  • TravisCI unit tests should be reproducible locally
  • Local reproducibility of integration tests is optional for now, to keep this proposal simple
  • Extensive documentation is required
  • Integration tests are difficult to run in a local environment, given that some of them are intrinsically coupled with cloud services. See AIP-4 Support for Automation of System Tests for external systems [Deps: AIP-47] for an example of this.

  • The current environment setup on TravisCI (the setup before the tests are run) takes around 4 minutes. Some of this could be optimised by reducing the Docker image size, but right now the image size is hard to reduce, given that we pre-install Hadoop, Hive and MiniCluster (a sketch for inspecting the image layers follows this list).
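
To see which layers contribute most to the image size, the layer history of the CI image can be inspected locally. This is only a minimal sketch; the image name below is a placeholder, not the actual name of the published CI image.

  # Placeholder image name; substitute the CI image you actually use.
  CI_IMAGE=my-dockerhub-user/airflow-ci:latest

  # Show each layer of the image together with its size.
  docker history "${CI_IMAGE}"

  # Show the overall size of locally available images.
  docker images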

Proposed changes to the workflow / infrastructure

  • Creation of a separate incubator-airflow-ci repo, where a CI/dev base image with all dependencies is built. This has already been done in apache/incubator-airflow-ci.
  • Different Python versions and backend configurations are not easy to reproduce locally
  • A separate incubator-airflow-ci project is set up with an image that is maintained independently from the main Docker image, requiring double maintenance; those images are fairly different
  • The image needs to be rebuilt manually on a regular basis

Why is this change needed?

  • Being able to reproduce the Travis CI environment locally should help in solving stability issues and decrease the time and money spent on Travis CI builds
  • It will be easier to onboard new contributors and committers if the environment is easy to set up and use

Already completed (also part of AIP-10 Multi-layered and multi-stage official Airflow CI image)

The environment is orchestrated through a docker-compose configuration (a short usage sketch follows the list below).
  • This simplifies the setup of services like MySQL, PostgreSQL, OpenLDAP, krb5 and RabbitMQ, which are needed both for running Airflow and for running some of the unit and integration tests.
  • The same setup should allow us to add further service dependencies as needs arise.
    • configuration (tick).
    • Building the image on DockerHub in an incremental way (tick)
    • Using the image in the CI environment with the DockerHub registry as a cache to allow incremental rebuilds (tick)
    • Running a regular cron job with no cache, to test building from scratch (tick)
    • Building and incremental rebuilding of the image for local development (tick)
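
As an illustration of how such a docker-compose setup can be used locally, the backing services could be started as in the minimal sketch below. The compose file name and service names are placeholders; the actual configuration lives in the repository scripts.

  # Placeholder compose file and service names; see the repository scripts
  # for the real configuration.
  docker-compose -f docker-compose-backends.yml up -d mysql postgres rabbitmq

  # Verify that the services are up before running tests against them.
  docker-compose -f docker-compose-backends.yml ps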

The CI image workflow provides these features:

  • Tox is removed (tick)
  • Build script baked into the CI Docker image (tick)
  • Single source of truth for the Airflow and CI Airflow images (tick)
  • Ability to build/pull/push the image automatically (tick)
  • Ability to use a different DockerHub user for local development (tick)
  • Changes to sources/PIP packages/apt packages can be atomic PRs in the same repository (tick)
  • Image sizes are optimised, similar to what was described in the now deprecated Optimizing Docker Image Workflow (tick)
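
To illustrate the pull/build/push automation and the use of the DockerHub registry as a cache, an incremental rebuild could look roughly like the sketch below. The DockerHub user and image name are placeholders, not the names used by the actual scripts.

  # Placeholder image coordinates; substitute your own DockerHub user.
  DOCKERHUB_USER=my-dockerhub-user
  IMAGE="${DOCKERHUB_USER}/airflow-ci:latest"

  # Pull the last published image so its layers can be reused as a build cache.
  docker pull "${IMAGE}" || true

  # Rebuild incrementally, reusing cached layers where nothing has changed.
  docker build --cache-from "${IMAGE}" -t "${IMAGE}" .

  # Push the refreshed image so CI and other developers can reuse the cache.
  docker push "${IMAGE}"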

The implementation is based on work done by Polidea (provided by Jarek Potiuk), i.e. scripts that address a few of these pain points: https://github.com/PolideaInternal/airflow-breeze-gcp-extension

The features of the workflow/environment are described in detail in the documentation.

Here is a summary of what is currently available (a short usage sketch follows the list):

  • automated pulling of the necessary images (tick)
  • automated rebuild of the images when needed (tick)
  • mounting local sources into the local container (tick)
  • easy start of all the dependencies and the Airflow container (tick)
  • initialising of a local virtualenv (for the local IDE to work) (tick)
  • choosing which Python version and backend to use (tick)
  • easy running of commands and tests inside the Docker container (tick)
  • auto-complete for test names with the run-tests command (tick)
  • initialising the database only once when running the run-tests script (tick)
  • catchy name ("breeze") and logo (tick)
  • video showing the basic features and usefulness of Breeze (tick)
  • extensive documentation (tick)
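
A typical local session might look roughly like the sketch below. The script names, flags and test selector are illustrative only; the actual interface is described in the Breeze documentation.

  # Enter the dockerised environment, choosing the Python version and the
  # backend to test against (flag names are illustrative).
  ./breeze --python 3.6 --backend postgres

  # Inside the container, run a selected group of unit tests with the
  # run-tests helper (the test selector here is just an example).
  run-tests tests.core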

Video describing Breeze: http://youtube.com/watch?v=ffKFHV6f3PQ

The initial work has already been submitted and merged in apache/incubator-airflow/pull/3393.

Possible spin-off changes (deferred to later PRs):

  • MiniCluster should be moved to its own image and orchestrated through the docker-compose setup
  • Strip out Tox and fully rely on our Docker setup
  • Bake the build script into the CI Docker image
  • Current image sizes should be reduced to the bare minimum for speed (Optimizing Docker Image Workflow)
  • Current Kubernetes CI scripts should be run on GKE instead of via minikube (Kubernetes Testing: Using GKE instead of Minikube)
  • Create a developer guide

References

Jarek Potiuk has kindly shared a repo created by his company (polidea.com) with scripts that address a few of these pain points: https://github.com/PolideaInternal/airflow-breeze