Apache Airflow (or simply Airflow) is a platform to programmatically author, schedule, and monitor workflows.
When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.
Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
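For illustration, here is a minimal sketch of what such a workflow looks like as code; the DAG id, task ids, schedule, and commands are made-up examples rather than anything shipped with Airflow:

```python
# A minimal, illustrative DAG: two tasks with an explicit dependency.
# All names, dates, and commands below are arbitrary examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(
    dag_id="example_hello",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    load = BashOperator(task_id="load", bash_command="echo loading")

    extract >> load  # "load" runs only after "extract" succeeds
```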
Table of contents
- Requirements
- Getting started
- Installing from PyPI
- Official source code
- Project Focus
- Principles
- User Interface
- Backport packages
- Contributing
- Who uses Apache Airflow?
- Who Maintains Apache Airflow?
- Can I use the Apache Airflow logo in my presentation?
- Airflow merchandise
- Links
Apache Airflow is tested with:
|            | Master version (2.0.0dev) | Stable version (1.10.12) |
|------------|---------------------------|--------------------------|
| Python     | 3.6, 3.7, 3.8             | 2.7, 3.5, 3.6, 3.7, 3.8  |
| PostgreSQL | 9.6, 10                   | 9.6, 10                  |
| MySQL      | 5.7                       | 5.6, 5.7                 |
| SQLite     | latest stable             | latest stable            |
| Kubernetes | 1.16.2, 1.17.0            | 1.16.2, 1.17.0           |
Note: SQLite is used primarily for development purposes.
- Stable version requires at least Python 3.5.3 when using Python 3
Visit the official Airflow website documentation (latest stable release) for help with installing Airflow, getting started, or walking through a more complete tutorial.
Note: If you're looking for documentation for the master branch (the latest development branch), you can find it on ReadTheDocs.
For more information on Airflow's Roadmap or Airflow Improvement Proposals (AIPs), visit the Airflow Wiki.
Official Docker (container) images for Apache Airflow are described in IMAGES.rst.
Airflow is published as the `apache-airflow` package on PyPI. Installing it, however, can sometimes be tricky because Airflow is a bit of both a library and an application. Libraries usually keep their dependencies open and applications usually pin them, but we should do neither and both at the same time. We decided to keep our dependencies as open as possible (in `setup.py`) so users can install different versions of libraries if needed. This means that from time to time a plain `pip install apache-airflow` will not work or will produce an unusable Airflow installation.
However, to have a repeatable installation, we also keep a set of "known-to-be-working" constraint files in the orphan `constraints-master` and `constraints-1-10` branches (introduced in Airflow 1.10.10 and updated in Airflow 1.10.12). We keep those "known-to-be-working" constraint files separately per major/minor Python version. You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify the correct Airflow tag/version/branch and Python version in the URL.
- Installing just Airflow:
pip install apache-airflow==1.10.12 \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.7.txt"
- Installing with extras (for example postgres,google):
pip install apache-airflow[postgres,google]==1.10.12 \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.7.txt"
Apache Airflow is an Apache Software Foundation (ASF) project, and our official source code releases:
- Follow the ASF Release Policy
- Can be downloaded from the ASF Distribution Directory
- Are cryptographically signed by the release manager
- Are officially voted on by the PMC members during the Release Approval Process
Following the ASF rules, the source packages released must be sufficient for a user to build and test the release provided they have access to the appropriate platform and tools.
Other ways of retrieving source code are "convenience" methods. For example:
- Tagging in GitHub to mark the git project sources that were used to generate official source packages
We also have binary "convenience" packages:
- PyPI releases to install Airflow using standard python tools
- Docker Images published in the Apache Airflow DockerHub.
These artifacts are not official releases, but they are built using officially released sources.
Note: Airflow Summit 2020's "Production Docker Image talk" explains context, architecture and customization/extension methods.
Airflow works best with workflows that are mostly static and slowly changing. When the structure is similar from one run to the next, it allows for clarity around unit of work and continuity. Other similar projects include Luigi, Oozie and Azkaban.
Airflow is commonly used to process data, but has the opinion that tasks should ideally be idempotent and should not pass large quantities of data from one task to the next (though tasks can pass metadata using Airflow's XCom feature). For high-volume, data-intensive tasks, a best practice is to delegate to external services that specialize in that type of work.
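As an illustration of that point, the sketch below passes only a small reference (a file path) between two tasks via XCom rather than the data itself; the DAG name, task ids, and path are assumptions made up for this example:

```python
# Illustrative only: exchange a small piece of metadata (a path), not bulk data.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

def extract(**context):
    # Push a reference to where the data lives, not the data itself.
    context["ti"].xcom_push(key="output_path", value="/tmp/extracted.csv")

def load(**context):
    path = context["ti"].xcom_pull(task_ids="extract", key="output_path")
    print("loading data referenced by %s" % path)

with DAG("example_xcom", start_date=datetime(2020, 1, 1), schedule_interval=None) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract, provide_context=True)
    t2 = PythonOperator(task_id="load", python_callable=load, provide_context=True)
    t1 >> t2
```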
Airflow is not a streaming solution. Airflow is not in the Spark Streaming or Storm space.
- Dynamic: Airflow pipelines are configuration as code (Python), so you can write code that instantiates pipelines dynamically.
- Extensible: Easily define your own operators, executors and extend the library so that it fits the level of abstraction that suits your environment.
- Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine (see the sketch after this list).
- Scalable: Airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers.
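To illustrate the Jinja templating mentioned under "Elegant" above, here is a small sketch of a templated task; Airflow renders the `{{ ds }}` macro to the execution date at runtime, and every other name in the snippet is an arbitrary example:

```python
# Illustrative sketch of parameterizing a command with Jinja templating.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG("example_templating", start_date=datetime(2020, 1, 1), schedule_interval="@daily") as dag:
    report = BashOperator(
        task_id="daily_report",
        # {{ ds }} is rendered by Airflow into the execution date, e.g. "2020-08-25"
        bash_command="echo 'building report for {{ ds }}'",
    )
```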
- DAGs: Overview of all DAGs in your environment.
- Tree View: Tree representation of a DAG that spans across time.
- Graph View: Visualization of a DAG's dependencies and their current status for a specific run.
- Task Duration: Total time spent on different tasks over time.
- Gantt View: Duration and overlap of a DAG.
- Code View: Quick way to view source code of a DAG.
Currently, stable Apache Airflow versions are from the 1.10.* series. We are working on the future major version of Airflow from the 2.0.* series. It is going to be released in 2020, but the exact time of release depends on many factors and is not yet confirmed.
We already have a lot of changes in the operators, transfers, hooks, sensors and secrets for many external systems, but they are not widely used or tested because they are part of the master/2.0 release.
In Airflow 2.0, following AIP-21 ("change in import paths"), all the non-core interfaces to external systems of Apache Airflow have been moved to the `airflow.providers` package.
Thanks to that, and to the automated backport effort we took, the operators from Airflow 2.0 can be used in Airflow 1.10 as separately installable packages, with the constraint that those packages can only be used in a Python 3.6+ environment.
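As a concrete but illustrative example of the import path change, the snippet below shows the old contrib-style import of the Google Cloud Storage hook next to the providers-style import it maps to; check the provider package's README for the exact paths that apply to the classes you use:

```python
# Old Airflow 1.10 import path (deprecated in Airflow 2.0):
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook

# New airflow.providers import path (usable in Airflow 1.10 via the
# apache-airflow-backport-providers-google package):
from airflow.providers.google.cloud.hooks.gcs import GCSHook
```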
We released backport packages that can be installed for older Airflow versions. Those backport packages are going to be released more frequently than the main Airflow 1.10.* releases.
You will not have to upgrade your Airflow version to use those packages. You can find them on PyPI and install them separately for each provider.
Those packages are available now and can be used in the latest Airflow 1.10.* version. Most of those packages are also installable and usable in most Airflow 1.10.* releases but there is no extensive testing done beyond the latest released version, so you might expect more problems in earlier Airflow versions.
With backported provider packages, users can migrate their DAGs to the new providers packages incrementally, and once they convert to the new operators/sensors/hooks they can seamlessly migrate their environments to Airflow 2.0. The nice thing about the providers backport packages is that you can use both the old and new classes at the same time - even in the same DAG. So your migration can be gradual and smooth. Note that in Airflow 2.0 the old classes raise a deprecation warning and redirect to the new classes wherever possible. In some rare cases the new operators will not be fully backwards compatible - you will find information about those cases in UPDATING.md, where all such cases are explained. Switching early to the Airflow 2.0 operators while still running Airflow 1.10 will make your migration much easier.
More information about the status and releases of the backport packages is available on the Backport Providers package page.
Note that the backport packages might require extra dependencies. Pip installs the required dependencies automatically when it installs the backport package, but there are sometimes cross-dependencies between the backport packages. For example, the `google` package has a cross-dependency with the `amazon` package to allow transfers between those two cloud providers. You might need to install those packages in case you use cross-dependent packages. The easiest way to install them is to use "extras" when installing the package; for example, the command below will install both the `google` and `amazon` backport packages:
pip install apache-airflow-backport-providers-google[amazon]
This is all documented in the PyPI description of the packages as well as in the README.md file available for each provider package. For example, for the google package you can find the readme in its README.md. There you will also find a summary of both the new classes and the moved classes, as well as requirements information.
Backport providers only work when they are installed in the same namespace as the `apache-airflow` 1.10 package. This is the case in the majority of situations, when you simply run `pip install` - it installs all packages in the same folder (usually in `/usr/local/lib/pythonX.Y/site-packages`). But when you install `apache-airflow` and `apache-airflow-backport-package-*` using different methods (for example using `pip install -e .` or `pip install --user`), they might end up in different namespaces. If that's the case, the provider packages will not be importable (the error in such a case is `ModuleNotFoundError: No module named 'airflow.providers'`).
If you experience this problem, you can easily fix it by creating a symbolic link in your installed "airflow" folder pointing to the "providers" folder where you installed your backport packages. If you installed Airflow with `-e`, this link should be created in your airflow sources; if you installed it with the `--user` flag, it should point from the `~/.local/lib/pythonX.Y/site-packages/airflow/` folder.
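For instance, here is a minimal sketch of creating that link from Python, assuming an editable-install Airflow plus a `--user`-installed backport package; both paths are assumptions and need to be adjusted to your environment:

```python
# Hypothetical example: link the backport "providers" folder into an
# editable-install airflow package so `airflow.providers.*` becomes importable.
# Both paths below are assumptions -- replace them with your real locations.
from pathlib import Path

airflow_pkg = Path.home() / "airflow-sources" / "airflow"  # pip install -e . location
providers = Path.home() / ".local/lib/python3.7/site-packages/airflow/providers"  # pip install --user location

link = airflow_pkg / "providers"
if providers.is_dir() and not link.exists():
    link.symlink_to(providers, target_is_directory=True)
```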
Want to help build Apache Airflow? Check out our contributing documentation.
More than 350 organizations are using Apache Airflow in the wild.
Airflow is the work of the community, but the core committers/maintainers are responsible for reviewing and merging PRs as well as steering conversation around new feature requests. If you would like to become a maintainer, please review the Apache Airflow committer requirements.
Yes! Be sure to abide by the Apache Foundation trademark policies and the Apache Airflow Brandbook. The most up to date logos are found in this repo and on the Apache Software Foundation website.
If you would love to have Apache Airflow stickers, t-shirts, etc., then check out the Redbubble Shop.