Friday, 1 November 2019


Apache Airflow Check Previous Run Status for a DAG

The Problem

We encountered a scenario where, if any previous DAG run has failed, the next scheduled DAG run should not proceed. We searched for an existing solution but did not find a robust one, so I created one together with Shahed Munir.

The Solution

I came up with a generic approach that can be applied to any DAG created within Apache Airflow. A sample DAG that applies this concept in practice is available in our GitHub repo.

To enable the capability of checking any of the previous DAG runs (self-references), the first task needs to be created along the lines shown in the sample DAG on GitHub.

This solution relies on an Airflow Connection to the MySQL database used to store the Airflow metadata. In the sample DAG code this Airflow connection is called deliverbi_mysql_airflow.
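The check can be done by querying the `dag_run` table in the Airflow metadata database through that connection. The sketch below is illustrative, not the exact code from the sample DAG: the query-building and result-handling logic is shown as plain Python, and the `MySqlHook` wiring (which assumes the `deliverbi_mysql_airflow` connection id) is shown in comments.

```python
def build_previous_runs_query():
    """Return SQL that counts previous runs of a DAG that did not succeed.

    Queries the `dag_run` table of the Airflow metadata database. The
    currently executing run is excluded by filtering out the 'running'
    state; `%s` is the placeholder for the dag_id parameter.
    """
    return (
        "SELECT COUNT(*) FROM dag_run "
        "WHERE dag_id = %s AND state NOT IN ('success', 'running')"
    )

def has_failed_previous_runs(count_row):
    """Interpret the single COUNT(*) row: were any non-success runs found?"""
    return count_row[0] > 0

# Assumed wiring inside the DAG (requires a running Airflow install):
#   from airflow.hooks.mysql_hook import MySqlHook
#   hook = MySqlHook(mysql_conn_id='deliverbi_mysql_airflow')
#   row = hook.get_first(build_previous_runs_query(), parameters=(dag_id,))
#   if has_failed_previous_runs(row):
#       raise ValueError('A previous DAG run failed; not proceeding.')
```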

Then place the sample DAG in the dags directory.


All previous DAG runs must be in the success state for the current DAG run to proceed. This check should be the first task in any DAG that needs this functionality.
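One way to make the first task stop the run, sketched below under the assumption that Airflow's ShortCircuitOperator is used as the gate (the sample DAG on GitHub may wire this differently). The task names and the `fetch_states` helper are illustrative; the operator wiring is shown in comments because it needs a live Airflow installation.

```python
def previous_runs_all_successful(states):
    """Return True only when every previous DAG run ended in 'success'.

    ShortCircuitOperator skips all downstream tasks when its callable
    returns a falsy value, which stops the current DAG run from proceeding.
    An empty list (a first-ever run) is allowed to proceed.
    """
    return all(state == 'success' for state in states)

# Assumed wiring inside the DAG file:
#   from airflow.operators.python_operator import ShortCircuitOperator
#   check_previous_runs = ShortCircuitOperator(
#       task_id='check_previous_dag_runs',
#       python_callable=lambda: previous_runs_all_successful(fetch_states()),
#       dag=dag,
#   )
#   check_previous_runs >> first_real_task
```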

Here is the link to the GitHub repo: Click Here to go to GitHub for DeliverBI

Over and Out. Krishna

About Authors

Shahed Munir

Krishna Udathu

Shahed and Krishna are Oracle / Big Data Experts