Exposes DAG and task based metrics from Airflow to a Prometheus-compatible endpoint.

Current version is compatible with Airflow 2.0+.

For older Airflow releases, use an earlier version of the plugin:

- Version v1.3.2 is compatible with Airflow 1.10.x.

  Note: Airflow 1.10.14 with Python 3.8 users should install the `importlib-metadata` package in order for the plugin to be loaded. See #85 for details.

- Version v0.5.4 is compatible with earlier Airflow releases.
```sh
pip install airflow-exporter
```

That's it. You're done.
It is possible to add extra labels to DAG-related metrics by providing a `labels` dict to the DAG `params`:
```python
from datetime import timedelta

from airflow import DAG

# default_args is assumed to be defined elsewhere in the DAG file
dag = DAG(
    'dummy_dag',
    schedule_interval=timedelta(hours=5),
    default_args=default_args,
    catchup=False,
    params={
        'labels': {
            'env': 'test'
        }
    }
)
```

The label `env` with value `test` will be added to all metrics related to `dummy_dag`:
```
airflow_dag_status{dag_id="dummy_dag",env="test",owner="owner",status="running"} 12.0
```
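As a rough illustration, a metric line like the one above can be parsed with a simplified regex to confirm that the extra label is present. This is only a sketch for the single-line case, not a full Prometheus exposition-format parser and not part of the plugin:

```python
import re

# Sample metric line in the Prometheus exposition format.
sample = 'airflow_dag_status{dag_id="dummy_dag",env="test",owner="owner",status="running"} 12.0'

# Simplified parse: metric name, label pairs, value.
# Does not handle escaping, timestamps, or HELP/TYPE lines.
match = re.match(r'(\w+)\{(.*)\}\s+(\S+)', sample)
name, raw_labels, value = match.groups()
labels = dict(re.findall(r'(\w+)="([^"]*)"', raw_labels))

# name == 'airflow_dag_status'; labels['env'] == 'test'; float(value) == 12.0
```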
Metrics will be available at `http://<your_airflow_host_and_port>/admin/metrics/`.
`airflow_task_status`

Labels:

- `dag_id`
- `task_id`
- `owner`
- `status`

Value: number of tasks in a specific status.
`airflow_dag_status`

Labels:

- `dag_id`
- `owner`
- `status`

Value: number of DAGs in a specific status.
`airflow_dag_run_duration`

Labels:

- `dag_id`: unique identifier for a given DAG

Value: duration in seconds of the longest DAG Run for the given DAG. This metric is not available for DAGs that have already finished.
`airflow_dag_last_status`

Labels:

- `dag_id`
- `owner`
- `status`

Value: 0 or 1 depending on whether the current state of each `dag_id` is `status`.
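For illustration, the 0/1 semantics can be sketched in plain Python. The status list and helper function below are hypothetical, not part of the plugin:

```python
# Hypothetical status set; the plugin's actual set of DagRun states may differ.
STATUSES = ['queued', 'running', 'success', 'failed']

def one_hot_status(current_state: str) -> dict:
    """One series per status, valued 1 only for the DAG's current state."""
    return {status: int(status == current_state) for status in STATUSES}

series = one_hot_status('success')
# series == {'queued': 0, 'running': 0, 'success': 1, 'failed': 0}
```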
Distributed under the BSD license. See LICENSE for more information.