I want to run Airflow dags and watch the logs in the terminal.

Trouble is, each time a task is run a new directory and file is created. Something like:


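With the stock settings, the filename template is roughly `{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log` (the exact default varies by version), so a couple of runs leave a tree like this (paths illustrative):

```
~/airflow/logs/my-dag/my-task/2019-01-01T00:00:00+00:00/1.log
~/airflow/logs/my-dag/my-task/2019-01-02T00:00:00+00:00/1.log
```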
This makes it hard to tail-follow the logs. Thankfully, starting from Airflow 1.9, logging can be configured easily, allowing you to put all of a dag’s logs into one file.

Two caveats before you do this:
  1. If you make this change, you won’t be able to view task logs in the web UI, because the UI expects log filenames to be in the normal format.

  2. Logging to a single file is useful for development (using the SequentialExecutor), but it’s not recommended in production because issues will arise when multiple tasks attempt to write to the same log file at once.

Easy Solution

Requires Airflow 1.10+

Set the log_filename_template setting.

export AIRFLOW__CORE__LOG_FILENAME_TEMPLATE="{{ ti.dag_id }}.log"

Requires Airflow 1.9+

Since Airflow 1.9, logging is configured pythonically.

Grab Airflow’s default log config, and copy it somewhere on your PYTHONPATH.

curl -O

Set the logging_config_class setting, making sure it’s set in both the scheduler’s and the workers’ environments. (Alternatively, set the corresponding option in airflow.cfg.)
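For example, if you saved the config module as log_config.py and it exposes a dict named LOGGING_CONFIG (both names are illustrative, chosen here for the sketch), the environment variable would look like:

```shell
# Assumes the copied config lives in log_config.py on the PYTHONPATH and
# defines a dict named LOGGING_CONFIG; adjust both to match your file.
export AIRFLOW__CORE__LOGGING_CONFIG_CLASS=log_config.LOGGING_CONFIG
```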


Now you can configure logging to your liking.

Edit the copied config module, changing FILENAME_TEMPLATE to:

FILENAME_TEMPLATE = '{{ ti.dag_id }}.log'

You should now get all of a dag’s log output in a single file.

Tailing the logs

Start the scheduler and trigger a dag.

$ airflow scheduler
$ airflow trigger_dag my-dag

Watch the output with tail -f.

$ tail -f ~/airflow/logs/my-dag.log