
Data Science DAGs with Apache Airflow 2.9

This repository contains example DAGs showing features released in Apache Airflow 2.9.

Aside from core Apache Airflow, this project uses several Airflow provider packages. For the pinned versions of the provider packages, see the requirements.txt file.

Note

You can find new Airflow 2.9 features in the DAG code by searching for # NEW in Airflow 2.9:.
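
For example, a marked feature inside a DAG file might look like this (a hypothetical snippet illustrating the convention, not code taken from a specific DAG in this repository):

```python
from airflow.decorators import task


# NEW in Airflow 2.9: the @task.bash TaskFlow decorator
@task.bash
def create_file() -> str:
    # The returned string is executed as a bash command
    return "echo 'hello' > /tmp/hello.txt"
```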

How to use this repository

This section explains how to run this repository with Airflow.

Note

Some DAGs in this repository require additional connections or tools. You can define these connections in the Airflow UI under Admin > Connections or in a .env file using the format shown in .env.example. The load_to_snowflake DAG requires additional setup in Snowflake; see the DAG docstring for more information. DAGs with the tag toy work without any additional connections or tools.

See the Manage Connections in Apache Airflow guide for further instructions on Airflow connections.
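
For example, a connection can be defined in .env as an environment variable following Airflow's AIRFLOW_CONN_<CONN_ID> naming convention. The connection ID and credential values below are placeholders; use the exact IDs and formats shown in .env.example:

```
AIRFLOW_CONN_AWS_DEFAULT='aws://<your_access_key_id>:<your_secret_access_key>@'
```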

Steps to run this repository

Download the Astro CLI to run Airflow locally in Docker. astro is the only package you will need to install.

  1. Run git clone https://github.com/astronomer/2-9-example-dags.git on your computer to create a local clone of this repository.
  2. Install the Astro CLI by following the steps in the Astro CLI documentation. Docker Desktop/Docker Engine is a prerequisite, but you don't need in-depth Docker knowledge to run Airflow with the Astro CLI.
  3. Run astro dev start in your cloned repository.
  4. After your Astro project has started, view the Airflow UI at localhost:8080. The full command sequence is summarized below.
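
Assuming the Astro CLI and Docker are already installed, the full sequence is:

```sh
git clone https://github.com/astronomer/2-9-example-dags.git
cd 2-9-example-dags
astro dev start
```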

DAGs

The following sections list the DAGs sorted by the feature they showcase. You can filter DAGs in the Airflow UI by their tags.

data-engineering-use-case

The DAGs in the data-engineering-use-case folder showcase a data engineering use case built on AWS and Snowflake, with several Airflow 2.9 features implemented throughout. A sketch of one such feature follows.
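
One Airflow 2.9 feature that fits this kind of pipeline is combined dataset and time-based scheduling via DatasetOrTimeSchedule. Below is a minimal sketch of the pattern; the DAG name and dataset URI are hypothetical, not taken from this repository:

```python
from pendulum import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.timetables.trigger import CronTriggerTimetable


@dag(
    start_date=datetime(2024, 1, 1),
    # NEW in Airflow 2.9: run on a time schedule OR when a dataset is updated
    schedule=DatasetOrTimeSchedule(
        timetable=CronTriggerTimetable("0 0 * * *", timezone="UTC"),
        datasets=Dataset("s3://my-bucket/raw_data"),  # hypothetical dataset URI
    ),
    catchup=False,
)
def combined_schedule_example():
    @task
    def report():
        print("Pipeline ran!")

    report()


combined_schedule_example()
```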

Dataset toys

The DAGs in the toy_conditional_dataset_scheduling folder show new ways to use dataset scheduling without needing any additional connections or tools.
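
For illustration, a minimal DAG using the new logical operators for dataset scheduling might look like this (a sketch with hypothetical dataset URIs; see the folder for the actual toy DAGs):

```python
from pendulum import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task

dataset_a = Dataset("file://include/dataset_a")  # hypothetical URIs
dataset_b = Dataset("file://include/dataset_b")
dataset_c = Dataset("file://include/dataset_c")


@dag(
    start_date=datetime(2024, 1, 1),
    # NEW in Airflow 2.9: logical operators in dataset schedules
    schedule=(dataset_a & dataset_b) | dataset_c,
    catchup=False,
)
def toy_conditional_dataset_example():
    @task
    def downstream():
        print("At least one scheduling condition was met!")

    downstream()


toy_conditional_dataset_example()
```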

Other

Useful links

Project Structure

This repository contains the following files and folders:

  • .astro: files necessary for Astro CLI commands.
  • dags: all DAGs in your Airflow environment. Files in this folder are parsed by the Airflow scheduler when looking for DAGs to add to your environment. You can add your own DAG files to this folder.
  • include: supporting files that are included in the Airflow environment. Among other files, this folder contains the code for the listener plugin in include/listeners.py (a sketch follows this list).
  • plugins: folder for Airflow plugins. Contains a listener plugin.
  • tests: folder for pytest tests that run against the DAGs in the Airflow environment. Contains default tests.
  • .astro-registry.yaml: file to configure DAGs being uploaded to the Astronomer registry. Can be ignored for local development.
  • .dockerignore: list of files to ignore for Docker.
  • .env.example: example environment variables for the DAGs in this repository. Copy this file to .env and replace the values with your own credentials.
  • .gitignore: list of files to ignore for git.
  • Dockerfile: the Dockerfile using the Astro CLI. Sets environment variables to change Airflow webserver settings.
  • packages.txt: system-level packages to be installed in the Airflow environment upon building of the Docker image. Empty.
  • README.md: this Readme.
  • requirements.txt: Python packages installed when the Docker image is built, for use by the DAGs in this repository.
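
For illustration, a minimal listener module of the kind that lives in include/listeners.py might look like this (a sketch using Airflow's listener hooks, not the actual code in this repository):

```python
# include/listeners.py (illustrative sketch)
from airflow.listeners import hookimpl


@hookimpl
def on_task_instance_success(previous_state, task_instance, session):
    # React whenever any task instance reaches the success state
    print(f"Task {task_instance.task_id} succeeded!")
```

The plugin in the plugins folder then registers this module with Airflow by listing it in the listeners attribute of an AirflowPlugin subclass.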
