This intensive course immerses participants in Apache Airflow's architecture and configuration, guiding them through environment setup, executor selection, and the development of robust DAGs in Python. Through hands-on exercises, ranging from dynamic task mapping and templating to cloud integrations and custom plugin development, attendees will master best practices for automating, monitoring, and optimizing production-ready workflows.
This course provides a comprehensive introduction to Apache Airflow, covering its architecture, configuration, and workflow automation capabilities. Participants will learn how to set up and manage Airflow environments, configure executors, and develop DAGs using Python. The course explores essential components like tasks, operators, variables, and connections, as well as advanced topics such as dynamic DAGs, templating, and custom plugins. Hands-on exercises include running DAGs, scheduling tasks, integrating cloud providers, and monitoring workflows through logs and the Airflow UI. By the end of the course, participants will be equipped to build, automate, and optimize data pipelines using Airflow.
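The DAG concept at the heart of these pipelines can be illustrated without Airflow itself. The sketch below is plain Python, not the Airflow API: the task names (`extract`, `transform`, `load`) and the dependency map are hypothetical, and serve only to show what "tasks executed in dependency order" means.

```python
# Illustrative sketch only: tasks with dependencies executed in
# topological order, the core idea behind an Airflow DAG.
# (Real Airflow DAGs use Airflow's DAG/operator APIs instead.)
from graphlib import TopologicalSorter

# Each key lists the tasks it depends on, similar in spirit to
# Airflow's `upstream >> downstream` notation.
dependencies = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

def run_pipeline(deps):
    """Run each task once all of its dependencies have completed."""
    order = list(TopologicalSorter(deps).static_order())
    for task in order:
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run_pipeline(dependencies)  # extract, then transform, then load
```

In Airflow proper, the scheduler performs this dependency resolution automatically, and each task runs on an executor rather than in a local loop.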
21 hours of live, instructor-led training, delivered over three to five days to accommodate varied scheduling needs.
Students receive comprehensive courseware, including reference documents, code samples, and lab guides.