This intensive 14-hour course delivers a deep dive into Apache Airflow's architecture and core components (DAGs, operators, schedulers, and executors) while contrasting it with cron jobs and Celery. Through hands-on labs covering installation with Python and PostgreSQL, deployment on Kubernetes (AWS EKS with Helm), custom container image building, and monitoring with logs and Grafana, participants will master the skills to configure, scale, and optimize production-grade workflow automation solutions.
Apache Airflow is a powerful workflow automation platform for managing complex data pipelines. Participants will explore its architecture, including Directed Acyclic Graphs (DAGs), operators, schedulers, and executors, and see how it compares with simpler scheduling approaches such as cron and Celery. The course covers installation, configuration, and deployment on Kubernetes (including AWS EKS) using Helm. Attendees will gain hands-on experience deploying Airflow, optimizing workflows, customizing container images, and monitoring performance using logging and metrics. Designed for professionals, this course ensures participants can build scalable, reliable, and efficient workflow automation solutions.
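To give a flavor of the material, the kind of pipeline definition participants will write looks roughly like the sketch below. It is a minimal illustration, assuming Airflow 2.x; the DAG id and the placeholder extract/load tasks are hypothetical names chosen for the example, not part of the course materials.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # A minimal DAG: two shell tasks chained so the scheduler always
    # runs "extract" before "load", once per day.
    with DAG(
        dag_id="example_pipeline",        # hypothetical name for illustration
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # Airflow >= 2.4; older releases use schedule_interval
        catchup=False,                    # do not backfill runs before today
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extracting")
        load = BashOperator(task_id="load", bash_command="echo loading")
        extract >> load                   # >> declares the dependency edge in the DAG

The >> operator is how task dependencies are declared; the scheduler then derives the execution order from this graph and hands task runs to the configured executor.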
14 hours of intensive, instructor-led live training, delivered over three to five days to accommodate varied scheduling needs.
Students receive comprehensive courseware, including reference documents, code samples, and lab guides.