Airflow Orchestration and Deployment
This course provides a deep dive into data engineering with Airflow. We’ve designed it to cover the fundamental concepts of production data pipelines, best practices using the latest features up to Airflow 2.5, and a survey of open source libraries built on top of Airflow that simplify analytics workflow development, including pipelines that orchestrate dbt.
This course is meant for data, backend, or analytics engineers who have some familiarity with Airflow, but would like to learn how to best utilize it to deliver repeatable stakeholder value without blowing up data team spend. After this course, you will have the skills needed to build dependable scheduled data pipelines, as well as an understanding of the libraries and frameworks that can help accelerate the development and augment the capabilities of these systems.
Course taught by expert instructors

Henry Weller
Data Engineer at MongoDB
Henry is an experienced system builder focused on designing, debugging, and optimizing data workflows. He is currently a Data Engineer at MongoDB, building systems used by thousands of data subscribers. He previously held technical roles at Kevala, where he built a modern data stack orchestrated with Airflow spanning Spark, BigQuery, and Google Data Studio, and at Osaro, where he wrote control software for autonomous robotic systems.

Julian LaNeve
Product @ Astronomer
Julian is on the product team at Astronomer, one of the driving forces behind Apache Airflow. He currently leads all developer experience-related initiatives, both commercial and open source. He's also authored and contributed to a number of open source projects, including Airflow itself, Cosmos (an Airflow dbt integration), Airflow providers for Databricks and DuckDB, and Ruff.
The course
Learn and apply skills with real-world projects.
Software engineers and data scientists new to the field of data engineering who are interested in learning how to build modern data pipelines.
Data engineers looking to level up their ETL skills, and adopt a general framework for their data use cases.
Ability to write Python and work with documented libraries
Familiarity with web applications, Docker basics, and the command line
Nice to have: Familiarity with Kubernetes, working within cloud compute environments such as AWS/GCP
- Learn
  - DAGs
  - Operators, Tasks and Task Groups
  - Testing and debugging Airflow code
  - Intro to TaskFlow API
  - Astro CLI
  - Airflow Architecture
  - Google Cloud Storage
Project: Create a basic cluster deployment and your first useful DAG for processing timeseries energy data using the TaskFlow API (sketched below).
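To give a flavor of the TaskFlow API, here is a minimal sketch of an hourly DAG in the shape of this project. The task names, schedule, and stubbed data are illustrative placeholders, not the actual course code.

```python
# Illustrative sketch only: a TaskFlow-style DAG for hourly timeseries data.
# All names and values are placeholders, not the course's project code.
import pendulum
from airflow.decorators import dag, task

@dag(
    schedule="@hourly",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    catchup=False,
    tags=["energy"],
)
def energy_timeseries():
    @task
    def extract() -> list[dict]:
        # The project pulls real timeseries energy data; a stub stands in here.
        return [{"ts": "2023-01-01T00:00:00+00:00", "mwh": 42.0}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Trivial example transformation: derive kWh from MWh.
        return [{**r, "kwh": r["mwh"] * 1000} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # The project writes to Google Cloud Storage; printing stands in here.
        print(f"Loading {len(records)} records")

    # TaskFlow infers the dependency graph from these function calls.
    load(transform(extract()))

energy_timeseries()
```

Returning a value from one `@task` function and passing it to the next is all it takes to wire up dependencies; Airflow handles the XCom plumbing behind the scenes.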
- Learn
  - Sensors
  - XComs
  - Custom XCom Backends
  - Dynamic Task Mapping
Project: Build an energy price prediction ML pipeline, using XComs and dynamic task mapping to select the best model and save it to GCS (sketched below).
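As a rough sketch of the pattern this project uses, the snippet below fans one task out over several candidate models with dynamic task mapping, then reduces their XCom results to pick a winner. The model names and hard-coded scores are placeholders for real training logic.

```python
# Illustrative sketch only: dynamic task mapping plus an XCom reduce step.
# Model names and scores are placeholders, not the course's pipeline.
import pendulum
from airflow.decorators import dag, task

@dag(schedule=None, start_date=pendulum.datetime(2023, 1, 1, tz="UTC"), catchup=False)
def model_selection():
    @task
    def train_and_score(model_name: str) -> dict:
        # Each mapped task instance trains one candidate; its return value
        # is pushed to XCom automatically.
        fake_scores = {"linear": 0.81, "xgboost": 0.88, "prophet": 0.84}
        return {"model": model_name, "score": fake_scores[model_name]}

    @task
    def pick_best(results: list[dict]) -> str:
        # Reduce step: read every mapped task's XCom and keep the winner.
        # (The course project would then save the chosen model to GCS.)
        best = max(results, key=lambda r: r["score"])
        print(f"Best model: {best['model']} ({best['score']:.2f})")
        return best["model"]

    # expand() creates one task instance per candidate at runtime.
    scores = train_and_score.expand(model_name=["linear", "xgboost", "prophet"])
    pick_best(scores)

model_selection()
```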
- Learn
  - Hooks and Dedicated Operators
  - Data Warehouse Transformations
  - Large Volume Workflows
  - Control Plane vs Data Plane
  - BigQuery
Project: Use dedicated BigQuery operators to create external tables and views from raw data, and perform SQL transformations and joins across data sources (sketched below).
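The shape of this project looks roughly like the sketch below, which points a BigQuery external table at files in GCS and then runs a SQL transformation over it, assuming a recent version of Airflow's Google provider. The project, dataset, bucket, and query are placeholders.

```python
# Illustrative sketch only: dedicated BigQuery operators for an external
# table plus a SQL view. Project/dataset/bucket names are placeholders.
import pendulum
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCreateExternalTableOperator,
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="bq_transformations",
    schedule="@daily",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    # External table: BigQuery reads the CSVs in place from GCS.
    create_external = BigQueryCreateExternalTableOperator(
        task_id="create_external_table",
        table_resource={
            "tableReference": {
                "projectId": "my-project",
                "datasetId": "energy",
                "tableId": "prices_raw",
            },
            "externalDataConfiguration": {
                "sourceFormat": "CSV",
                "sourceUris": ["gs://my-bucket/prices/*.csv"],
                "autodetect": True,
            },
        },
    )

    # SQL transformation: define a view over the external table.
    create_view = BigQueryInsertJobOperator(
        task_id="create_daily_view",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE VIEW energy.daily_avg AS "
                    "SELECT DATE(ts) AS day, AVG(price) AS avg_price "
                    "FROM energy.prices_raw GROUP BY day"
                ),
                "useLegacySql": False,
            }
        },
    )

    create_external >> create_view
```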
- Learn
  - Astro SDK
  - Data Warehouse Transformations Revisited
  - Linking Pipelines Together Using Datasets
  - Making Airflow more accessible to data analysts and SQL experts
Project: Write an Astro SDK DAG that tightly couples Airflow tasks with data assets using the File and Table abstractions (sketched below).
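For a sense of how the Astro SDK ties tasks to data assets, here is a minimal sketch assuming a recent astro-sdk-python release: `load_file` lands a file in a warehouse table, and an `@aql.transform` task runs templated SQL over it. The GCS path, connection ID, and table names are placeholders.

```python
# Illustrative sketch only: Astro SDK File/Table abstractions.
# The GCS path, conn_id, and table names are placeholders.
import pendulum
from airflow.decorators import dag
from astro import sql as aql
from astro.files import File
from astro.table import Table

@aql.transform
def top_prices(prices: Table):
    # SQL-first task: the SDK resolves {{ prices }} to the upstream table.
    return "SELECT * FROM {{ prices }} ORDER BY price DESC LIMIT 100"

@dag(schedule=None, start_date=pendulum.datetime(2023, 1, 1, tz="UTC"), catchup=False)
def astro_sdk_pipeline():
    # File -> Table: the SDK handles the load into the warehouse.
    raw = aql.load_file(
        input_file=File(path="gs://my-bucket/prices.csv"),
        output_table=Table(name="prices_raw", conn_id="bigquery_default"),
    )
    top_prices(raw, output_table=Table(name="prices_top", conn_id="bigquery_default"))

astro_sdk_pipeline()
```

Because tasks pass `Table` objects rather than raw XCom values, the pipeline stays readable to data analysts and SQL experts who aren't Airflow specialists.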
A course you'll actually complete. AI-powered learning that drives results.
AI-powered learning
Transform your learning programs with personalized learning: real-time feedback, hints at just the right moment, and support for learners when they need it, driving 15x engagement.
Live courses by leading experts
Our instructors are renowned experts in AI, data, engineering, product, and business. Dive deep through always-current live sessions, with round-the-clock support.
Practice on the cutting edge
Accelerate your learning with projects that mirror the work done at industry-leading tech companies. Put your skills to the test and start applying them today.
Flexible schedule for busy professionals
We know you’re busy, so we made it flexible. Attend live events or review the materials at your own pace. Our course team and global community will support you every step of the way.
Completion certificates
Each course comes with a certificate for learners to add to their resume.
Best-in-class outcomes
15-20x engagement compared to async courses
Support & accountability
You are never alone; we provide support throughout the course.
Get reimbursed by your company
More than half of learners get their Courses and Memberships reimbursed by their company.
Hundreds of companies have dedicated L&D and education budgets that have covered the costs.
Course success stories
Learn together and share experiences with other industry professionals
As a product management professional with 20+ years of experience, I was nervous and excited to enroll in a Master's-level course. Henry, a consummate teacher, played a tremendous role in helping me navigate a tough curriculum and be successful in my endeavor to learn. On many occasions, Henry helped clarify foundational elements needed for the course, reviewed specific lecture material, taught concepts from supplementary reading material suggested in the course, and even helped me learn other reference material that would reinforce the learning.
Henry uses Airflow to implement data pipelines that handle large volumes of data ingestion, and he shared his wealth of practical Airflow knowledge with colleagues to help them build modern data pipelines that are future-proof. Henry is very knowledgeable on leveraging different Airflow operators that help improve productivity and performance.
The Uplimit Platform is one that would challenge you to learn and become a better version of yourself. With the maximum commitment from tutors, TAs, and the community, data professionals and engineers taking courses on Uplimit are already set up for success. Highly recommended.