
Tutorial: Apache Airflow for Beginners



Airflow

Intro


Airflow has become one of the main orchestration tools on the market and is much talked about in the Modern Data Stack world, since it can orchestrate data workloads through ETLs or ELTs. But Airflow is not just about that: it can be applied to many day-to-day use cases of a Data or Software Engineer.


In this Apache Airflow for Beginners tutorial, we will introduce Airflow in the simplest possible way, with no need to know about or create ETLs.


But what is Airflow actually?


Apache Airflow is a widely used workflow orchestration platform for scheduling, monitoring, and managing data pipelines. It has several components that work together to provide its functionalities.

 

Airflow components


DAG


The DAG (Directed Acyclic Graph) is the main component and the workflow representation in Airflow. It is composed of tasks and the dependencies between them. Tasks are defined through operators, such as the PythonOperator, BashOperator, SQL operators, and others. The DAG defines the task execution order and the dependency relationships.
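As a quick illustration (a minimal sketch, not taken from this tutorial's project; DAG and task names are made up), a DAG with three tasks and the dependencies between them could look like this:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative DAG: three tasks that must run in order.
with DAG(
    dag_id="example_task_dependencies",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    transform = BashOperator(task_id="transform", bash_command="echo transforming")
    load = BashOperator(task_id="load", bash_command="echo loading")

    # The >> operator defines the execution order and the dependencies between tasks.
    extract >> transform >> load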


Webserver


The Webserver component provides a web interface for interacting with Airflow. It allows you to view, manage, and monitor your workflows, tasks, DAGs, and logs. The Webserver also supports user authentication and role-based access control.


Scheduler


The Scheduler is responsible for scheduling the execution of tasks according to the DAG definition. It periodically checks for pending tasks to run and allocates available resources to perform the tasks at the appropriate time. The Scheduler also handles crash recovery and scheduling task retries.


Executor


The Executor is responsible for executing the tasks defined in the DAGs. There are different types of executors available in Airflow, such as the LocalExecutor, CeleryExecutor, and KubernetesExecutor. Each executor has its own settings and execution behavior.


Metadatabase


The Metadatabase is the database where Airflow stores metadata about tasks, DAGs, runs, schedules, and more. It is used to track task status, record execution history, and provide information for workflow monitoring and visualization. Several databases can be used for this purpose, such as MySQL and Postgres.


Workers


Workers are the execution nodes in a distributed environment. They receive tasks assigned by the Scheduler and execute them. Workers can be scaled horizontally to handle larger data pipelines or to spread the workload across multiple resources.



Plugins


Plugins are Airflow extensions that allow you to add new features and functionality to the system. They can include new operators, hooks, sensors, connections to external systems, and more. Plugins provide a way to customize and extend Airflow's capabilities to meet the specific needs of a workflow.
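To give an idea of the shape of a plugin (a minimal sketch; the plugin name and the macro are hypothetical, not from this tutorial's project), a file dropped into the plugins/ folder could look like this:

from airflow.plugins_manager import AirflowPlugin


def greet(name):
    # Hypothetical helper exposed to templates via the plugin below.
    return f"Hello, {name}!"


class MyCustomPlugin(AirflowPlugin):
    # Name under which Airflow registers this plugin.
    name = "my_custom_plugin"
    # Expose the function above in the Jinja macros namespace for templated fields.
    macros = [greet]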

 

Operators


Operators are basically the building blocks of a DAG. Think of an operator as a block of code with a single responsibility. Because Airflow is an orchestrator that executes workflows, we may have many different kinds of tasks to perform, such as calling an API, sending an email, querying a table in a database, running Python code, or even a Bash command.


For each of the above tasks, we must use an operator. Next, we will discuss some of the main operators:


BashOperator


The BashOperator allows you to run Bash commands or scripts directly on the operating system where Airflow is running. It is useful for tasks that involve running shell scripts, utilities, or any action that can be performed in the terminal. In short, whenever we would open a terminal and run a command to manipulate files or interact with the system itself, but want to do it from within a DAG, this is the operator to use.
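A hedged sketch of what that can look like (the DAG skeleton and the shell command are illustrative, not part of the tutorial project):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_bash",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Runs a shell command on the machine/worker where the task executes.
    cleanup = BashOperator(
        task_id="cleanup_tmp",
        bash_command="find /tmp -name '*.csv' -mtime +7 -delete",
    )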


PythonOperator


The PythonOperator allows you to run Python functions as tasks in Airflow. You can write your own custom Python functions and use the PythonOperator to call those functions as part of your workflow.
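A brief sketch of the idea (a complete, working hello-world DAG appears in the hands-on section below; the function here is just an example):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def count_letters():
    # Any custom Python logic can live here.
    return len("airflow")


with DAG(
    dag_id="example_python",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@once",
    catchup=False,
) as dag:
    count_task = PythonOperator(
        task_id="count_letters",
        python_callable=count_letters,  # the function to execute as a task
    )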


DummyOperator


The DummyOperator is a "dummy" task that takes no action. It is useful for creating complex dependencies and workflows without having to perform any real action.
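For example (an illustrative sketch; in newer Airflow releases the equivalent operator is called EmptyOperator), a DummyOperator is often used as a start or join point in the flow:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator  # EmptyOperator in newer Airflow versions

with DAG(
    dag_id="example_dummy",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@once",
    catchup=False,
) as dag:
    start = DummyOperator(task_id="start")  # takes no action, only marks a point in the flow
    task_a = BashOperator(task_id="task_a", bash_command="echo a")
    task_b = BashOperator(task_id="task_b", bash_command="echo b")
    end = DummyOperator(task_id="end")

    # Fan out after start and join again at end.
    start >> [task_a, task_b] >> end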


Sensor


Sensors are used to wait for some external event to occur before the workflow continues; they work like listeners. For example, the HttpSensor, a type of Sensor, can check whether an external API is up and, if so, let the flow continue. It is not an HTTP operator that returns a response, but rather a kind of listener.
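A hedged sketch of that idea (the connection id and endpoint are assumptions; the HttpSensor comes from the apache-airflow-providers-http package and relies on an HTTP connection configured in Airflow):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.http.sensors.http import HttpSensor

with DAG(
    dag_id="example_http_sensor",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Polls the API until it responds, only then letting the flow continue.
    wait_for_api = HttpSensor(
        task_id="wait_for_api",
        http_conn_id="my_api",   # assumed connection configured in the Airflow UI
        endpoint="health",       # assumed health-check endpoint
        poke_interval=60,        # check every 60 seconds
        timeout=600,             # give up after 10 minutes
    )
    process = BashOperator(task_id="process", bash_command="echo API is up")

    wait_for_api >> process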


HttpOperator


Unlike a Sensor, the HttpOperator is used to perform HTTP requests such as GET, POST, PUT, and DELETE. It allows you to interact more fully with internal or external APIs.
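A sketch under the same assumptions (the connection id, endpoint, and payload are illustrative; in the HTTP provider package this operator is exposed as SimpleHttpOperator):

from datetime import datetime

from airflow import DAG
from airflow.providers.http.operators.http import SimpleHttpOperator

with DAG(
    dag_id="example_http_operator",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Performs a POST request against an API defined by an Airflow connection.
    notify = SimpleHttpOperator(
        task_id="notify_api",
        http_conn_id="my_api",      # assumed connection configured in Airflow
        endpoint="events",          # assumed endpoint
        method="POST",
        data='{"status": "pipeline_finished"}',
        headers={"Content-Type": "application/json"},
    )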


SqlOperator


SQL operators are responsible for performing DML and DDL operations in a database, that is, data manipulations such as SELECTs, INSERTs, UPDATEs, and so on.
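In practice, each database provider ships its own flavor of SQL operator. A hedged sketch using the Postgres provider (the connection id, table, and SQL statement are assumptions; requires the apache-airflow-providers-postgres package):

from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="example_sql",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Runs a DML statement against a Postgres database defined by an Airflow connection.
    insert_row = PostgresOperator(
        task_id="insert_audit_row",
        postgres_conn_id="my_postgres",  # assumed connection configured in Airflow
        sql="INSERT INTO audit_log (event, created_at) VALUES ('dag_run', NOW());",
    )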


 

Executors


Executors are responsible for executing the tasks defined in a workflow (DAG). They manage the allocation and execution of tasks at runtime, ensuring that each task runs efficiently and reliably. Airflow offers different types of executors, each with different characteristics and functionalities, allowing you to choose the most suitable one for your specific needs. Below, we'll cover some of the main executors:


LocalExecutor


The LocalExecutor is the simplest executor that still runs tasks in parallel: it executes tasks as separate processes on the same machine where the Scheduler runs. It is designed for development and test environments, or for smaller single-node pipelines, where horizontal scalability is not a concern.


CeleryExecutor


If you need an executor for distributed, high-scale environments, the CeleryExecutor is an excellent choice. It uses Celery, a distributed task queue, to spread tasks across separate worker nodes. This approach makes Airflow well suited to running pipelines on clusters of servers, allowing you to scale horizontally on demand.


KubernetesExecutor


For environments that use Kubernetes as their container orchestration platform, KubernetesExecutor is a natural choice. It leverages Kubernetes' orchestration capability to run tasks in separate pods, which can result in better resource isolation and easier task execution in containers.


DaskExecutor


If your workflow requires parallel and distributed processing, DaskExecutor might be the right choice. It uses the Dask library to perform parallel computing on a cluster of resources. This approach is ideal for tasks that can be divided into independent sub-tasks, allowing better use of available resources.

 

Programming language


Airflow uses Python as its programming language. Honestly, this is not a blocker for those who don't know the language well. In practice, the process of creating DAGs follows a standard structure; what changes according to your needs are the operators you work with, which may or may not require writing much Python.

 

Hands-on


Setting up the environment


For this tutorial we will use Docker, which will help us provision our environment without having to install Airflow manually.


If you don't have Docker installed, I recommend following the instructions at this link and, once it's installed, coming back to continue the tutorial.


Downloading project


To make it easier, clone the project from the following repository and follow the steps to deploy Airflow.


Steps to deploy


With Docker installed and the project downloaded as described in the previous step, go to the directory where the project is located, open a terminal, and run the following Docker command:


docker-compose up

The command above will start the Docker containers running the Airflow services themselves, Postgres, and more.

If you're curious about how these services are mapped, open the project's docker-compose.yaml file and there you'll find more details.


After running the command above and once the containers have started, open the following address in your browser: http://localhost:8080/


A screen like the one below will open; just type airflow as both the username and the password to access the Airflow UI.

Airflow Login

 

Creating a DAG


Creating a simple Hello World


For this tutorial, we will create a simple DAG that prints the classic "Hello World". In the project you downloaded, go to the /dags folder and create the following Python file, called hello_world.py.
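The original post shows this code as an image; the sketch below reconstructs it from the description that follows (the names print_hello, hello_task, and hello_operator match the text; the start_date value is an assumption):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def print_hello():
    # Function that the DAG will call through the PythonOperator.
    print("Hello World")


with DAG(
    dag_id="hello_world",
    start_date=datetime(2023, 1, 1),  # reference date; assumed value
    schedule_interval="@once",        # run only once
    catchup=False,                    # no retroactive (backfill) runs
) as dag:

    hello_task = PythonOperator(
        task_id="hello_operator",
        python_callable=print_hello,
    )

    hello_task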



The code above is a simple example of a DAG written in Python. Notice that we start by importing what we need, including the DAG class itself, datetime-related functions, and the PythonOperator.


Next, we create a Python function called print_hello that prints "Hello World" to the console. This function will be called by the DAG later on.


The declaration of the DAG starts with the with DAG(...) syntax, passing a few arguments such as:


  • dag_id: DAG identifier in Airflow context


  • start_date: the date defined here is only a reference point; it is not necessarily the date the DAG starts running, nor the date it was created. Runs usually happen later than the date set in this parameter, and it matters when Airflow needs to calculate the runs between this starting point and the interval defined in the schedule_interval parameter.


  • schedule_interval: this parameter defines how often the DAG will run. Schedules can be expressed as CRON expressions or as predefined strings such as @daily, @hourly, @once, and @weekly. In this example, the flow will run only once.


  • catchup: this parameter controls retroactive (backfill) runs. If set to True, Airflow will run every interval between the date defined in start_date and the current date. In the example above we set it to False because there is no need for retroactive runs.


After filling in these arguments, we create the hello_task inside the DAG itself using the PythonOperator, which provides a way to execute Python functions within a DAG.


Note that we declare an identifier through task_id and, in the python_callable argument, which is specific to the PythonOperator, we pass the print_hello function created earlier.


Finally, we invoke hello_task. This way, the DAG knows that this is the task to be executed.


If you have already deployed it, the DAG will appear in Airflow shortly, ready to be executed, as shown in the image below:

Airflow Tasks
hello_operator Task

After the DAG is created, activate it and execute it by clicking Trigger DAG as shown in the image above. Then click the hello_operator task (center), and a window will open as shown in the image below:


Airflow Task Instance
hello_operator logs

Click the Log button to see more execution details:

Airflow Logs
hello_operator logs

Note how simple it is to create a DAG; just think about the different possibilities and applicable scenarios. In the next tutorials, we'll work through somewhat more complex examples that explore several other scenarios.

 

Conclusion


Based on the simple example shown, Airflow offers a flexible and straightforward approach to controlling automated flows, from creating DAGs to navigating them in its web interface. As mentioned at the beginning, its use is not limited to orchestrating ETLs; it can also be applied to any task that requires controlling flows with dependencies between their components, in contexts that are scalable or not.


GitHub Repository


Hope you enjoyed!