The example DAG we are going to create consists of only one operator, the PythonOperator, which executes a Python function. The task in the DAG is to print a message in the logs. We are going to execute all our DAGs on GCP Cloud Composer.

The first step is to import the necessary modules required for DAG development:

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

The line with DAG defines the DAG, a data pipeline that has basic parameters like dag_id, start_date, and schedule_interval:

with DAG(dag_id="FirstDAG", start_date=datetime(2022, 1, 23), schedule_interval="@hourly") as dag:

The schedule_interval is configured as "@hourly", which indicates that the DAG will run every hour. The PythonOperator is used to execute any callable Python function; here, the callable simply prints a message:

print("First DAG executed Successfully!!")

[Image: Airflow UI]

After successful execution, we should see the message "First DAG executed Successfully" in the logs.

[Image: Logs]

A Use-Case for DAGs

The use-case we are going to cover in this article involves a three-step process. This file should be processed by the PythonOperator in the DAG.
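Putting the pieces above together, the complete DAG file might look like the following sketch. The task_id "print_message", the catchup=False flag, and the try/except guard are additions for illustration (the guard lets the callable be exercised on a machine without Airflow installed; inside Cloud Composer the import always succeeds):

```python
from datetime import datetime


def print_message():
    # Task callable: prints the success message that appears in the task logs.
    msg = "First DAG executed Successfully!!"
    print(msg)
    return msg


# The DAG wiring requires an Airflow installation (e.g. Cloud Composer).
try:
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    with DAG(
        dag_id="FirstDAG",
        start_date=datetime(2022, 1, 23),
        schedule_interval="@hourly",  # run every hour
        catchup=False,  # illustrative: skip backfilling runs since start_date
    ) as dag:
        print_task = PythonOperator(
            task_id="print_message",
            python_callable=print_message,
        )
except ImportError:
    pass  # Airflow not installed; the DAG is only parsed inside Composer.
```

Dropping this file into the Composer environment's dags/ folder is enough for the scheduler to pick it up; no extra registration step is needed.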