On September 15, 2026, all Cloud Composer 1 versions and versions 2.0.x of Cloud Composer 2 will reach their planned end of life. You will not be able to use environments with these versions. We recommend planning migration to Cloud Composer 3. Cloud Composer 2 versions 2.1.x and later are still supported and are not impacted by this change.
This page describes how you can group tasks in your Airflow pipelines
using the following design patterns:
- Grouping tasks in the DAG graph.
- Triggering child DAGs from a parent DAG.
- Grouping tasks with the `TaskGroup` operator.
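Important: Airflow provides SubDAGs to address repeating tasks. Despite being a common design pattern for grouping tasks together, SubDAGs often cause performance and functional issues and are deprecated in Airflow. We recommend that you avoid using SubDAGs to group tasks together in your workflow, and instead use one of the alternative approaches described in this page.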
Group tasks in the DAG graph
To group tasks in certain phases of your pipeline, you can use relationships
between the tasks in your DAG file.
Consider the following example:
Figure 1. Tasks can be grouped together in an Airflow DAG
In this workflow, tasks `op-1` and `op-2` run together after the initial
task `start`. You can achieve this by grouping tasks together with the
statement `start >> [task_1, task_2]`.
The following example provides a complete implementation of this DAG:
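Airflow 2

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

DAG_NAME = "all_tasks_in_one_dag"

args = {"owner": "airflow", "start_date": days_ago(1), "schedule_interval": "@once"}

with DAG(dag_id=DAG_NAME, default_args=args) as dag:
    start = DummyOperator(task_id="start")

    task_1 = BashOperator(task_id="op-1", bash_command=":", dag=dag)

    task_2 = BashOperator(task_id="op-2", bash_command=":", dag=dag)

    some_other_task = DummyOperator(task_id="some-other-task")

    task_3 = BashOperator(task_id="op-3", bash_command=":", dag=dag)

    task_4 = BashOperator(task_id="op-4", bash_command=":", dag=dag)

    end = DummyOperator(task_id="end")

    start >> [task_1, task_2] >> some_other_task >> [task_3, task_4] >> end
```

Airflow 1

```python
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago

DAG_NAME = "all_tasks_in_one_dag"

args = {"owner": "airflow", "start_date": days_ago(1), "schedule_interval": "@once"}

with DAG(dag_id=DAG_NAME, default_args=args) as dag:
    start = DummyOperator(task_id="start")

    task_1 = BashOperator(task_id="op-1", bash_command=":", dag=dag)

    task_2 = BashOperator(task_id="op-2", bash_command=":", dag=dag)

    some_other_task = DummyOperator(task_id="some-other-task")

    task_3 = BashOperator(task_id="op-3", bash_command=":", dag=dag)

    task_4 = BashOperator(task_id="op-4", bash_command=":", dag=dag)

    end = DummyOperator(task_id="end")

    start >> [task_1, task_2] >> some_other_task >> [task_3, task_4] >> end
```

Trigger child DAGs from a parent DAG

You can trigger one DAG from another DAG with the `TriggerDagRunOperator`
operator.

Consider the following example: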
Figure 2. DAGs can be triggered from within a DAG with the `TriggerDagRunOperator`
In this workflow, the blocks `dag_1` and `dag_2` represent a series of tasks
that are grouped together in a separate DAG in the Cloud Composer
environment.
The implementation of this workflow requires two separate DAG files.
The controlling DAG file looks like the following:
Airflow 2
```python
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.utils.dates import days_ago


with DAG(
    dag_id="controller_dag_to_trigger_other_dags",
    default_args={"owner": "airflow"},
    start_date=days_ago(1),
    schedule_interval="@once",
) as dag:
    start = DummyOperator(task_id="start")

    trigger_1 = TriggerDagRunOperator(
        task_id="dag_1",
        trigger_dag_id="dag-to-trigger",  # Ensure this equals the dag_id of the DAG to trigger
        conf={"message": "Hello World"},
    )
    trigger_2 = TriggerDagRunOperator(
        task_id="dag_2",
        trigger_dag_id="dag-to-trigger",  # Ensure this equals the dag_id of the DAG to trigger
        conf={"message": "Hello World"},
    )

    some_other_task = DummyOperator(task_id="some-other-task")

    end = DummyOperator(task_id="end")

    start >> trigger_1 >> some_other_task >> trigger_2 >> end
```
Airflow 1
```python
from airflow import DAG
from airflow.operators.dagrun_operator import TriggerDagRunOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago


with DAG(
    dag_id="controller_dag_to_trigger_other_dags",
    default_args={"owner": "airflow"},
    start_date=days_ago(1),
    schedule_interval="@once",
) as dag:
    start = DummyOperator(task_id="start")

    trigger_1 = TriggerDagRunOperator(
        task_id="dag_1",
        trigger_dag_id="dag-to-trigger",  # Ensure this equals the dag_id of the DAG to trigger
        conf={"message": "Hello World"},
    )
    trigger_2 = TriggerDagRunOperator(
        task_id="dag_2",
        trigger_dag_id="dag-to-trigger",  # Ensure this equals the dag_id of the DAG to trigger
        conf={"message": "Hello World"},
    )

    some_other_task = DummyOperator(task_id="some-other-task")

    end = DummyOperator(task_id="end")

    start >> trigger_1 >> some_other_task >> trigger_2 >> end
```
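Note: The value of `trigger_dag_id` inside `TriggerDagRunOperator` must match the `dag_id` of the DAG that you want to trigger.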
The implementation of the child DAG, which is triggered by the controlling
DAG, looks like the following:
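Airflow 2

```python
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

DAG_NAME = "dag-to-trigger"

with DAG(
    dag_id=DAG_NAME,
    default_args={"owner": "airflow"},
    start_date=days_ago(1),
    schedule_interval=None,  # This DAG runs only when triggered by the controlling DAG
) as dag:
    dag_task = DummyOperator(task_id="dag-task")
```

Airflow 1

```python
from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago

DAG_NAME = "dag-to-trigger"

with DAG(
    dag_id=DAG_NAME,
    default_args={"owner": "airflow"},
    start_date=days_ago(1),
    schedule_interval=None,  # This DAG runs only when triggered by the controlling DAG
) as dag:
    dag_task = DummyOperator(task_id="dag-task")
```

The controlling DAG passes a `conf` payload to each triggered run. As a minimal sketch for Airflow 2 (the `print_message` callable and the `print-message` task are illustrative additions, not part of the original sample), the child DAG could read that payload from the triggering DAG run:

```python
from airflow.operators.python import PythonOperator


def print_message(**context):
    # TriggerDagRunOperator stores its conf dictionary on the triggered DAG run.
    print(context["dag_run"].conf.get("message"))


# Inside the `with DAG(...)` block of the child DAG:
print_task = PythonOperator(
    task_id="print-message",
    python_callable=print_message,
)
```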
You must upload both DAG files to your Cloud Composer environment for the
workflow to work.
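For example, assuming the two DAGs are saved locally as `controller_dag.py` and `dag_to_trigger.py` (hypothetical file names), you could upload them with the Google Cloud CLI:

```bash
# ENVIRONMENT_NAME and LOCATION are placeholders for your environment's
# name and region.
gcloud composer environments storage dags import \
    --environment ENVIRONMENT_NAME \
    --location LOCATION \
    --source controller_dag.py

gcloud composer environments storage dags import \
    --environment ENVIRONMENT_NAME \
    --location LOCATION \
    --source dag_to_trigger.py
```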
Grouping tasks with the TaskGroup operator
This approach works only in Airflow 2.
You can use the `TaskGroup` operator to group tasks
together in your DAG. Tasks defined within a `TaskGroup` block are still part
of the main DAG.
Consider the following example:
Figure 3. Tasks can be visually grouped together in the UI with the `TaskGroup` operator
The tasks `op-1` and `op-2` are grouped together in a block with ID
`taskgroup_1`. An implementation of this workflow looks like the following code:
```python
from airflow.models.dag import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago
from airflow.utils.task_group import TaskGroup

with DAG(dag_id="taskgroup_example", start_date=days_ago(1)) as dag:
    start = DummyOperator(task_id="start")

    with TaskGroup("taskgroup_1", tooltip="task group #1") as section_1:
        task_1 = BashOperator(task_id="op-1", bash_command=":")
        task_2 = BashOperator(task_id="op-2", bash_command=":")

    with TaskGroup("taskgroup_2", tooltip="task group #2") as section_2:
        task_3 = BashOperator(task_id="op-3", bash_command=":")
        task_4 = BashOperator(task_id="op-4", bash_command=":")

    some_other_task = DummyOperator(task_id="some-other-task")

    end = DummyOperator(task_id="end")

    start >> section_1 >> some_other_task >> section_2 >> end
```
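Note that a `TaskGroup` prefixes the IDs of the tasks it contains with its own group ID: in this example, the task `op-1` runs as `taskgroup_1.op-1` in the scheduler and the Airflow UI. Dependencies set on the group object, such as `start >> section_1`, are applied to the group's root and leaf tasks, here `op-1` and `op-2`.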
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-09-05 UTC."],[[["\u003cp\u003eThis document provides methods for grouping tasks within Airflow pipelines, which helps organize and structure complex workflows.\u003c/p\u003e\n"],["\u003cp\u003eTasks can be grouped by defining relationships within the DAG graph, allowing certain tasks to run together sequentially or concurrently using operators such as '>>'.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eTriggerDagRunOperator\u003c/code\u003e allows one DAG to trigger other child DAGs, which is useful for modularizing parts of your pipeline into separate workflows.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eTaskGroup\u003c/code\u003e operator in Airflow 2 offers a way to visually group tasks in the UI and organize them within the same DAG, without the performance issues of SubDAGs.\u003c/p\u003e\n"],["\u003cp\u003eSubDAGs are deprecated, and it is recommended to use alternate approaches like TaskGroup, triggering children DAGs or relationships in the DAG graph to group tasks.\u003c/p\u003e\n"]]],[],null,["\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/group-tasks-inside-dags \"View this page for Cloud Composer 3\") \\| [Cloud Composer 2](/composer/docs/composer-2/group-tasks-inside-dags \"View this page for Cloud Composer 2\") \\| **Cloud Composer 1**\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page describes how you can group tasks in your Airflow pipelines\nusing the following design patterns:\n\n- Grouping tasks in the DAG graph.\n- Triggering children DAGs from a parent DAG.\n- Grouping tasks with the `TaskGroup` operator.\n\n| **Important:** Airflow provides [SubDAGs](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dags.html#subdags) to address repeating tasks. Despite being a common design pattern for grouping tasks together, SubDAGs often cause performance and functional issues, and is deprecated in Airflow. We recommend to **avoid using SubDAGs to group tasks together** in your workflow and prefer one of the alternative approaches described in this page.\n\nGroup tasks in the DAG graph\n\nTo group tasks in certain phases of your pipeline, you can use relationships\nbetween the tasks in your DAG file.\n\nConsider the following example:\n[](/static/composer/docs/images/workflow-group-dags.png) **Figure 1.** Tasks can be grouped together in an Airflow DAG (click to enlarge)\n\nIn this workflow, tasks `op-1` and `op-2` run together after the initial\ntask `start`. 
You can achieve this by grouping tasks together with the statement\n`start \u003e\u003e [task_1, task_2]`.\n\nThe following example provides a complete implementation of this DAG:\n\n\u003cbr /\u003e\n\nAirflow 2\n\n\n from airflow import DAG\n from airflow.operators.bash import BashOperator\n from airflow.operators.dummy import DummyOperator\n from airflow.utils.dates import days_ago\n\n DAG_NAME = \"all_tasks_in_one_dag\"\n\n args = {\"owner\": \"airflow\", \"start_date\": days_ago(1), \"schedule_interval\": \"@once\"}\n\n with DAG(dag_id=DAG_NAME, default_args=args) as dag:\n start = DummyOperator(task_id=\"start\")\n\n task_1 = BashOperator(task_id=\"op-1\", bash_command=\":\", dag=dag)\n\n task_2 = BashOperator(task_id=\"op-2\", bash_command=\":\", dag=dag)\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n task_3 = BashOperator(task_id=\"op-3\", bash_command=\":\", dag=dag)\n\n task_4 = BashOperator(task_id=\"op-4\", bash_command=\":\", dag=dag)\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e [task_1, task_2] \u003e\u003e some_other_task \u003e\u003e [task_3, task_4] \u003e\u003e end\n\n\u003cbr /\u003e\n\nAirflow 1\n\n\n\n from airflow import DAG\n from airflow.operators.bash_operator import BashOperator\n from airflow.operators.dummy_operator import DummyOperator\n from airflow.utils.dates import days_ago\n\n DAG_NAME = \"all_tasks_in_one_dag\"\n\n args = {\"owner\": \"airflow\", \"start_date\": days_ago(1), \"schedule_interval\": \"@once\"}\n\n with DAG(dag_id=DAG_NAME, default_args=args) as dag:\n start = DummyOperator(task_id=\"start\")\n\n task_1 = BashOperator(task_id=\"op-1\", bash_command=\":\", dag=dag)\n\n task_2 = BashOperator(task_id=\"op-2\", bash_command=\":\", dag=dag)\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n task_3 = BashOperator(task_id=\"op-3\", bash_command=\":\", dag=dag)\n\n task_4 = BashOperator(task_id=\"op-4\", bash_command=\":\", dag=dag)\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e [task_1, task_2] \u003e\u003e some_other_task \u003e\u003e [task_3, task_4] \u003e\u003e end\n\n\u003cbr /\u003e\n\nTrigger children DAGs from a parent DAG\n\nYou can trigger one DAG from another DAG with the\n[`TriggerDagRunOperator` operator](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/operators/trigger_dagrun/).\n\nConsider the following example:\n[](/static/composer/docs/images/workflow-trigger-dags.png) **Figure 2.** DAGs can be triggered from within a DAG with the TriggerDagRunOperator (click to enlarge)\n\nIn this workflow, the blocks `dag_1` and `dag_2` represent a series of tasks\nthat are grouped together in a separate DAG in the Cloud Composer\nenvironment.\n\nThe implementation of this workflow requires two separate DAG files.\nThe controlling DAG file looks like the following:\n\n\u003cbr /\u003e\n\nAirflow 2\n\n\n from airflow import DAG\n from airflow.operators.dummy import DummyOperator\n from airflow.operators.trigger_dagrun import TriggerDagRunOperator\n from airflow.utils.dates import days_ago\n\n\n with DAG(\n dag_id=\"controller_dag_to_trigger_other_dags\",\n default_args={\"owner\": \"airflow\"},\n start_date=days_ago(1),\n schedule_interval=\"@once\",\n ) as dag:\n start = DummyOperator(task_id=\"start\")\n\n trigger_1 = TriggerDagRunOperator(\n task_id=\"dag_1\",\n trigger_dag_id=\"dag-to-trigger\", # Ensure this equals the dag_id of the DAG to trigger\n conf={\"message\": \"Hello World\"},\n )\n trigger_2 = TriggerDagRunOperator(\n task_id=\"dag_2\",\n 
trigger_dag_id=\"dag-to-trigger\", # Ensure this equals the dag_id of the DAG to trigger\n conf={\"message\": \"Hello World\"},\n )\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e trigger_1 \u003e\u003e some_other_task \u003e\u003e trigger_2 \u003e\u003e end\n\n\u003cbr /\u003e\n\nAirflow 1\n\n\n from airflow import DAG\n from airflow.operators.dagrun_operator import TriggerDagRunOperator\n from airflow.operators.dummy_operator import DummyOperator\n from airflow.utils.dates import days_ago\n\n\n with DAG(\n dag_id=\"controller_dag_to_trigger_other_dags\",\n default_args={\"owner\": \"airflow\"},\n start_date=days_ago(1),\n schedule_interval=\"@once\",\n ) as dag:\n start = DummyOperator(task_id=\"start\")\n\n trigger_1 = TriggerDagRunOperator(\n task_id=\"dag_1\",\n trigger_dag_id=\"dag-to-trigger\", # Ensure this equals the dag_id of the DAG to trigger\n conf={\"message\": \"Hello World\"},\n )\n trigger_2 = TriggerDagRunOperator(\n task_id=\"dag_2\",\n trigger_dag_id=\"dag-to-trigger\", # Ensure this equals the dag_id of the DAG to trigger\n conf={\"message\": \"Hello World\"},\n )\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e trigger_1 \u003e\u003e some_other_task \u003e\u003e trigger_2 \u003e\u003e end\n\n\u003cbr /\u003e\n\n| **Note:** The value for `trigger_dag_id` inside `TriggerDagRunOperator` must match the `dag_id` value of the DAG you want to trigger.\n\nThe implementation of the child DAG, which is triggered by the controlling\nDAG, looks like the following:\n\n\u003cbr /\u003e\n\nAirflow 2\n\n\n from airflow import DAG\n from airflow.operators.dummy import DummyOperator\n from airflow.utils.dates import days_ago\n\n DAG_NAME = \"dag-to-trigger\"\n\n args = {\"owner\": \"airflow\", \"start_date\": days_ago(1), \"schedule_interval\": \"None\"}\n\n with DAG(dag_id=DAG_NAME, default_args=args) as dag:\n dag_task = DummyOperator(task_id=\"dag-task\")\n\n\u003cbr /\u003e\n\nAirflow 1\n\n\n from airflow import DAG\n from airflow.operators.dummy_operator import DummyOperator\n from airflow.utils.dates import days_ago\n\n\n DAG_NAME = \"dag-to-trigger\"\n\n args = {\"owner\": \"airflow\", \"start_date\": days_ago(1), \"schedule_interval\": \"None\"}\n\n with DAG(dag_id=DAG_NAME, default_args=args) as dag:\n dag_task = DummyOperator(task_id=\"dag-task\")\n\n\u003cbr /\u003e\n\nYou must [upload both DAG files](/composer/docs/composer-1/manage-dags#add)\nin your Cloud Composer environment for the DAG to work.\n\nGrouping tasks with the TaskGroup operator This approach works only in Airflow 2.\n\nYou can use the\n[`TaskGroup` operator](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dags.html#taskgroups) to group tasks\ntogether in your DAG. Tasks defined within a `TaskGroup` block are still part\nof the main DAG.\n\nConsider the following example:\n[](/static/composer/docs/images/workflow-taskgroup-dag.png) **Figure 3.** Tasks can be visually grouped together in the UI with the TaskGroup operator (click to enlarge)\n\nThe tasks `op-1` and `op-2` are grouped together in a block with ID\n`taskgroup_1`. 
An implementation of this workflow looks like the following code: \n\n from airflow.models.dag import DAG\n from airflow.operators.bash import BashOperator\n from airflow.operators.dummy import DummyOperator\n from airflow.utils.dates import days_ago\n from airflow.utils.task_group import TaskGroup\n\n with DAG(dag_id=\"taskgroup_example\", start_date=days_ago(1)) as dag:\n start = DummyOperator(task_id=\"start\")\n\n with TaskGroup(\"taskgroup_1\", tooltip=\"task group #1\") as section_1:\n task_1 = BashOperator(task_id=\"op-1\", bash_command=\":\")\n task_2 = BashOperator(task_id=\"op-2\", bash_command=\":\")\n\n with TaskGroup(\"taskgroup_2\", tooltip=\"task group #2\") as section_2:\n task_3 = BashOperator(task_id=\"op-3\", bash_command=\":\")\n task_4 = BashOperator(task_id=\"op-4\", bash_command=\":\")\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e section_1 \u003e\u003e some_other_task \u003e\u003e section_2 \u003e\u003e end\n\nWhat's next\n\n- [Write DAGs](/composer/docs/composer-1/write-dags)\n- [Test DAGs](/composer/docs/composer-1/test-dags)"]]