Application Workflow Jobs

The following topics describe job attributes that work with application workflow platforms and services:

Airflow Job

Airflow enables you to monitor and manage DAG workflows in Control-M. You can monitor DAG executions in the Airflow tab in the Monitoring domain. You can also view the specific details of each task, open the DAG in the Airflow web server user interface, and view XCom variables from the Airflow tab.

To create an Airflow job, see Creating a Job. For more information about this plug-in, see Control-M for Airflow.

The following table describes the Airflow job type attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to Airflow, as described in Airflow Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

DAG ID

Defines the unique identifier of a DAG.

  • Airflow returns 100 records by default. To increase this limit, you must change the maximum_page_limit parameter in the airflow.cfg file.

  • The Airflow plug-in requests 300 records to be returned by default.

    To increase this number, you must add or modify the NumberOfRecordsPerRequest parameter in the <Agent_Home>/cm/AFL/data/cm_container_conf.xml file. For example: <NumberOfRecordsPerRequest>400</NumberOfRecordsPerRequest>

  • You can filter the returned list of DAG IDs by typing one of the following:

    • A partial string such as abcd, which returns all DAG IDs that contain this string.

    • A partial string followed by an asterisk, such as abcd*, which returns all DAG IDs that start with this string.
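The two matching rules above can be sketched as follows (an illustrative Python sketch of the filtering semantics; filter_dag_ids is a hypothetical helper, not part of the plug-in):

```python
def filter_dag_ids(dag_ids, pattern):
    # Hypothetical helper illustrating the two matching rules above.
    if pattern.endswith("*"):
        return [d for d in dag_ids if d.startswith(pattern[:-1])]  # prefix match
    return [d for d in dag_ids if pattern in d]  # substring match

dags = ["abcd_daily", "x_abcd", "abce_hourly"]
print(filter_dag_ids(dags, "abcd"))   # ['abcd_daily', 'x_abcd']
print(filter_dag_ids(dags, "abcd*"))  # ['abcd_daily']
```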

Configuration JSON

(Optional) Defines the JSON object, which describes additional configuration parameters.

Output Details

Determines whether to include Airflow DAG task logs in the Control-M job output, as follows:

  • No task logs.

  • Include failing task logs.

  • Include all task logs.

You can also view all task logs from the Airflow tab in an Airflow job in the Monitoring domain. In addition, you can view the log of each Airflow task execution represented by the Try Number field.

If a task executed three times during the DAG run, the Try Number field shows three options.

Apache Airflow Job

Apache Airflow enables you to create, schedule, and monitor complex data processing and analytics pipelines. It provides an environment to define, manage, and execute workflows as Directed Acyclic Graphs (DAGs) to control task dependencies and execution order.

To create an Apache Airflow job, see Creating a Job. For more information about this plug-in, see Control-M for Apache Airflow.

The following table describes the Apache Airflow job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to Apache Airflow, as described in Apache Airflow Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces

Action

Determines whether to run a new DAG or rerun a DAG.

Valid Values:

  • Run DAG

  • Rerun DAG

DAG Name

Defines the logical name of the DAG.

DAG Run ID

(Optional) Defines the specific DAG run (execution) ID in Airflow to track and manage individual workflow executions.

If you do not provide a DAG Run ID, the system generates a random Run ID.

Parameters

Defines the parameters for the Apache Airflow job, in JSON format, which enables you to control how the job executes.

Use backslashes to escape quotes.

{\"parameter1\":\"value1\"}

If you are not adding parameters, type {}.
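One way to produce the escaped form shown above is to serialize the parameters with any JSON library and then escape the quotes (a minimal sketch; the plug-in only requires the final escaped string):

```python
import json

params = {"parameter1": "value1"}
# Serialize compactly, then escape each quote with a backslash.
escaped = json.dumps(params, separators=(",", ":")).replace('"', '\\"')
print(escaped)  # {\"parameter1\":\"value1\"}
```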

Only Failed Tasks

Determines whether to rerun a DAG only with failed tasks or all tasks.

Valid Values:

  • true

  • false

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 60

Failure Tolerance

Determines the number of times the job tries to run before ending Not OK.

Default: 2
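The two attributes above can be read together as a poll-and-retry loop. The following sketch assumes Failure Tolerance counts total run attempts; run_job and poll_status are hypothetical stand-ins for the plug-in's internal calls:

```python
import time

def run_with_tolerance(run_job, poll_status, polling_frequency=60, failure_tolerance=2):
    # Hypothetical sketch: failure_tolerance is read here as the total
    # number of run attempts before the job ends Not OK.
    for _ in range(failure_tolerance):
        run_job()
        status = poll_status()
        while status not in ("SUCCESS", "FAILED"):
            time.sleep(polling_frequency)  # Status Polling Frequency
            status = poll_status()
        if status == "SUCCESS":
            return "OK"
    return "Not OK"

# Demo with stubbed calls and no real waiting.
outcomes = iter(["FAILED", "SUCCESS"])
print(run_with_tolerance(lambda: None, lambda: next(outcomes),
                         polling_frequency=0, failure_tolerance=2))  # prints "OK"
```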

Apache NiFi Job

Apache NiFi is an open-source tool that automates data flow across systems in real time.

To create an Apache NiFi job, see Creating a Job. For more information about this plug-in, see Control-M for Apache NiFi.

The following table describes the Apache NiFi job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to Apache NiFi, as described in Apache NiFi Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

Processor Group ID

Defines the ID number of a specific processor group.

Processor ID

Defines the ID number of a specific processor.

Action

Determines one of the following actions to perform on Apache NiFi:

  • Run Processor

  • Stop Processor

  • Update Processor

  • Run Processor Once

Disconnected Node Ack

Determines whether to acknowledge that the node is disconnected, which allows mutable requests to proceed.

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 5

Failure Tolerance

Determines the number of times the job tries to run before ending Not OK.

Default: 0

Astronomer Job

Astronomer is a workload automation service based on Apache Airflow that enables you to create, schedule, and manage your workflows.

To create an Astronomer job, see Creating a Job. For more information about this plug-in, see Control-M for Astronomer.

The following table describes the Astronomer job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to Astronomer, as described in Astronomer Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

Action

Determines whether to run a new DAG or rerun a DAG.

Valid Values:

  • Run DAG

  • Rerun DAG

DAG Name

Defines the logical name of the Directed Acyclic Graph (DAG), as defined in the Airflow interface.

DAG Run ID

(Optional) Defines the specific DAG run (execution) ID in Airflow.

Parameters

Defines the JSON-based body parameters to pass when the DAG executes, in the following format:

"Variable1":"Value1","Variable2":"Value2"

Only Failed Tasks

Determines whether to rerun a DAG only with failed tasks or all tasks.

Valid Values:

  • true

  • false

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 60

Failure Tolerance

Determines the number of times the job tries to run before ending Not OK.

Default: 3

AWS MWAA Job

AWS Managed Workflows for Apache Airflow (MWAA) is an orchestration service built on Apache Airflow, designed to create, schedule, and monitor data pipelines and workflows.

To create an AWS MWAA job, see Creating a Job. For more information about this plug-in, see Control-M for AWS MWAA.

The following table describes the AWS MWAA job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to AWS MWAA, as described in AWS MWAA Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces

Action

Determines whether to run a new DAG or rerun a DAG.

Valid Values:

  • Run DAG

  • Rerun DAG

MWAA Environment Name

Defines the logical name of the MWAA environment.

DAG Name

Defines the logical name of the Directed Acyclic Graph (DAG).

DAG Run ID

(Optional) Defines the unique identifier for a specific DAG run in an orchestration system.

The ID helps you track and manage individual workflow executions.

If you do not provide a DAG Run ID, the system generates a random Run ID.

Parameters

Defines the parameters for the AWS MWAA job, in JSON format, which enables you to control how the job executes.

Use backslashes to escape quotes.

{\"parameter1\":\"value1\"}

If you are not adding parameters, type {}.

Only Failed Tasks

Determines whether to rerun a DAG only with failed tasks or all tasks.

Valid Values:

  • true

  • false

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 60

Failure Tolerance

Determines the number of times the job tries to run before ending Not OK.

Default: 3

AWS Step Functions Job

AWS Step Functions enables you to create visual workflows that can integrate other AWS services.

To create an AWS Step Functions job, see Creating a Job. For more information about this plug-in, see Control-M for AWS Step Functions.

The following table describes the AWS Step Functions job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to AWS Step Functions, as described in AWS Step Functions Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

Execution Name

Defines the name of the Step Function execution.
An execution is a single run of a state machine, which represents a workflow.

State Machine ARN

Determines the Step Function state machine to use.

A state machine is a workflow, and an Amazon Resource Name (ARN) is a standardized AWS resource address.

arn:aws:states:us-east-1:155535555553:stateMachine:MyStateMachine
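An ARN is colon-delimited, so the example above breaks down into standard components (a quick illustrative sketch):

```python
arn = "arn:aws:states:us-east-1:155535555553:stateMachine:MyStateMachine"
# Components: arn : partition : service : region : account-id : resource-type : resource-name
prefix, partition, service, region, account, rtype, name = arn.split(":")
print(service, region, name)  # states us-east-1 MyStateMachine
```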

Parameters

Defines the parameters for the Step Function job, in JSON format, which enables you to control how the job executes.

Use backslashes to escape quotes.

{\"parameter1\":\"value1\"}

If you are not adding parameters, type {}.

Show Execution Logs

Determines whether to append the log to the output (a tab in the job properties pane of the Monitoring domain where the job output appears).

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 20

Failure Tolerance

Determines the number of times to check the job status before ending Not OK.

Default: 2

Azure Logic Apps Job

Azure Logic Apps enables you to design and automate cloud-based workflows and integrations.

To create an Azure Logic Apps job, see Creating a Job. For more information about this plug-in, see Control-M for Azure Logic Apps.

The following table describes the Azure Logic Apps job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to Azure Logic Apps, as described in Azure Logic Apps Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

Workflow

Determines which Consumption logic app workflow executes, from your predefined set of workflows.

This job does not execute Standard logic app workflows.

Parameters

Defines parameters, in JSON format, that enable you to control the presentation of data.

{
  "param1":"value1",
  "param2":"value2"
}

Rules:

  • Characters: 2–4,000

  • For no parameters, type {}.
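The rules above can be checked with a small validation sketch (validate_logic_app_params is a hypothetical helper, not part of the product):

```python
import json

def validate_logic_app_params(text):
    # Rules from the table: 2-4,000 characters, and valid JSON ({} for none).
    if not 2 <= len(text) <= 4000:
        raise ValueError("parameters must be 2-4,000 characters")
    json.loads(text)  # must parse as JSON
    return True

print(validate_logic_app_params("{}"))                   # True (no parameters)
print(validate_logic_app_params('{"param1":"value1"}'))  # True
```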

Get Logs

Determines whether to display the job output when the job ends.

Failure Tolerance

Determines the number of times to check the job status before ending Not OK.

Default: 2

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 20

GCP Composer Job

Google Cloud (GCP) Composer is a managed workflow orchestration service built on Apache Airflow that enables you to automate workflow tasks.

To create a GCP Composer job, see Creating a Job. For more information about this plug-in, see Control-M for GCP Composer.

The following table describes GCP Composer job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to GCP Composer, as described in Application Workflow Connection Profiles.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

Action

Determines whether to run a new DAG or rerun an existing DAG.

Valid Values:

  • Run DAG

  • Rerun DAG

DAG Name

Defines the DAG logical name as defined in the GCP interface.

DAG Run ID

(Optional) Defines the specific DAG run (execution) ID in GCP Composer.

Parameters JSON Input

Defines the JSON-based body parameters to pass when the DAG executes, in the following format:

"Variable1":"Value1","Variable2":"Value2"

Only Failed Tasks

Determines whether to rerun a DAG only with failed tasks or all tasks.

Valid Values:

  • true

  • false

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 60

Failure Tolerance

Determines the number of times to check the job status before ending Not OK.

Default: 2

GCP Workflows Job

GCP Workflows enables you to design and automate cloud-based workflows and integrations.

To create a GCP Workflows job, see Creating a Job. For more information about this plug-in, see Control-M for GCP Workflows.

The following table describes GCP Workflows job attributes.

Attribute

Description

Connection Profile

Determines the authorization credentials that are used to connect Control-M to GCP Workflows, as described in GCP Workflows Connection Profile Parameters.

Rules:

  • Characters: 1−30

  • Case Sensitive: Yes

  • Invalid Characters: Blank spaces.

Project ID

Defines the GCP project ID where the batch job executes.

A project is a set of configuration settings that define the resources your GCP Workflows jobs use and how they interact with GCP.

Location

Defines the region where the GCP Workflow job executes.

us-central1

Workflow Name

Determines the predefined GCP Workflow that executes.

Parameters

Defines the JSON-based body parameters that are passed to the function, in the following format:

{\"var1\":\"value1\",\"var2\":\"value2\"}

Execution Label

Defines a job execution label, which enables you to group similar executions in the GCP Workflows log.

{"labelName": "name"}

Show Workflow Results

Determines whether the GCP Workflow results appear in the job output.

Status Polling Frequency

Determines the number of seconds to wait before checking the job status.

Default: 20

Failure Tolerance

Determines the number of times to check the job status before ending Not OK.

Default: 3