Job Types
The following table describes all the job types and links to topics that detail their job definition attributes.
| Job Type | Description |
|---|---|
| Airbyte Job | Airbyte jobs are based on an open-source extract, transform, and load (ETL) service that enables you to build data pipelines and load data to a data warehouse, data lake, database, or an analytics tool of your choice. |
| Airflow Job | Airflow jobs enable you to monitor and manage DAG workflows within Control-M. |
| Apache Airflow Job | Apache Airflow jobs enable you to monitor and manage DAG workflows in Control-M. |
| Alteryx Trifacta Job | Alteryx Trifacta jobs enable you to discover, organize, edit, and publish data in different formats and to multiple clouds, including AWS, Azure, Google, Snowflake, and Databricks. |
| Apache NiFi Job | Apache NiFi is an open-source tool that automates data flow across systems in real time. |
| Astronomer Job | Astronomer is a managed workflow orchestration service based on Apache Airflow that simplifies the process of creating, scheduling, and managing complex workflows. |
| Automation Anywhere Job | Automation Anywhere jobs use robotic process automation (RPA) software, which enables you to create bots that observe human actions in the digital workplace, identify repetitive rules-based tasks, and automate these tasks. |
| AWS App Runner Job | AWS App Runner jobs enable you to deploy containerized web applications and APIs directly from source code or container images, without the need to manage infrastructure. |
| AWS Job | AWS jobs enable you to define and run AWS Lambda, AWS Step Functions, and AWS Batch services. |
| AWS Athena Job | AWS Athena jobs enable you to process, analyze, and store your data in the cloud. |
| AWS Backup Job | AWS Backup jobs enable you to back up and restore your data in the Amazon cloud. |
| AWS Batch Job | AWS Batch jobs enable you to manage and run batch computing workloads. |
| AWS CloudFormation Job | AWS CloudFormation jobs enable you to create, configure, test, and manage all of your AWS services and resources. |
| AWS Data Pipeline Job | AWS Data Pipeline jobs enable you to automate the transfer, processing, and storage of your data. |
| AWS DataSync Job | AWS DataSync jobs enable you to move large amounts of data between on-premises storage and AWS storage services, as well as between AWS storage services. |
| AWS DynamoDB Job | AWS DynamoDB is a NoSQL database service that enables you to create database tables, execute statements and transactions, and export and import data to and from the Amazon S3 storage service. |
| AWS EC2 Job | AWS EC2 jobs enable you to create virtual machines in the Amazon cloud-computing platform. |
| AWS ECS Job | AWS ECS jobs enable you to run, stop, manage, and monitor containerized applications in a cluster. |
| AWS EMR Job | AWS EMR jobs enable you to execute big data frameworks, such as Apache Hadoop and Apache Spark, to process and analyze vast amounts of data. |
| AWS Glue Job | AWS Glue jobs enable you to define data-driven workflows that automate the movement and transformation of data. |
| AWS Glue DataBrew Job | AWS Glue DataBrew jobs enable you to visualize your data and publish it to the Amazon S3 Data Lake. |
| AWS Lambda Job | AWS Lambda jobs enable you to execute code on a virtual cluster. |
| AWS Mainframe Modernization Job | AWS Mainframe Modernization jobs enable you to migrate, manage, and run mainframe applications in the AWS cloud. |
| AWS MWAA Job | AWS MWAA jobs enable you to create, schedule, and monitor data pipelines and workflows. |
| AWS QuickSight Job | AWS QuickSight jobs enable you to visualize, analyze, and share large workloads of data. |
| AWS Redshift Job | AWS Redshift jobs enable you to execute queries that run in Redshift, copy data from an S3 bucket, and unload data files into an S3 bucket. |
| AWS SageMaker Job | AWS SageMaker jobs enable you to create, train, and deploy machine learning models on premises, in the cloud, and on edge devices. |
| AWS SNS Job | AWS Simple Notification Service (SNS) is a cloud-based notification service that follows a Pub/Sub model and enables you to send notifications across various platforms and devices. |
| AWS Step Functions Job | AWS Step Functions jobs enable you to create visual workflows that can integrate other AWS services. |
| AWS SQS Job | AWS Simple Queue Service (SQS) is a message queuing service that enables you to exchange messages between components without losing messages. |
| Azure Batch Accounts Job | Azure Batch Accounts jobs enable you to efficiently execute large-scale, parallel, compute-intensive tasks in the cloud. |
| Azure Backup Job | Azure Backup jobs enable you to back up your data to, and restore it from, the Microsoft Azure cloud. |
| Azure Container Instances Job | Azure Container Instances enables you to run an isolated container in Azure, without having to manage any virtual machines and without having to adopt a higher-level service. |
| Azure Data Factory Job | Azure Data Factory jobs define data-driven workflows that automate the movement and transformation of data. |
| Azure Databricks Job | Azure Databricks jobs enable you to process and analyze large workloads of data. |
| Azure DevOps Job | Azure DevOps jobs enable you to build and deploy pipelines using the DevOps lifecycle. |
| Azure Functions Job | Azure Functions jobs enable you to develop, test, and run applications in the cloud. |
| Azure HDInsight Job | Azure HDInsight jobs enable you to run an Apache Spark batch job for big data analytics. |
| Azure Logic Apps Job | Azure Logic Apps jobs enable you to design and automate cloud-based workflows and integrations. |
| Azure Machine Learning Job | Azure Machine Learning jobs enable you to build, train, deploy, and manage machine learning models on premises, in the cloud, and on edge devices. |
| Azure Resource Manager Job | Azure Resource Manager jobs enable you to create, configure, test, and manage your Azure resource infrastructure. |
| Azure Synapse Job | Azure Synapse jobs run Azure Synapse Analytics pipelines for data integration and big data analytics. |
| Azure VM Job | Azure Virtual Machine (VM) jobs enable you to create, manage, and delete virtual machines in the Azure cloud. |
| Boomi AtomSphere Job | Boomi AtomSphere jobs enable you to develop, test, and run applications in the cloud. |
| Communication Suite Job | Communication Suite jobs enable you to automate business messaging and communication over Microsoft Teams, Slack, Telegram, and WhatsApp. |
| Databases Job | Databases jobs define and monitor Stored Procedure, SQL Script, SQL Server Integration Services (SSIS) Package, and Embedded Query database jobs. |
| Databricks Job | Databricks jobs enable you to integrate jobs created in the Databricks environment with Control-M workflows. |
| DBT Job | DBT jobs enable you to develop, test, schedule, document, and analyze data models. |
| Dummy Job | Dummy jobs are used as placeholders for managing and synchronizing job flows without any execution. |
| File Transfer Job | File Transfer jobs enable you to watch and transfer files between hosts, or between cloud storage buckets and containers. |
| File Watcher Job | File Watcher jobs enable you to monitor file changes, such as creation or deletion. |
| GCP Batch Job | GCP Batch jobs enable you to manage, schedule, and run batch computing workloads in the cloud. |
| GCP BigQuery Job | GCP BigQuery jobs enable you to process, analyze, and store your data in the cloud. |
| GCP Cloud Run Job | GCP Cloud Run is a container management service that enables you to execute, stop, manage, and monitor containerized applications in a cluster. |
| GCP Composer Job | GCP Composer is a managed workflow orchestration service built on Apache Airflow that enables you to automate workflow tasks. |
| GCP Data Fusion Job | GCP Data Fusion jobs enable you to load data from multiple sources, visualize it, and publish it to the cloud. |
| GCP Dataflow Job | GCP Dataflow jobs perform cloud-based data processing for batch and real-time data streaming applications. |
| GCP Dataplex Job | GCP Dataplex jobs enable you to load, visualize, and manage data in GCP BigQuery and the cloud. |
| GCP Dataprep Job | GCP Dataprep jobs enable you to visualize, format, and prepare your data for analysis. |
| GCP Dataproc Job | GCP Dataproc jobs perform cloud-based big-data processing and machine learning. |
| GCP Deployment Manager Job | GCP Deployment Manager jobs enable you to create, configure, test, and manage your GCP resource infrastructure. |
| GCP Functions Job | GCP Cloud Functions enables you to develop, test, and run applications in the cloud. |
| GCP VM Job | GCP Virtual Machine (VM) jobs enable you to create, manage, and delete virtual machines on the Google Compute Engine (GCE). |
| GCP Workflows Job | GCP Workflows enables you to design and automate cloud-based workflows and integrations. |
| Hadoop Job | Hadoop jobs enable the distributed processing of large data sets across clusters of commodity servers. |
| Informatica Job | Informatica jobs enable you to automate Informatica tasks or workflows based on the parameters you define. |
| Informatica CS Job | Informatica Cloud Services (CS) jobs enable you to integrate and synchronize data, applications, and processes that reside on premises or in the cloud. |
| Jenkins Job | Jenkins jobs enable you to automate building, testing, and deploying code for repetitive tasks in the software deployment process. |
| Kubernetes Job | Kubernetes jobs enable you to run pods to completion in a Kubernetes-based cluster. |
| Micro Focus Job | Micro Focus jobs enable you to run Job Control Language (JCL) job stream files in mainframe environments hosted on a Windows or UNIX operating system. |
| Microsoft Power BI Job | Microsoft Power BI jobs enable you to run Power BI workflows for data visualization. |
| Microsoft Power BI SP Job | Microsoft Power BI SP jobs enable you to run Power BI workflows for data visualization with service principal authentication. |
| OCI Data Integration Job | OCI Data Integration jobs enable data extraction, transformation, and loading (ETL) processes across various sources and targets within the Oracle Cloud. |
| OCI Data Flow Job | Oracle Cloud Infrastructure (OCI) Data Flow is a fully managed Apache Spark service that performs processing tasks on extremely large datasets. |
| OCI Data Science Job | OCI Data Science is an Oracle Cloud Infrastructure (OCI) platform that enables you to build, train, deploy, and manage machine learning (ML) models using Python and open-source tools. |
| OCI Functions Job | OCI Functions jobs enable you to develop, test, and run functions in the cloud. |
| OCI VM Job | Oracle Cloud Infrastructure Virtual Machine (OCI VM) jobs enable you to create, manage, and delete virtual machines in the Oracle cloud. |
| OS Job | OS jobs are used to execute a task on a specific distributed system. |
| OS/400 Full Job | OS/400 Full jobs enable you to define any type of IBM i (AS/400) job: Program, Command, Multiple Commands, Subsystem, External Job, External Subsystem, Virtual Terminal, or Restricted State. |
| OS/400 Program Job | OS/400 Program jobs are a subset of the OS/400 Full job that enable you to define and execute an IBM i (AS/400) native program in a library, an S/38 program, or a QShell program. |
| OS/400 Multiple Commands Job | OS/400 Multiple Commands jobs are a subset of the OS/400 Full job that enable you to define and execute jobs with multiple commands. |
| OS/400 VT Job | OS/400 VT jobs are a subset of the OS/400 Full job that enable you to define and execute Virtual Terminal (VT) jobs. |
| OS/400 External Job | OS/400 External jobs are a subset of the OS/400 Full job that enable you to define and execute external jobs. |
| PagerDuty Job | PagerDuty jobs enable you to perform incident management and response in automated workflows and job scheduling. |
| Qlik Cloud Job | Qlik Cloud is an extract, transform, and load (ETL) service that enables you to visualize your data. |
| RabbitMQ Job | RabbitMQ jobs enable message delivery and routing. |
| SAP Business Warehouse Job | SAP Business Warehouse jobs enable you to run predefined SAP Process Chains or SAP InfoPackages and monitor their completion status. |
| SAP Data Archiving Job | SAP Data Archiving jobs enable you to archive data in your SAP environment. |
| SAP R/3 Job | SAP R/3 jobs enable you to copy an existing SAP job or create a new SAP job. |
| SLA Management Job | SLA Management jobs map out the critical path of a job workflow that needs to meet an SLA. |
| Snowflake Job | Snowflake jobs integrate with the Snowflake cloud computing platform for data storage, processing, and analysis. |
| Tableau Job | Tableau jobs enable you to visualize, analyze, and share large workloads of data. |
| Talend Data Management Job | Talend Data Management is an automation service that enables you to integrate applications and to extract, transform, load, and check the quality of large amounts of data. |
| Talend OAuth Job | Talend OAuth (Open Authorization) enables you to use OAuth authentication within the Talend suite of data integration and management tools. It allows third-party applications to access resources on behalf of a user without sharing sensitive credentials. |
| Terraform Job | Terraform is an open-source Infrastructure as Code (IaC) tool that enables you to create, configure, test, and manage your infrastructure on multiple platforms in a declarative way. |
| UiPath Job | UiPath jobs use robotic process automation (RPA) software, which enables you to create bots that observe human actions in the digital workplace, identify repetitive rules-based tasks, and automate these tasks. |
| VMware by Broadcom Job | VMware by Broadcom jobs enable you to run multiple virtual machines on a single physical server. |
| Web Services SOAP Job | Web Services SOAP jobs enable you to design and execute single SOAP API calls. |
| Web Services REST Job | Web Services REST jobs enable you to design and execute single REST API calls. |
| z/OS Job | z/OS jobs are used to execute a task on a z/OS system. |
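
All of these job types are deployed to Control-M as job definitions. As a minimal sketch of what such a definition can look like, the following example uses the Control-M Automation API JSON format to define a folder that contains an OS-type command job and a Dummy job. The folder name, job names, host, and run-as user shown here are placeholder values, and the exact attributes available depend on the job type and your Control-M version.

```json
{
  "SampleFolder": {
    "Type": "Folder",
    "ControlmServer": "ctm-server",
    "SampleCommandJob": {
      "Type": "Job:Command",
      "Command": "echo nightly load finished",
      "RunAs": "ctmuser",
      "Host": "agent-host"
    },
    "SampleDummyJob": {
      "Type": "Job:Dummy"
    }
  }
}
```

Cloud and application job types generally follow the same pattern with their own job type names and plug-in-specific attributes, and they typically reference a connection profile that stores the authentication details. Refer to each job type's topic for the exact list of job definition attributes.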