Tasks in workflows
When building workflows, you can use the built-in system tasks and operators provided by Conductor, or write your own custom worker tasks.
Built-in tasks
Here is an introduction to the built-in tasks available in Conductor. These tasks enable you to build workflows quickly without writing code.
For control flow
The control structures and operations in your Conductor workflow are implemented as tasks. Here are the tasks available for managing the flow of execution:
- Conditional flow
- Switch—Execute tasks conditionally, like an if…else… statement.
- Looping flow
- Do While—Execute tasks repeatedly, like a do…while… statement.
- Parallel flows
- Fork—Execute a static number of tasks in parallel.
- Dynamic Fork—Execute a dynamic number of tasks in parallel.
- Join—Join the forks after a Fork or Dynamic Fork before proceeding to the next task.
- Start Workflow—Asynchronously start another workflow.
- Jumps or state changes in flow
- Terminate—Terminate the current workflow, like a return statement.
- Sub Workflow—Synchronously start another workflow, like a subroutine.
- Terminate Workflow—Terminate another ongoing workflow.
- Update Task—Update the status of another ongoing task.
- State querying
- Get Workflow—Get the execution details of another ongoing workflow.
- Waits in flow
- Wait—Pause the current workflow until a set time, duration, or signal is received.
- Wait for Webhook—Pause the current workflow for an incoming webhook signal.
- Human—Pause the current workflow for user input before proceeding to the next task.
- Dynamic tasks in flow
- Dynamic—Execute a task dynamically, like a function pointer.
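For example, a Switch task routes execution based on an input value. The task names, reference names, and input values below are hypothetical; only the SWITCH structure itself is fixed:

```json
{
  "name": "switch",
  "taskReferenceName": "shipping_switch_ref",
  "type": "SWITCH",
  "evaluatorType": "value-param",
  "expression": "switchCaseValue",
  "inputParameters": {
    "switchCaseValue": "${workflow.input.shippingService}"
  },
  "decisionCases": {
    "FEDEX": [
      {
        "name": "ship_via_fedex",
        "taskReferenceName": "ship_via_fedex_ref",
        "type": "SIMPLE",
        "inputParameters": {}
      }
    ]
  },
  "defaultCase": []
}
```

Each key in decisionCases maps a possible value of the expression to a list of tasks to run, with defaultCase acting as the else branch.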
For assigning variables
In general, variables are scoped to each task and passed along the workflow as necessary. However, you can also manage variables or secrets at a global environment or workflow level.
- Set Variable—Create or update workflow variables.
- Update Secret—Create or update secrets in your Conductor cluster.
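As a sketch, a Set Variable task can promote a task-scoped value to a workflow variable (the variable name here is hypothetical):

```json
{
  "name": "set_variable",
  "taskReferenceName": "set_variable_ref",
  "type": "SET_VARIABLE",
  "inputParameters": {
    "customerId": "${workflow.input.customerId}"
  }
}
```

After this task runs, any subsequent task in the workflow can reference the value as ${workflow.variables.customerId}.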
For execution logic
In most common cases, you can make use of existing Conductor features instead of creating a custom worker from scratch. These include tasks for data transformation, user journeys, and LLM chaining.
| Use Case | Task to Use |
|---|---|
| Call an API or HTTP endpoint | HTTP |
| Poll an API or HTTP endpoint | HTTP Poll |
| Publish or consume events | Event |
| Clean or transform JSON data | JSON JQ |
| Modify SQL databases | JDBC |
| Execute JavaScript scripts | Inline |
| Evaluate and retrieve data in spreadsheets | Business Rule |
| Get authorized using a signed JWT | Get Signed JWT |
| Orchestrate human input in the loop | Human |
| Query data from Conductor Search API or Metrics | Query Processor |
| Send alerts to Opsgenie | Opsgenie |
| Retrieve text or JSON content from a URL | Get Document |
| Generate text embeddings | Generate Embeddings |
| Store text embeddings in a vector database | Store Embeddings |
| Generate and store text embeddings in a vector database | Index Text |
| Chunk, generate, and store text embeddings in a vector database | Index Document |
| Retrieve data from a vector database | Get Embeddings |
| Retrieve data from a vector database based on a search query | Search Index |
| Generate text from an LLM based on a defined prompt | Text Complete |
| Generate text from an LLM based on a user query and additional system/assistant instructions | Chat Complete |
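For instance, calling an external endpoint requires only an HTTP task configuration rather than a custom worker. The URI below is a placeholder for your own endpoint:

```json
{
  "name": "http",
  "taskReferenceName": "get_fx_rate_ref",
  "type": "HTTP",
  "inputParameters": {
    "http_request": {
      "uri": "https://example.com/api/fx-rates?pair=USDEUR",
      "method": "GET"
    }
  }
}
```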
Task definition
A task definition specifies a task’s general implementation details, like the expected input and output keys, and failure-handling configurations, like rate limits, retries, and timeouts. This definition applies to all instances of the task across workflows.
For all custom worker tasks, a task definition must be added to the Conductor server before the task can be used in a workflow. While system tasks do not require a task definition, one can be created so that the task's failure-handling settings can be configured.
All task definitions are stored as JSON. These parameters can be updated in real time without needing to redeploy your application.
Example
Here is an example task definition JSON:
{
"createTime": 1721901586970,
"updateTime": 1725926875230,
"createdBy": "user@acme.com",
"updatedBy": "user@acme.com",
"name": "calculate-fx",
"description": "Calculates currency exchange",
"retryCount": 0,
"timeoutSeconds": 3600,
"inputKeys": [],
"outputKeys": [],
"timeoutPolicy": "TIME_OUT_WF",
"retryLogic": "EXPONENTIAL_BACKOFF",
"retryDelaySeconds": 30,
"responseTimeoutSeconds": 600,
"concurrentExecLimit": 20,
"inputTemplate": {},
"rateLimitPerFrequency": 10,
"rateLimitFrequencyInSeconds": 1,
"ownerEmail": "user@acme.com",
"pollTimeoutSeconds": 3600,
"backoffScaleFactor": 1,
"enforceSchema": false
}
Task configuration
The task configuration is part of the workflow definition. It specifies the workflow-specific implementation details, like the task reference name, task type, task input parameters, and so on.
Although each task type has its unique configuration, all tasks share several parameters in common. Refer to the Task Reference to learn more about the task configuration for each task type.
Common configuration parameters
| Parameter | Description | Required/Optional |
|---|---|---|
| name | Name of the task. The default value is the same as the task type. The name can be changed to something descriptive. To use a given task definition, the task name here must match the task definition name (case-sensitive). Note: It is recommended to use alphanumeric characters for task names. While special characters are allowed for backward compatibility, they are not fully supported and may cause unexpected behavior. | Required. |
| taskReferenceName | Reference name for the task. Must be a unique value in a given workflow. | Required. |
| type | The task type. For example, HTTP or SIMPLE. | Required. |
| inputParameters | Map of the task's input parameters. | Depends on the task type. |
| optional | Whether the task is optional. If set to true, the workflow continues to the next task even if this task fails or remains incomplete. | Optional. |
| asyncComplete | Whether the task is completed asynchronously. The default value is false. If set to true, the task remains in progress after the worker responds and must be marked as completed by an external event or API call. | Optional. |
| startDelay | The time in seconds to wait before making the task available for worker polling. The default value is 0. | Optional. |
| onStateChange | Configuration for publishing an event or starting another task when this task status changes. | Optional. |
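Putting the common parameters together, a minimal task configuration might look like the following sketch (the task and reference names are hypothetical):

```json
{
  "name": "calculate-fx",
  "taskReferenceName": "calculate_fx_ref",
  "type": "SIMPLE",
  "inputParameters": {
    "amount": "${workflow.input.amount}"
  },
  "optional": false,
  "asyncComplete": false,
  "startDelay": 0
}
```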
Passing data between tasks
Data can be passed from one task to another by using dynamic references in a task's inputParameters. These dynamic references are formatted as JSONPath expressions. Refer to Wiring Parameters to learn more.
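As a sketch, a task can pull values from both the workflow input and the output of a preceding task (the reference names and output key here are hypothetical):

```json
{
  "name": "calculate-fx",
  "taskReferenceName": "calculate_fx_ref",
  "type": "SIMPLE",
  "inputParameters": {
    "amount": "${workflow.input.amount}",
    "rate": "${get_fx_rate_ref.output.rate}"
  }
}
```

At runtime, ${get_fx_rate_ref.output.rate} resolves to the rate key in the output of the task whose reference name is get_fx_rate_ref.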
Task reuse
Since task workers typically perform a unit of work as part of a larger workflow, Conductor’s infrastructure is built to enable task reusability out of the box. Once a task is defined in Conductor, it can be reused numerous times:
- In the same workflow, using different task reference names.
- Across various workflows.
When reusing tasks, it's important to consider the situations that a multi-tenant system faces. By default, all the work assigned to a worker goes into the same task queue, which could result in your worker not being polled promptly if there is a noisy neighbor in the ecosystem. You can address this by scaling up the number of workers to handle the task load, or by using task-to-domain to route the task load into separate queues.
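As an illustration of task-to-domain routing, a start-workflow request can map a task name to a domain so that only workers polling with that domain pick it up (the workflow name, input, and domain value below are hypothetical):

```json
{
  "name": "order_workflow",
  "version": 1,
  "input": {
    "orderId": "12345"
  },
  "taskToDomain": {
    "calculate-fx": "tenant-a"
  }
}
```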