AWS Bedrock Titan Integration with Orkes Conductor
To use system AI tasks in Orkes Conductor, you must integrate your Conductor cluster with the necessary AI providers. This guide explains how to integrate AWS Bedrock Titan with Orkes Conductor. Here’s an overview:
- Get the required credentials from AWS Bedrock Titan.
- Configure a new AWS Bedrock Titan integration in Orkes Conductor.
- Add models to the integration.
- Set access limits to the AI model to govern which applications or groups can use it.
Step 1: Get the AWS Bedrock Titan credentials
To integrate AWS Bedrock Titan with Orkes Conductor, retrieve one of the following credential sets from your AWS account, depending on how you choose to connect to Conductor (a short verification sketch follows this list):
- AWS account ID and region - Use this option when assuming a role from the same AWS account.
- Role ARN (Amazon Resource Name) and External ID - Use this option when assuming a role from a different AWS account.
- Access key and secret from the AWS account - Use this option when authenticating directly with AWS access keys.
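Before adding the integration, you can confirm that the credentials resolve to the expected account and that Bedrock is reachable in your chosen region. The following is a minimal sketch using boto3; the profile name and region are placeholders, and it assumes the credentials are allowed to call sts:GetCallerIdentity and bedrock:ListFoundationModels.

```python
import boto3

# Placeholders: replace with the profile/keys and region you plan to use in Conductor.
session = boto3.Session(profile_name="my-bedrock-profile", region_name="us-east-1")

# Confirm which AWS account and principal the credentials resolve to.
identity = session.client("sts").get_caller_identity()
print("Account:", identity["Account"], "ARN:", identity["Arn"])

# Confirm Bedrock is reachable in the region and list the Titan models available to you.
bedrock = session.client("bedrock")
models = bedrock.list_foundation_models(byProvider="Amazon")
for summary in models["modelSummaries"]:
    if "titan" in summary["modelId"]:
        print(summary["modelId"])
```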
Step 2: Add an integration for AWS Bedrock Titan
After obtaining the credentials, add an AWS Bedrock Titan integration to your Conductor cluster.
To create an AWS Bedrock Titan integration:
- Go to Integrations from the left navigation menu on your Conductor cluster.
- Select + New integration.
- In the AI/LLM section, choose AWS Bedrock Titan.
- Select + Add and enter the following parameters:
| Parameters | Description | Required/Optional |
|---|---|---|
| Integration name | A name for the integration. | Required. |
| Connection type | The method used to establish the connection. Supported values: Current Conductor Role, Assume External Role, Access Key/Secret. | Required. |
| Region | The valid AWS region where the resource is located. For example, us-east-1. | Required. |
| Account ID | The AWS account ID. | Optional. |
| Role ARN | The Amazon Resource Name (ARN) of the role to assume when setting up the connection. | Required if Connection type is Assume External Role. |
| External ID | The external ID used in the IAM role's trust policy to identify the party allowed to assume the role, if applicable. | Required if Connection type is Assume External Role. |
| Access key | The access key of the AWS account. | Required if Connection type is Access Key/Secret. |
| Access secret | The access secret of the AWS account. | Required if Connection type is Access Key/Secret. |
| Description | A description of your integration. | Required. |

- (Optional) Toggle the Active button off if you don’t want to activate the integration instantly.
- Select Save.
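If you prefer to script this step, Orkes Conductor also exposes integration configuration through its REST API. The sketch below is an illustration only: the endpoint path, the integration type value, and the configuration field names are assumptions based on the parameters in the table above, so verify them against the Integration Resource section of the Conductor API reference before use.

```python
import requests

CONDUCTOR_URL = "https://your-cluster.orkesconductor.io/api"        # placeholder cluster URL
HEADERS = {"X-Authorization": "<JWT-from-your-application-key>"}     # placeholder auth token

# Assumed payload shape; field names mirror the parameters described in the table above.
payload = {
    "type": "awsbedrocktitan",                   # assumed provider type identifier
    "category": "AI_MODEL",
    "enabled": True,
    "description": "AWS Bedrock Titan for text generation workflows",
    "configuration": {
        "connectionType": "ACCESS_KEY_SECRET",   # assumed enum value
        "region": "us-east-1",
        "accessKey": "<AWS_ACCESS_KEY_ID>",
        "accessSecret": "<AWS_SECRET_ACCESS_KEY>",
    },
}

resp = requests.put(
    f"{CONDUCTOR_URL}/integrations/provider/aws-bedrock-titan",      # assumed path
    json=payload,
    headers=HEADERS,
)
resp.raise_for_status()
print("Integration saved:", resp.status_code)
```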
Step 3: Add AWS Bedrock Titan models
Once you’ve integrated AWS Bedrock Titan, the next step is to configure specific models. AWS Bedrock Titan has different models, each designed for various use cases. Choose the model that best fits your use case.
To add a model to the AWS Bedrock Titan integration:
- Go to Integrations and select the + button next to the integration you created.

- Select + New model.
- Enter the Model name. The name must exactly match the model ID in AWS Bedrock, for example, amazon.titan-text-express-v1. For a complete list, see the AWS Bedrock Titan documentation.
- Provide a Description.

- (Optional) Toggle the Active button off if you don’t want to activate the model instantly.
- Select Save.
This saves the model for future use in AI tasks within Orkes Conductor.
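To double-check that a model ID is spelled correctly and that your AWS credentials can actually invoke it, you can call the model directly with boto3 before wiring it into a task. This is a minimal sketch assuming the Titan Text G1 - Express model (amazon.titan-text-express-v1) and that model access has already been granted in the Bedrock console.

```python
import json
import boto3

# Placeholders: use the same region and credentials you configured in the integration.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "inputText": "Summarize what Orkes Conductor does in one sentence.",
    "textGenerationConfig": {"maxTokenCount": 128, "temperature": 0.5},
}

response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",  # must match the Model name added in Conductor
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```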
Step 4: Set access limits to the integration
Once the integration is configured, set access controls to manage which applications or groups can use the models.
To provide access to an application or group:
- Go to Access Control > Applications or Groups from the left navigation menu on your Conductor cluster.
- Create a new group/application or select an existing one.
- In the Permissions section, select + Add Permission.
- In the Integration tab, select the required AI models and toggle the necessary permissions.
- Select Add Permissions.

The group or application can now access the AI model according to the configured permissions.
With the integration in place, you can now create workflows using AI/LLM tasks.
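As a starting point, an AI task in a workflow definition references the integration and model by name. The fragment below is a sketch rather than a definitive schema: it assumes the LLM Text Complete system task (type LLM_TEXT_COMPLETE) with input parameters llmProvider and model, and uses a hypothetical prompt template named summarize_ticket; verify the exact fields against the Orkes AI task reference.

```python
# Sketch of a workflow task that uses the Bedrock Titan integration added above.
# Field names follow the LLM Text Complete system task as commonly documented;
# confirm them against the Orkes AI task reference before registering the workflow.
llm_task = {
    "name": "summarize",
    "taskReferenceName": "summarize_ref",
    "type": "LLM_TEXT_COMPLETE",
    "inputParameters": {
        "llmProvider": "aws-bedrock-titan",        # integration name from Step 2
        "model": "amazon.titan-text-express-v1",   # model added in Step 3
        "promptName": "summarize_ticket",          # hypothetical prompt template
        "promptVariables": {"ticket": "${workflow.input.ticket}"},
    },
}
```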