AWS Bedrock Anthropic Integration with Orkes Conductor
To use system AI tasks in Orkes Conductor, you must integrate your Conductor cluster with the necessary AI/LLM providers. This guide explains how to integrate AWS Bedrock Anthropic with Orkes Conductor. Here’s an overview:
- Get the required credentials from AWS Bedrock Anthropic.
- Configure a new AWS Bedrock Anthropic integration in Orkes Conductor.
- Add models to the integration.
- Set access limits on the AI models to govern which applications or groups can use them.
Step 1: Get the AWS Bedrock Anthropic credentials
To integrate AWS Bedrock Anthropic with Orkes Conductor, retrieve the following credentials from your AWS account:
- AWS account ID and region
- (If assuming a role from another AWS account) The Amazon Resource Name (ARN) of the role and its external ID
- (If connecting with credentials) The access key and secret for the AWS account
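If you plan to use the Assume External Role connection type, the IAM role in your AWS account needs a trust policy that lets the Conductor cluster's principal assume it with the external ID. The sketch below is illustrative only; the principal ARN and external ID are placeholders to replace with your own values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "my-external-id" }
      }
    }
  ]
}
```

The `sts:ExternalId` condition ensures the role can only be assumed by a caller that supplies the matching external ID, which is the value you enter in the integration form.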
Step 2: Add an integration for AWS Bedrock Anthropic
After obtaining the credentials, add an AWS Bedrock Anthropic integration to your Conductor cluster.
To create an AWS Bedrock Anthropic integration:
- Go to Integrations from the left navigation menu on your Conductor cluster.
- Select + New integration.
- In the AI/LLM section, choose AWS Bedrock Anthropic.
- Select + Add and enter the following parameters:
| Parameter | Description | Required/Optional |
| --- | --- | --- |
| Integration name | A name for the integration. | Required. |
| Connection type | The method used to establish the connection. Supported values: **Assume External Role**, **Access Key/Secret**. | Required. |
| Region | The valid AWS region where the resource is located. For example, us-east-1. | Required. |
| Account ID | The AWS account ID. | Optional. |
| Role ARN | The Amazon Resource Name (ARN) of the role to assume. | Required if Connection type is Assume External Role. |
| External ID | The external ID used in the IAM role's trust policy to identify who is allowed to assume the role, if applicable. | Required if Connection type is Assume External Role. |
| Access key | The access key of the AWS account. | Required if Connection type is Access Key/Secret. |
| Access secret | The access secret of the AWS account. | Required if Connection type is Access Key/Secret. |
| Description | A description of your integration. | Required. |
- (Optional) Toggle the Active button off if you don’t want to activate the integration instantly.
- Select Save.
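The same integration can also be created programmatically through the Conductor REST API. The sketch below only builds the request body; the endpoint path and the field names (`category`, `type`, `configuration`, and so on) are assumptions based on typical Conductor integration payloads, so verify them against your cluster's API reference before use:

```python
import json

def build_bedrock_anthropic_integration(region, role_arn, external_id, description):
    """Build a hypothetical integration payload for an Assume External Role connection.

    Field names are assumptions; check your cluster's API docs.
    """
    return {
        "category": "AI_MODEL",         # assumed category label
        "type": "awsbedrockanthropic",  # assumed provider type key
        "description": description,
        "enabled": True,
        "configuration": {
            "connectionType": "Assume External Role",
            "region": region,
            "roleArn": role_arn,
            "externalId": external_id,
        },
    }

payload = build_bedrock_anthropic_integration(
    region="us-east-1",
    role_arn="arn:aws:iam::111122223333:role/conductor-bedrock-access",
    external_id="my-external-id",
    description="AWS Bedrock Anthropic integration",
)
# The payload would then be sent to the cluster, e.g.:
# PUT https://<your-cluster>/api/integrations/provider/bedrock-anthropic-prod
print(json.dumps(payload, indent=2))
```

Toggling `enabled` to `False` mirrors switching the Active button off in the UI.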
Step 3: Add AWS Bedrock Anthropic models
Once you’ve integrated AWS Bedrock Anthropic, the next step is to configure specific models.
AWS Bedrock Anthropic offers several models, such as Claude, Claude 3 Sonnet, and Claude 3 Haiku, each suited to different use cases. Choose the model that best fits yours.
To add a model to the AWS Bedrock Anthropic integration:
- Go to the Integrations page and select the + button next to the newly created integration.
- Select + New model.
- Enter the Model name and a Description. Refer to the AWS Bedrock documentation for the complete list of available Anthropic models.
- (Optional) Toggle the Active button off if you don’t want to activate the model instantly.
- Select Save.
This saves the model for future use in AI tasks within Orkes Conductor.
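Models can likewise be registered via the API. As above, the endpoint and payload shape are assumptions to be checked against your cluster's API reference; the model identifiers follow AWS Bedrock's naming scheme for Anthropic models:

```python
# Hypothetical sketch: register Bedrock Anthropic model IDs under the integration.
# Payload field names are assumptions; the model IDs are Bedrock's naming scheme.
models = [
    {"model": "anthropic.claude-3-sonnet-20240229-v1:0",
     "description": "Claude 3 Sonnet via Bedrock", "enabled": True},
    {"model": "anthropic.claude-3-haiku-20240307-v1:0",
     "description": "Claude 3 Haiku via Bedrock", "enabled": True},
]
for m in models:
    # Each entry would be sent to the cluster, e.g.:
    # PUT https://<your-cluster>/api/integrations/provider/<integration>/integration/<model>
    print(m["model"])
```

Registering only the models you intend to use keeps the permission surface in Step 4 small.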
Step 4: Set access limits to integration
Once the integration is configured, set access controls to manage which applications or groups can use the models.
To provide access to an application or group:
- Go to Access Control > Applications or Groups from the left navigation menu on your Conductor cluster.
- Create a new group/application or select an existing one.
- In the Permissions section, select + Add Permission.
- In the Integration tab, select the required AI models and toggle the necessary permissions.
- Select Add Permissions.
The group or application can now access the AI model according to the configured permissions.
With the integration in place, you can now create workflows using AI/LLM tasks.
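For example, a workflow definition might reference the integration from an LLM text completion task. The task type and input parameter names below (`LLM_TEXT_COMPLETE`, `llmProvider`, `model`, `promptName`) are assumptions based on Conductor's AI task conventions, and the provider name, model ID, and prompt name are placeholders; verify against your cluster's documentation:

```json
{
  "name": "summarize_text",
  "version": 1,
  "tasks": [
    {
      "name": "text_complete",
      "taskReferenceName": "text_complete_ref",
      "type": "LLM_TEXT_COMPLETE",
      "inputParameters": {
        "llmProvider": "bedrock-anthropic-prod",
        "model": "anthropic.claude-3-sonnet-20240229-v1:0",
        "promptName": "summarize_prompt",
        "promptVariables": {
          "text": "${workflow.input.text}"
        }
      }
    }
  ]
}
```

Here `llmProvider` names the integration created in Step 2 and `model` names a model added in Step 3, so the task only runs for applications or groups granted access in Step 4.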