Integrating with AWS Bedrock Anthropic in Orkes Conductor

To effectively utilize AI and LLM tasks in Orkes Conductor, it's essential to integrate your Conductor cluster with the necessary AI and LLM models.

AWS Bedrock Anthropic offers a range of models that can be incorporated into the Orkes Conductor console. The choice of model depends on your unique use case, the functionalities you require, and the specific natural language processing tasks you intend to tackle.

This guide will provide the steps for integrating the AWS Bedrock Anthropic provider with Orkes Conductor.

Steps to integrate with AWS Bedrock Anthropic

Before beginning the integration process in Orkes Conductor, you must get specific configuration credentials from your AWS account.

  • AWS account ID & region where the resource is located.
  • Amazon Resource Name (ARN) to set up the connection.
  • External ID - When you assume a role belonging to another AWS account, you must provide the external ID, which can be used in an IAM role trust policy to designate who is allowed to assume the role.
  • Access key and secret from AWS account.
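If you plan to use the Assume External Role connection type, the external ID is typically enforced in the trust policy of the IAM role being assumed. The fragment below is a minimal sketch of such a trust policy; the account ID, and external ID value are placeholders, not values from this guide:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "my-external-id" }
      }
    }
  ]
}
```

With this policy in place, the role can only be assumed when the caller supplies the matching external ID, which is the value you enter during the integration setup below.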

Integrating with AWS Bedrock Anthropic as a model provider

Let’s integrate AWS Bedrock Anthropic with Orkes Conductor.

  1. Navigate to Integrations from the left menu on your Orkes Conductor console.
  2. Click the +New integration button at the top right of your window.
  3. Under the AI/LLM section, choose AWS Bedrock Anthropic.
  4. Click +Add and provide the following parameters:

Create AWS Bedrock Anthropic Integration

Parameters and their descriptions:

  • Integration name - Provide a name for the integration.
  • Connection type - Choose the required connection type. Depending on how the connection is to be established, it can take the following values:
      • Current Conductor Role - Choose this if you are using the current Conductor role to establish the connection.
      • Assume External Role - Choose this if you are assuming a role belonging to another AWS account.
      • Access Key/Secret - Choose this if you are establishing the connection using an access key and secret.
  • Region - Provide the valid AWS region where the resource is located.
  • Account ID - Provide your AWS account ID. This field is optional.
  • Role ARN - Specify the Amazon Resource Name (ARN) required to set up the connection. Applicable only if the Connection type is Assume External Role.
  • External ID - If applicable, provide the external ID used to assume the role. Applicable only if the Connection type is Assume External Role.
  • Access key - Provide the AWS access key. Applicable only if the Connection type is Access Key/Secret.
  • Access secret - Provide the AWS access secret. Applicable only if the Connection type is Access Key/Secret.
  • Description - Provide a description of your integration.
  5. Toggle on the Active button to activate the integration instantly.
  6. Click Save.

Adding AWS Bedrock Anthropic models to integration

You have now integrated your Conductor console with the AWS Bedrock Anthropic provider. The next step is to integrate with the specific models. AWS Bedrock Anthropic has different models: Claude, Claude 3 Sonnet, Claude 3 Haiku, Claude Instant, and more. Each model is intended for different use cases, such as text completion and generating embeddings.

Depending on your use case, you must configure the required model within your AWS Bedrock Anthropic configuration.

To add a new model to the AWS Bedrock Anthropic integration:

  1. Navigate to the integrations page and click the '+' button next to the integration you created.

Create AWS Bedrock Anthropic Integration Model from Listed Integrations

  2. Click +New model.
  3. Provide the model name and an optional description. The complete list of available models is documented by AWS Bedrock Anthropic.

Create AWS Bedrock Anthropic Integration Model

  4. Toggle on the Active button to enable the model immediately.
  5. Click Save.
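Before relying on a model in LLM tasks, it can help to verify the request shape the model expects outside of Conductor. The sketch below builds the Anthropic Messages API payload that Amazon Bedrock's InvokeModel operation accepts for Claude models; the model ID in the comment is only an example, and actually invoking it would require boto3 plus AWS credentials with Bedrock access:

```python
import json

def build_claude_request(prompt: str, max_tokens: int = 256) -> str:
    """Build a Bedrock Anthropic Messages API request body as a JSON string."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_claude_request("Summarize this order status in one sentence.")
print(json.loads(body)["messages"][0]["role"])  # → user

# Invoking the model itself requires boto3 and AWS credentials, e.g.:
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# resp = client.invoke_model(
#     modelId="anthropic.claude-3-haiku-20240307-v1:0", body=body)
```

Once the payload shape and model ID work against Bedrock directly, the same model name can be added to the integration with confidence.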

This ensures the integration model is saved for future use in LLM tasks within Orkes Conductor.

RBAC - Governance on who can use Integrations

The integration with the required models is now ready. Next, determine who can access these models.

Permissions can be granted to applications and groups within the Orkes Conductor console.

To provide explicit permission to Groups:

  1. Navigate to Access Control > Groups from the left menu on your Orkes Conductor console.
  2. Create a new group or choose an existing group.
  3. Under the Permissions section, click +Add Permission.
  4. Under the Integrations tab, select the required integrations with the required permissions.

Add Permissions for Integrations

  5. Click Add Permissions. This ensures that all group members can access these integration models in their workflows.

Similarly, you can also provide permissions to applications.

Note: Once the integration is ready, start creating workflows with LLM tasks.
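As an illustration of how the integration is referenced from a workflow, the fragment below sketches an LLM text-completion task definition. The integration name, model ID, and prompt name are placeholders, and exact field names may vary by Conductor version, so treat this as a shape to adapt rather than a definitive definition:

```json
{
  "name": "generate_summary",
  "taskReferenceName": "generate_summary_ref",
  "type": "LLM_TEXT_COMPLETE",
  "inputParameters": {
    "llmProvider": "my-bedrock-anthropic-integration",
    "model": "anthropic.claude-3-haiku-20240307-v1:0",
    "promptName": "summary_prompt",
    "promptVariables": {
      "text": "${workflow.input.text}"
    }
  }
}
```

The llmProvider value matches the integration name created earlier, and the model value matches a model added to that integration.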