
Confluent Kafka Integration with Orkes Conductor

Note: The Confluent Kafka configuration is deprecated. For new configurations, use Apache Kafka.

To use the Event task or enable Change Data Capture (CDC) in Orkes Conductor, you must integrate your Conductor cluster with the necessary message brokers. This guide explains how to integrate Confluent Kafka with Orkes Conductor to publish and receive messages from topics. Here’s an overview:

  1. Get the required credentials from Confluent Kafka.
  2. Configure a new Confluent Kafka integration in Orkes Conductor.
  3. Set access limits on the message broker to govern which applications or groups can use it.

Step 1: Get the Confluent Kafka credentials

To integrate Confluent Kafka with Orkes Conductor, retrieve the following credentials from the Confluent Cloud portal:

  • API keys
  • Bootstrap server
  • Schema registry URL

Get the API keys

To retrieve the API keys:

  1. Sign in to the Confluent Cloud portal.
  2. Select the Confluent cluster to integrate with Orkes Conductor.
  3. Go to Cluster Overview > API Keys from the left navigation menu.
  4. Select Create Key > + Add key.
  5. Choose either Global access or Granular access.
  6. Copy and store the Key and Secret.

Generating API Keys from Confluent Cloud

Get the Bootstrap server

To retrieve the Bootstrap server:

  1. Sign in to the Confluent Cloud portal.
  2. Go to Cluster Overview > Cluster Settings > Endpoints.
  3. Copy the Bootstrap server.

Getting the Bootstrap server from Confluent Cloud

  4. Go to Topics and identify the topic name to use for this integration.

Topics in Confluent Cloud
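
Before moving on, you can optionally verify that the API key, secret, and bootstrap server work together by producing a test message with the standard Apache Kafka Java client. This is a minimal sketch, not part of the Orkes setup itself; the bootstrap server, topic name, and credentials are placeholders you must replace with your own values.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ConfluentConnectivityCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Bootstrap server copied from Cluster Settings > Endpoints (placeholder value).
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
        // SASL_SSL / PLAIN, the same option offered as Connection Security in the integration form.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        // The API key and secret from Step 1 act as the username and password.
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "orders" is a placeholder; use the topic identified for this integration.
            producer.send(new ProducerRecord<>("orders", "test-key", "connectivity check"));
            producer.flush();
        }
    }
}
```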

Get the Schema registry server

The Schema registry server, API key, and secret are only required if you are integrating with a schema registry (for AVRO protocol).

To get the Schema registry server and API keys:

  1. Sign in to the Confluent Cloud portal.
  2. Go to Clients > Add new client.
  3. Under Copy the configuration snippet for your clients, copy the schema.registry.url value.
  4. Select Create Schema Registry API key to download the file. The downloaded file will have the Schema Registry API key and secret.

Getting Schema Registry URL
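
If you plan to use the AVRO protocol, the schema registry URL, API key, and secret map onto Confluent's standard serializer settings. The sketch below assumes the io.confluent:kafka-avro-serializer dependency on the classpath; the URL, key, and secret are placeholders.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.StringSerializer;

public class SchemaRegistryClientProps {
    public static Properties avroProducerProps() {
        Properties props = new Properties();
        props.put("key.serializer", StringSerializer.class.getName());
        // KafkaAvroSerializer registers and looks up schemas in the schema registry before sending.
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // URL copied from the client configuration snippet (placeholder value).
        props.put("schema.registry.url", "https://psrc-xxxxx.us-east-2.aws.confluent.cloud");
        // Authenticate with the Schema Registry API key and secret from the downloaded file.
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");
        return props;
    }
}
```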

Step 2: Add an integration for Confluent Kafka

After obtaining the credentials, add a Confluent Kafka integration to your Conductor cluster.

To create a Confluent Kafka integration:

  1. Go to Integrations from the left navigation menu on your Conductor cluster.
  2. Select + New integration.
  3. In the Message Broker section, choose Confluent Kafka.
  4. Select + Add and enter the following parameters:
  • Integration name (Required) – A name for the integration.
  • Bootstrap Server (Required) – The bootstrap server of the Confluent Kafka cluster.
  • Sending Protocol (Required) – The sending protocol for the integration. Supported values:
      • String – Sends messages as plain string data.
      • AVRO – Serializes messages using AVRO. To use a schema registry, select AVRO.
  • Connection Security (Required) – The security mechanism for connecting to the Kafka cluster. Supported values:
      • SASL_SSL / PLAIN – Secure connection using SASL (Simple Authentication and Security Layer) with SSL encryption.
      • SASL_SSL / SCRAM-SHA-256 / JKS – Secure connection using SASL with SCRAM-SHA-256 authentication and SSL encryption. See the sketch after this parameter list.
  • Choose Trust Store file (Required if Connection Security is SASL_SSL / SCRAM-SHA-256 / JKS) – Upload the Java JKS trust store file containing the CAs.
  • Trust Store Password (Required if Connection Security is SASL_SSL / SCRAM-SHA-256 / JKS) – The password for the trust store file.
  • Username (Required) – The username to authenticate with the Kafka cluster. Note: For AVRO configuration, use the previously copied API key as the username.
  • Password (Required) – The password associated with the username. Note: For AVRO configuration, use the previously copied API secret as the password.
  • Schema Registry URL (Required if Sending Protocol is AVRO) – The Schema Registry URL copied from the Confluent Kafka console.
  • Schema Registry Auth Type (Required if Sending Protocol is AVRO) – The authentication mechanism for connecting to the schema registry. Supported values:
      • Password in URL
      • Schema Registry User Info (Key/Password)
      • NONE
  • Schema Registry API Key (Required if Sending Protocol is AVRO and Schema Registry Auth Type is Schema Registry User Info (Key/Password)) – The schema registry API key obtained from the schema registry server.
  • Schema Registry API Secret (Required if Sending Protocol is AVRO and Schema Registry Auth Type is Schema Registry User Info (Key/Password)) – The schema registry API secret obtained from the schema registry server.
  • Value Subject Name Strategy (Required if Sending Protocol is AVRO) – The strategy for constructing the subject name under which the AVRO schema is registered in the schema registry. See the sketch after these steps. Supported values:
      • io.confluent.kafka.serializers.subject.TopicNameStrategy
      • io.confluent.kafka.serializers.subject.RecordNameStrategy
      • io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
  • Consumer Group ID (Required) – The Consumer Group ID from Kafka. This unique identifier helps manage message processing, load balancing, and fault tolerance within consumer groups.
  • Description (Required) – A description of the integration.
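
For reference, the SASL_SSL / SCRAM-SHA-256 / JKS option corresponds to the following standard Kafka client settings; this sketch only illustrates how the trust store file and password supplied in the form are typically used, and every value shown is a placeholder.

```java
import java.util.Properties;

public class ScramJksClientProps {
    public static Properties scramProps() {
        Properties props = new Properties();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        // Username and password, as entered in the integration form (placeholders).
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"<USERNAME>\" password=\"<PASSWORD>\";");
        // The uploaded JKS trust store containing the CAs, plus its password.
        props.put("ssl.truststore.location", "/path/to/truststore.jks");
        props.put("ssl.truststore.password", "<TRUST_STORE_PASSWORD>");
        return props;
    }
}
```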

Confluent Kafka Integration with Orkes Conductor

  5. (Optional) Toggle the Active button off if you don’t want to activate the integration instantly.
  6. Select Save.
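
The Value Subject Name Strategy determines the subject under which the AVRO value schema is registered. As a rough guide to Confluent's standard strategies: TopicNameStrategy uses <topic>-value, RecordNameStrategy uses the fully qualified record name, and TopicRecordNameStrategy combines the topic and record name. A minimal sketch of how a strategy is selected in client configuration (placeholder setup, not part of the Orkes form):

```java
import java.util.Properties;

public class SubjectNameStrategyProps {
    public static void configure(Properties props) {
        // TopicNameStrategy (the default) registers the value schema under "<topic>-value",
        // e.g. "orders-value" for a topic named "orders".
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicNameStrategy");
        // Alternatives: RecordNameStrategy (subject = fully qualified record name) and
        // TopicRecordNameStrategy (subject = "<topic>-<record name>").
    }
}
```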

Step 3: Set access limits to the integration

Once the integration is configured, set access controls to manage which applications or groups can use the message broker.

To provide access to an application or group:

  1. Go to Access Control > Applications or Groups from the left navigation menu on your Conductor cluster.
  2. Create a new group/application or select an existing one.
  3. In the Permissions section, select + Add Permission.
  4. In the Integration tab, select the required message broker and toggle the necessary permissions.

Configuring RBAC for Confluent Kafka Integration

The group or application can now access the message broker according to the configured permissions.

Next steps

With the integration in place, you can now: