Google Gemini AI Integration with Orkes Conductor

To use system AI tasks in Orkes Conductor, you must integrate your Conductor cluster with the necessary AI/LLM providers. This guide explains how to integrate Google Gemini AI with Orkes Conductor. Here’s an overview:

  1. Get the required credentials from Google Gemini AI.
  2. Configure a new Google Gemini AI integration in Orkes Conductor.
  3. Add models to the integration.
  4. Set access limits for the AI models to govern which applications or groups can use them.

Step 1: Get the Google Gemini AI credentials

To integrate Google Gemini AI with Orkes Conductor, retrieve the project ID and service account JSON from the Google Cloud console.

Get the project ID

To get the project ID:

  1. Sign in to the Google Cloud Console.
  2. Create a new project or select an existing one.
  3. Get the Project ID from the dashboard.

For more information, refer to the official documentation on creating and managing projects in GCP.
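
If you want to double-check the value from a script, the following is a minimal sketch using the google-auth library, assuming Application Default Credentials are already configured locally (for example via `gcloud auth application-default login`):

```python
# Optional check: print the project ID that your local Application Default
# Credentials resolve to, so you can confirm it matches the dashboard value.
# Assumes `pip install google-auth`.
import google.auth

credentials, project_id = google.auth.default()
print(f"Active GCP project: {project_id}")
```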

Get the service account JSON

To get the service account JSON:

  1. Go to IAM & Admin > Service Accounts from the left menu on your GCP console.
  2. Create a new service account or select an existing one.
  3. In the KEYS tab, select ADD KEY > Create new key.
  4. Select the key type as JSON.
  5. Select Create to download the JSON file.
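
Before uploading the key to Orkes Conductor, you can optionally verify it locally. The sketch below assumes the google-auth library; the file name is a placeholder for the JSON key you downloaded.

```python
# Optional sanity check on the downloaded key before uploading it to Orkes
# Conductor. Assumes `pip install google-auth`.
import json

from google.oauth2 import service_account

KEY_PATH = "my-project-sa-key.json"  # hypothetical file name

with open(KEY_PATH) as f:
    key_data = json.load(f)

# A valid service account key includes these fields, among others.
for field in ("project_id", "client_email", "private_key"):
    assert field in key_data, f"missing field: {field}"

# Confirm the key parses into usable credentials.
creds = service_account.Credentials.from_service_account_file(KEY_PATH)
print(f"{creds.service_account_email} (project: {key_data['project_id']})")
```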

To use Google Gemini AI with Orkes Conductor, you must enable the Gemini API from the GCP console.

Enable Gemini API

To enable the Gemini API:

  1. Go to APIs & Services > Enabled APIs & services from the left menu on your GCP console.
  2. Select + ENABLE APIS AND SERVICES.
  3. In the API Library, search for Gemini API.
  4. Select ENABLE.

Once enabled, the Gemini API is ready for use with your GCP project.
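
You can also confirm the API state from a script using the Service Usage API, as in the sketch below. Note that the service name is an assumption: Gemini models served through Vertex AI use aiplatform.googleapis.com, so adjust it if the API you enabled in the console has a different service name.

```python
# Optional check that the API is enabled for your project, via the Service
# Usage API. Assumes `pip install google-api-python-client google-auth` and
# Application Default Credentials.
import google.auth
from googleapiclient.discovery import build

credentials, project_id = google.auth.default()
serviceusage = build("serviceusage", "v1", credentials=credentials)

# Assumed service name; change it to match the API enabled in the console.
name = f"projects/{project_id}/services/aiplatform.googleapis.com"
state = serviceusage.services().get(name=name).execute().get("state")
print(f"{name}: {state}")  # expect "ENABLED"
```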

Step 2: Add an integration for Google Gemini AI

After obtaining the credentials, add a Google Gemini AI integration to your Conductor cluster.

To create a Google Gemini AI integration:

  1. Go to Integrations from the left navigation menu on your Conductor cluster.
  2. Select + New integration.
  3. In the AI/LLM section, choose Google Gemini AI.
  4. Select + Add and enter the following parameters:
     - Integration name: A name for the integration.
     - Project ID: The Project ID retrieved from the GCP console.
     - Location: The Google Cloud region of your GCP account.
     - Choose Service account credentials JSON: Upload the service account JSON file, which is a key file containing the credentials for authenticating the Orkes Conductor cluster with the GCP services.
     - Description: A description of your integration.

  5. (Optional) Toggle the Active button off if you don’t want to activate the integration instantly.
  6. Select Save.
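
The UI flow above is the documented path. If you prefer to script this step, the sketch below shows one way it could look against the Conductor REST API. The /api/token exchange follows Orkes’ application key/secret authentication; the integration endpoint, provider type, and payload keys are assumptions patterned on Orkes’ Integration API and may differ between Conductor versions, so verify them against your cluster’s API reference.

```python
# A minimal sketch, not a verified recipe: the endpoint path, provider type,
# and payload keys are assumptions and should be checked against your
# cluster's API documentation. Requires `pip install requests`.
import requests

CONDUCTOR_URL = "https://your-cluster.orkesconductor.io/api"  # hypothetical URL

# Exchange an application key/secret for a short-lived token.
token = requests.post(
    f"{CONDUCTOR_URL}/token",
    json={"keyId": "your-key-id", "keySecret": "your-key-secret"},
).json()["token"]

with open("my-project-sa-key.json") as f:  # service account key from Step 1
    service_account_json = f.read()

payload = {
    "category": "AI_MODEL",
    "type": "gemini_ai",                  # assumed provider type identifier
    "description": "Gemini integration for document workflows",
    "enabled": True,
    "configuration": {
        "projectId": "my-gcp-project",    # Project ID from Step 1
        "location": "us-central1",        # your Google Cloud region
        "serviceAccountCredentials": service_account_json,
    },
}

resp = requests.put(
    f"{CONDUCTOR_URL}/integrations/provider/my-gemini-integration",
    headers={"X-Authorization": token},
    json=payload,
)
resp.raise_for_status()
```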

Step 3: Add Google Gemini AI models

Once you’ve integrated Google Gemini AI, the next step is to configure specific models.

Google Gemini AI provides different models, such as Gemini 2.0 Flash, Gemini 1.5 Flash, and more, each designed for different use cases. Choose the model that best fits your requirements.

To add a model to the Google Gemini AI integration:

  1. Go to the Integrations page and select the + button next to the integration you created.

  2. Select + New model.
  3. Enter the Model name and a Description. Get the complete list of Gemini AI models.

  4. (Optional) Toggle the Active button off if you don’t want to activate the model instantly.
  5. Select Save.

This saves the model for future use in AI tasks within Orkes Conductor.
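
As with Step 2, the model can also be registered programmatically. The sketch below assumes the model endpoint mirrors the integration endpoint used earlier; the path and body are assumptions and should be verified against your cluster’s API reference. "gemini-1.5-flash" is an example model name.

```python
# A minimal sketch, assuming the model endpoint mirrors the integration
# endpoint from Step 2; verify the path and body against your cluster's API
# documentation. Requires `pip install requests`.
import requests

CONDUCTOR_URL = "https://your-cluster.orkesconductor.io/api"  # hypothetical URL
token = requests.post(
    f"{CONDUCTOR_URL}/token",
    json={"keyId": "your-key-id", "keySecret": "your-key-secret"},
).json()["token"]

resp = requests.put(
    f"{CONDUCTOR_URL}/integrations/provider/my-gemini-integration"
    "/integration/gemini-1.5-flash",
    headers={"X-Authorization": token},
    json={"description": "Fast general-purpose Gemini model", "enabled": True},
)
resp.raise_for_status()
```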

Step 4: Set access limits to integration

Once the integration is configured, set access controls to manage which applications or groups can use the models.

To provide access to an application or group:

  1. Go to Access Control > Applications or Groups from the left navigation menu on your Conductor cluster.
  2. Create a new group/application or select an existing one.
  3. In the Permissions section, select + Add Permission.
  4. In the Integration tab, select the required AI models and toggle the necessary permissions.
  5. Select Add Permissions.

The group or application can now access the AI model according to the configured permissions.

With the integration in place, you can now create workflows using AI/LLM tasks.
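
To illustrate where the integration and model names end up, here is a rough sketch of an AI task definition that references them. It assumes the LLM_TEXT_COMPLETE system task and its llmProvider, model, and promptName parameters; check the AI task reference for the exact schema in your Conductor version. "my-gemini-integration", "gemini-1.5-flash", and "summarize-document" are the example names used in the earlier sketches.

```python
# A minimal sketch of a workflow task referencing the configured integration
# and model. Parameter names are assumptions based on the LLM_TEXT_COMPLETE
# system task; verify them against the AI task reference.
llm_task = {
    "name": "summarize_text",
    "taskReferenceName": "summarize_text_ref",
    "type": "LLM_TEXT_COMPLETE",
    "inputParameters": {
        "llmProvider": "my-gemini-integration",  # integration name from Step 2
        "model": "gemini-1.5-flash",             # model added in Step 3
        "promptName": "summarize-document",      # a hypothetical prompt template
        "promptVariables": {"text": "${workflow.input.text}"},
    },
}
```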
