Grok Integration with Orkes Conductor
To use system AI tasks in Orkes Conductor, you must integrate your Conductor cluster with the necessary AI/LLM providers. This guide explains how to integrate Grok with Orkes Conductor. Here’s an overview:
- Get the required credentials from Grok.
- Configure a new Grok integration in Orkes Conductor.
- Add models to the integration.
- Set access limits to the AI models to govern which applications or groups can use them.
Step 1: Get the Grok credentials
To integrate Grok with Orkes Conductor, retrieve the API key from the Grok console.
To get the API key:
- Sign in to the Grok console.
- Go to API Keys and select Create API Key.
- Enter a Name and select Create.
- Copy and store the generated key securely, as it is shown only once.
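Before adding the key to Conductor, you can optionally verify that it works. The sketch below assumes Grok's OpenAI-compatible chat completions endpoint at `https://api.x.ai/v1/chat/completions` and the `grok-beta` model; adjust both to match your account.

```python
import os
import requests

# Read the key from an environment variable rather than hardcoding it.
GROK_API_KEY = os.environ["GROK_API_KEY"]

# Assumed endpoint: Grok exposes an OpenAI-compatible chat completions API.
response = requests.post(
    "https://api.x.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {GROK_API_KEY}"},
    json={
        "model": "grok-beta",  # any model enabled for your account
        "messages": [{"role": "user", "content": "Say hello"}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```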
Step 2: Add an integration for Grok
After obtaining the credentials, add a Grok integration to your Conductor cluster.
To create a Grok integration:
- Go to Integrations from the left navigation menu on your Conductor cluster.
- Select + New integration.
- In the AI/LLM section, choose Grok.
- Select + Add and enter the following parameters:
| Parameters | Description |
| --- | --- |
| Integration name | A name for the integration. |
| API Key | The API key copied previously from the Grok platform. |
| Description | A description of the integration. |
- (Optional) Toggle the Active button off if you don’t want to activate the integration instantly.
- Select Save.
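If you prefer to script the setup, the integration can also be created through the Conductor API. The endpoint path, payload fields, and integration type below are illustrative assumptions; confirm them against the Integration Resource section of your cluster's API reference before use.

```python
import os
import requests

# Replace with your cluster's base URL and an auth token for a Conductor application.
CONDUCTOR_API = "https://your-cluster.orkesconductor.io/api"
headers = {
    "X-Authorization": os.environ["CONDUCTOR_AUTH_TOKEN"],
    "Content-Type": "application/json",
}

# Hypothetical endpoint and payload: verify the exact path and field names
# in your cluster's API reference before running this.
resp = requests.post(
    f"{CONDUCTOR_API}/integrations/provider/grok-integration",
    headers=headers,
    json={
        "category": "AI_MODEL",
        "type": "grok",
        "enabled": True,
        "description": "Grok integration for AI/LLM tasks",
        "configuration": {"api_key": os.environ["GROK_API_KEY"]},
    },
    timeout=30,
)
resp.raise_for_status()
```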
Step 3: Add Grok models
Once you’ve integrated Grok, the next step is to configure specific models.
Grok offers different models, such as grok-beta, grok-vision-beta, grok-2-vision-1212, and more, each suited to different use cases. Choose the model that best fits your requirements.
To add a model to the Grok integration:
- Go to Integrations and select the + button next to the integration you created.
- Select + New model.
- Enter the Model name and a Description. Refer to the Grok documentation for the complete list of models, or list them programmatically as sketched after these steps.
- (Optional) Toggle the Active button off if you don’t want to activate the model instantly.
- Select Save.
This saves the model for future use in AI tasks within Orkes Conductor.
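To check which model names are available for your account before registering them, you can query the Grok platform directly. The sketch below assumes an OpenAI-compatible model listing endpoint at `https://api.x.ai/v1/models`; adjust if your account uses a different base URL.

```python
import os
import requests

# Assumed endpoint: an OpenAI-compatible model listing API on the Grok platform.
response = requests.get(
    "https://api.x.ai/v1/models",
    headers={"Authorization": f"Bearer {os.environ['GROK_API_KEY']}"},
    timeout=30,
)
response.raise_for_status()
for model in response.json()["data"]:
    print(model["id"])  # e.g., grok-beta, grok-vision-beta, ...
```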
Step 4: Set access limits to integration
Once the integration is configured, set access controls to manage which applications or groups can use the models.
To provide access to an application or group:
- Go to Access Control > Applications or Groups from the left navigation menu on your Conductor cluster.
- Create a new group/application or select an existing one.
- In the Permissions section, select + Add Permission.
- In the Integration tab, select the required AI models and toggle the necessary permissions.
- Select Add Permissions.
The group or application can now access the AI model according to the configured permissions.
With the integration in place, you can now create workflows using AI/LLM tasks.
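For example, a task that calls the new integration might look like the following sketch. The task type and parameter names (llmProvider, model, promptName) follow the LLM Text Complete system task; verify them against the AI task reference for your cluster, and replace the integration, model, and prompt names with your own.

```python
# A single LLM task inside a workflow definition, expressed as a Python dict.
# Parameter names follow the LLM Text Complete system task; confirm them
# against your cluster's AI task documentation.
llm_task = {
    "name": "generate_summary",
    "taskReferenceName": "generate_summary_ref",
    "type": "LLM_TEXT_COMPLETE",
    "inputParameters": {
        "llmProvider": "grok-integration",  # the integration name from Step 2
        "model": "grok-beta",               # a model added in Step 3
        "promptName": "summarize_text",     # an existing AI prompt in your cluster
        "promptVariables": {"text": "${workflow.input.text}"},
    },
}
```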