Mistral Integration with Orkes Conductor

To use system AI tasks in Orkes Conductor, you must integrate your Conductor cluster with the necessary AI/LLM providers. This guide explains how to integrate Mistral with Orkes Conductor. Here’s an overview:

  1. Get the required credentials from Mistral.
  2. Configure a new Mistral integration in Orkes Conductor.
  3. Add models to the integration.
  4. Set access limits for the AI models to govern which applications or groups can use them.

Step 1: Get the Mistral credentials

To integrate Mistral with Orkes Conductor, retrieve the API key and endpoint from the Mistral console.

To get the API key:

  1. Sign in to the Mistral console.
  2. Go to API > API Keys from the left menu.
  3. Select Create new key.

API key creation from Mistral

  4. Enter a Key name.
  5. (Optional) Set an Expiration for the key.

API key generation from Mistral

  6. Select Create new key.
  7. Copy and store the generated key.

The default API endpoint for Mistral is https://api.mistral.ai/v1. Use this as the API endpoint when configuring the integration.
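Before configuring the integration, you can optionally sanity-check the key against the endpoint. The sketch below builds an authenticated request to Mistral's model-listing route (`/v1/models`); the `MISTRAL_API_KEY` environment variable is an assumption used here for illustration.

```python
import os
import urllib.request

# Hypothetical: the key is assumed to be exported as MISTRAL_API_KEY.
API_ENDPOINT = "https://api.mistral.ai/v1"
api_key = os.environ.get("MISTRAL_API_KEY", "<your-api-key>")

# Build an authenticated request to the model-listing route.
req = urllib.request.Request(
    f"{API_ENDPOINT}/models",
    headers={"Authorization": f"Bearer {api_key}"},
)

print(req.full_url)
# Uncomment to actually send the request once a valid key is exported:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```

A 200 response listing models confirms the key and endpoint are valid before you enter them in Conductor.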

Step 2: Add an integration for Mistral

After obtaining the credentials, add a Mistral integration to your Conductor cluster.

To create a Mistral integration:

  1. Go to Integrations from the left navigation menu on your Conductor cluster.
  2. Select + New integration.
  3. In the AI/LLM section, choose Mistral.
  4. Select + Add and enter the following parameters:
  - Integration name: A name for the integration.
  - API Key: The API key copied previously from the Mistral console.
  - API Endpoint: The default API endpoint for Mistral, which is https://api.mistral.ai/v1.
  - Description: A description of the integration.

Mistral Integration with Orkes Conductor

  5. (Optional) Toggle the Active button off if you don’t want to activate the integration instantly.
  6. Select Save.
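If you script your cluster setup, the parameters above can be captured as a simple settings record. The structure below is purely illustrative; the field names mirror the Conductor UI, not a specific Orkes API schema, and the integration name is a hypothetical example.

```python
# Illustrative record of the Step 2 integration parameters.
# Field names mirror the Conductor UI, not a specific API payload.
mistral_integration = {
    "integration_name": "mistral-prod",  # hypothetical name
    "api_key": "<paste-key-from-mistral-console>",
    "api_endpoint": "https://api.mistral.ai/v1",
    "description": "Mistral integration for AI tasks",
    "active": True,  # toggle off to defer activation
}

print(sorted(mistral_integration))
```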

Step 3: Add Mistral models

Once you’ve integrated Mistral, the next step is to configure specific models.

Mistral offers a range of models, such as Mistral 7B, Mixtral 8x7B, and Mixtral 8x22B, each designed for different use cases like text completion and embedding generation. Choose the model that best fits your use case.

To add a model to the Mistral integration:

  1. Go to the Integrations page and select the + button next to the integration created.

Create Mistral Integration Model from Listed Integrations

  2. Select + New model.
  3. Enter the Model name and a Description. Refer to Mistral’s documentation for the complete list of available models.

Create Mistral Integration Model

  4. (Optional) Toggle the Active button off if you don’t want to activate the model instantly.
  5. Select Save.

This saves the model for future use in AI tasks within Orkes Conductor.
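When registering several models, it can help to keep their names and intended uses in one place. The identifiers below are examples of Mistral model names; verify the exact identifiers against Mistral's current model list before adding them.

```python
# Example Mistral model identifiers (verify against Mistral's model list);
# the descriptions are illustrative.
registered_models = [
    {"model": "open-mistral-7b", "description": "General text completion"},
    {"model": "open-mixtral-8x7b", "description": "Mixture-of-experts completion"},
    {"model": "mistral-embed", "description": "Embedding generation"},
]

for m in registered_models:
    print(m["model"])
```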

Step 4: Set access limits to integration

Once the integration is configured, set access controls to manage which applications or groups can use the models.

To provide access to an application or group:

  1. Go to Access Control > Applications or Groups from the left navigation menu on your Conductor cluster.
  2. Create a new group/application or select an existing one.
  3. In the Permissions section, select + Add Permission.
  4. In the Integration tab, select the required AI models and toggle the necessary permissions.
  5. Select Add Permissions.

Add Permissions for Integrations

The group or application can now access the AI model according to the configured permissions.

With the integration in place, you can now create workflows using AI/LLM tasks.
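As a sketch of what comes next, an AI task in a workflow definition references the integration and model by name. The structure below is illustrative only: the integration name (`mistral-prod`), prompt name, and exact field names are assumptions, so check the Orkes AI task reference for the authoritative schema.

```python
import json

# Illustrative LLM task definition; "mistral-prod" and the prompt name
# are hypothetical, and the schema should be verified against the
# Orkes Conductor AI task documentation.
llm_task = {
    "name": "generate_summary",
    "taskReferenceName": "generate_summary_ref",
    "type": "LLM_TEXT_COMPLETE",
    "inputParameters": {
        "llmProvider": "mistral-prod",
        "model": "open-mistral-7b",
        "promptName": "summarize_article",
        "promptVariables": {"text": "${workflow.input.text}"},
    },
}

print(json.dumps(llm_task, indent=2))
```

The application or group executing this workflow must have the permissions configured in Step 4 for the referenced model.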
