Integrating with Cohere in Orkes Conductor
To effectively utilize AI and LLM tasks in Orkes Conductor, it's essential to integrate your Orkes Conductor cluster with the necessary AI and LLM models.
Cohere offers a range of models that can be incorporated into the Orkes Conductor cluster. The choice of model depends on your unique use case, the functionalities you require, and the specific natural language processing tasks you intend to tackle.
This guide will provide the steps for integrating the Cohere provider with Orkes Conductor.
Steps to integrate with Cohere
Before beginning to integrate with Cohere, you need to generate the API key and get the API endpoint from the Cohere console.
To generate the API key:
- Log in to the Cohere console.
- Navigate to API keys and generate the API key.
The base URL for Cohere is https://api.cohere.ai/v1, which serves as the API Endpoint for integrating with Cohere.
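Optionally, you can confirm that the generated key works before wiring it into Orkes Conductor. The sketch below (Python with the requests library) sends a minimal request to Cohere's v1 chat endpoint; the endpoint path and response shape are assumptions based on Cohere's v1 API, so consult the Cohere API reference if the call fails.

```python
# Minimal sketch: sanity-check a Cohere API key before adding it to Orkes Conductor.
# Assumes Cohere's v1 chat endpoint (https://api.cohere.ai/v1/chat); verify against
# Cohere's current API reference if this request fails.
import requests

COHERE_API_ENDPOINT = "https://api.cohere.ai/v1"   # base URL used as the API Endpoint
COHERE_API_KEY = "<your-cohere-api-key>"           # key generated in the Cohere console

response = requests.post(
    f"{COHERE_API_ENDPOINT}/chat",
    headers={"Authorization": f"Bearer {COHERE_API_KEY}"},
    json={"message": "Reply with the single word: pong"},
    timeout=30,
)
response.raise_for_status()          # raises if the key or endpoint is invalid
print(response.json().get("text"))   # a short model reply confirms the key works
```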
Integrating with Cohere as a model provider
Let’s integrate Cohere with Orkes Conductor.
- Navigate to Integrations from the left menu on your Orkes Conductor cluster.
- Click the +New integration button in the top-right corner.
- Under the AI/LLM section, choose Cohere.
- Click +Add and provide the following parameters:
| Parameter | Description |
| --- | --- |
| Integration name | A name for the integration. |
| API Key | The API key to integrate Cohere with Orkes Conductor. Refer to the previous section on how to generate the API key. |
| API Endpoint | The API endpoint from your Cohere console. The base URL of the API endpoint is of the format https://api.cohere.ai/v1. |
| Description | A description of your integration. |
- You can toggle on the Active button to activate the integration instantly.
- Click Save.
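If you prefer to provision the integration programmatically rather than through the UI, Orkes Conductor also exposes a REST API. The sketch below is a hypothetical illustration: the endpoint path, header, and payload fields are assumptions modeled on the UI form above, so verify them against your cluster's API (Swagger) documentation before relying on it.

```python
# Hypothetical sketch: create the Cohere integration via the Orkes Conductor REST API
# instead of the UI. The endpoint path and payload fields below are assumptions based
# on the UI form; confirm both against your cluster's API documentation.
import requests

CONDUCTOR_SERVER = "https://<your-cluster>.orkesconductor.io"  # your cluster URL
CONDUCTOR_TOKEN = "<conductor-access-token>"                   # token from your app's key/secret

integration_name = "cohere-text"   # hypothetical integration name
payload = {
    "type": "cohere",                      # provider type, as selected in the UI
    "category": "AI_MODEL",                # AI/LLM integration category
    "enabled": True,                       # same as toggling Active in the UI
    "description": "Cohere models for LLM tasks",
    "configuration": {
        "api_key": "<your-cohere-api-key>",
        "endpoint": "https://api.cohere.ai/v1",
    },
}

resp = requests.post(
    f"{CONDUCTOR_SERVER}/api/integrations/provider/{integration_name}",
    headers={"X-Authorization": CONDUCTOR_TOKEN},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("Integration saved:", integration_name)
```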
Adding Cohere models to integration
You have now integrated your Orkes Conductor cluster with the Cohere provider. The next step is to integrate with the specific models. Cohere offers different models, such as command, command-r, and embed, each intended for different use cases, such as text completion or generating embeddings.
Depending on your use case, you must configure the required model within your Cohere configuration.
To add a new model to the Cohere integration:
- Navigate to the integrations page and click the '+' button next to the integration you created.
- Click +New model.
- Provide the model name and an optional description for the model. The complete list of models is available in the Cohere documentation.
- Toggle on the Active button to enable the model immediately.
- Click Save.
This ensures the integration model is saved for future use in LLM tasks within Orkes Conductor.
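For illustration, here is a hedged sketch of how the integration and model names might later be referenced from an LLM task. The LLM_TEXT_COMPLETE task type, the parameter names, and the prompt name document_summary are assumptions based on typical Orkes LLM task definitions; check the LLM task reference documentation for the authoritative schema.

```python
# Hedged sketch: how an LLM_TEXT_COMPLETE task in a workflow definition might reference
# the Cohere integration and model added above. "cohere-text", "command", and the prompt
# name are placeholders; verify the exact input parameter names in the Orkes LLM task docs.
llm_text_complete_task = {
    "name": "generate_summary",
    "taskReferenceName": "generate_summary_ref",
    "type": "LLM_TEXT_COMPLETE",
    "inputParameters": {
        "llmProvider": "cohere-text",          # the integration name created earlier
        "model": "command",                    # the model added to the integration
        "promptName": "document_summary",      # an existing AI prompt template (hypothetical)
        "promptVariables": {
            "text": "${workflow.input.text}"   # Conductor expression passing workflow input
        },
        "temperature": 0.3,
        "topP": 0.9,
    },
}
```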
RBAC - Governance on who can use Integrations
The integration with the required models is now ready. The next step is to determine who can access these models.
Permissions can be granted to applications or groups within the Orkes Conductor cluster.
To provide explicit permission to Groups:
- Navigate to Access Control > Groups from the left menu on your Orkes Conductor cluster.
- Create a new group or choose an existing group.
- Under the Permissions section, click +Add Permission.
- Under the Integrations tab, select the required integrations with the required permissions.
- Click Add Permissions. This ensures that all the group members can access these integration models in their workflows.
Similarly, you can also provide permissions to applications.
Once the integration is ready, start creating workflows with LLM tasks.
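As a starting point, the sketch below registers a minimal workflow containing one LLM task that references the Cohere integration and model configured above. The workflow name, prompt name, and output field are placeholders, and the metadata endpoint and X-Authorization header are assumptions based on the standard Conductor/Orkes APIs; confirm both against your cluster's documentation.

```python
# Hedged sketch: register a minimal workflow that uses the Cohere integration through an
# LLM_TEXT_COMPLETE task, via the Conductor metadata API (POST /api/metadata/workflow).
# Workflow name, prompt name, and output field are placeholders; the X-Authorization
# header carries an Orkes access token obtained from an application's key and secret.
import requests

CONDUCTOR_SERVER = "https://<your-cluster>.orkesconductor.io"
CONDUCTOR_TOKEN = "<conductor-access-token>"

workflow_def = {
    "name": "cohere_summarizer",               # placeholder workflow name
    "version": 1,
    "schemaVersion": 2,
    "inputParameters": ["text"],
    "tasks": [
        {
            "name": "generate_summary",
            "taskReferenceName": "generate_summary_ref",
            "type": "LLM_TEXT_COMPLETE",
            "inputParameters": {
                "llmProvider": "cohere-text",      # integration name created earlier
                "model": "command",                # model added to the integration
                "promptName": "document_summary",  # hypothetical prompt template
                "promptVariables": {"text": "${workflow.input.text}"},
            },
        }
    ],
    "outputParameters": {
        "summary": "${generate_summary_ref.output.result}"  # assumed output field name
    },
}

resp = requests.post(
    f"{CONDUCTOR_SERVER}/api/metadata/workflow",
    headers={"X-Authorization": CONDUCTOR_TOKEN},
    json=workflow_def,
    timeout=30,
)
resp.raise_for_status()
print("Registered workflow:", workflow_def["name"])
```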