Associate Prompt with Integration Model
Endpoint: POST /api/integrations/provider/{integration_provider}/integration/{integration_name}/prompt/{prompt_name}
Associates an existing prompt with a specific model under an integration provider. The prompt, model, and AI/LLM integration must all exist before calling this endpoint.
Path parameters
| Parameter | Description | Type | Required/Optional |
|---|---|---|---|
| integration_provider | The name of the integration provider in Conductor that contains the model. | string | Required. |
| integration_name | The name of the AI model under the integration provider. | string | Required. |
| prompt_name | The name of the prompt to associate with the model. | string | Required. |
Response
| Status | Description |
|---|---|
| 200 OK | The prompt was successfully associated with the model. |
| 403 Forbidden | The authenticated user does not have READ or UPDATE access on the prompt. |
| 404 Not Found | The integration provider, integration, or prompt does not exist. |
Examples
Associate a prompt with an integration provider
Request
```shell
curl -X 'POST' \
  'https://<YOUR-SERVER-URL>/api/integrations/provider/openAI/integration/gpt-4o/prompt/population-prompt' \
  -H 'accept: */*' \
  -H 'X-Authorization: <TOKEN>' \
  -d ''
```
Response
Returns 200 OK, indicating that the prompt is associated with the integration provider.
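The same call can be made from Python. The sketch below builds the endpoint path from the three path parameters and sends an empty-body POST with the standard library; the server URL, token, and resource names are placeholders you would substitute with your own values.

```python
import urllib.request

def build_association_url(base_url: str, provider: str, integration: str,
                          prompt_name: str) -> str:
    """Build the association endpoint path from its three path parameters."""
    return (f"{base_url}/api/integrations/provider/{provider}"
            f"/integration/{integration}/prompt/{prompt_name}")

def associate_prompt(base_url: str, provider: str, integration: str,
                     prompt_name: str, token: str) -> int:
    """POST the association request; returns the HTTP status (200 on success)."""
    req = urllib.request.Request(
        build_association_url(base_url, provider, integration, prompt_name),
        method="POST",
        headers={"accept": "*/*", "X-Authorization": token},
        data=b"",  # the endpoint takes no request body
    )
    # urlopen raises HTTPError for 403 (no access) or 404 (missing resource)
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

For example, `associate_prompt("https://<YOUR-SERVER-URL>", "openAI", "gpt-4o", "population-prompt", "<TOKEN>")` reproduces the curl request above.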