AI Prompt Templates

Orkes Conductor enables you to create, refine, and securely share the prompts your organization develops as part of the business logic for which you employ LLMs. This essential component of an AI application is managed with precise access controls, allowing you to determine which models a prompt can be associated with and which teams can incorporate it into their workflows.

The AI prompts can be created in the Orkes Conductor cluster and can be used in LLM tasks within your workflows.


To create a prompt template, provide the following parameters:

- Prompt Name: A name for the prompt.
- Description: A description of the prompt.
- Prompt Template: The prompt template itself. A prompt can include input text/context, instructions, questions, and more, and should be written and fine-tuned for the context in which it will be used.

A key feature of prompt templates is the placeholders you can embed in a prompt. These act as variables, letting you use the prompt much like an API interface. When configuring the system task in a workflow where this prompt template will be used, these placeholders are bound to specific variables available in the workflow. At runtime, the placeholders are replaced with the actual values before the prompt is sent to the LLM.

For example, if your prompt is "What is the current population of ${country}? What was the population in ${year}?", it contains two placeholders, country and year, which can be associated with any variable in the workflow where this prompt is used.
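The runtime replacement described above behaves like ${}-style template expansion. A minimal sketch, using Python's string.Template as a stand-in for Conductor's internal substitution mechanism (the variable values are illustrative):

```python
from string import Template

# Prompt template as stored in Conductor, with ${} placeholders.
prompt_template = Template(
    "What is the current population of ${country}? "
    "What was the population in ${year}?"
)

# Values bound from workflow variables at runtime (illustrative).
workflow_vars = {"country": "France", "year": "1990"}

# Placeholders are replaced before the prompt is sent to the LLM.
prompt = prompt_template.substitute(workflow_vars)
print(prompt)
# What is the current population of France? What was the population in 1990?
```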
To test the prompt, set the following parameters:

- Select model to test: From the LLM models configured in your cluster, choose a model for testing the prompt.
- Temperature: Set the required temperature for testing the prompt.
- Stop Words: Provide the stop words to be filtered out.
- TopP: Set the required TopP value for testing the prompt.
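Once saved, the prompt template can be referenced from an LLM task in a workflow definition. Below is an illustrative sketch only: the task type and field names (LLM_TEXT_COMPLETE, promptName, promptVariables, and so on) are assumptions, as are the provider and model names, so check your cluster's task reference for the exact schema.

```python
# Hypothetical workflow task definition referencing a saved prompt template.
# Field names are assumptions, not a verified schema.
llm_task = {
    "name": "ask_population",
    "taskReferenceName": "ask_population_ref",
    "type": "LLM_TEXT_COMPLETE",
    "inputParameters": {
        "llmProvider": "openai",            # assumed integration name
        "model": "gpt-4o",                  # a model the prompt is allowed to use
        "promptName": "population_prompt",  # the saved prompt template
        "promptVariables": {
            # Placeholders bound to workflow inputs at runtime.
            "country": "${workflow.input.country}",
            "year": "${workflow.input.year}",
        },
        "temperature": 0.1,
        "topP": 0.9,
        "stopWords": [],
    },
}
print(sorted(llm_task["inputParameters"]["promptVariables"]))
# ['country', 'year']
```

The promptVariables map is where each placeholder in the template is tied to a concrete workflow value, mirroring the binding step described earlier.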

For a detailed explanation of each parameter and how to grant user groups access to prompts, refer to the developer guide on creating and managing AI prompt templates.