LLM Search Index

A system task that searches a vector database (a repository of vector embeddings for documents that have already been processed and indexed) and returns the closest matches for a query. The query is typically a question, statement, or request made in natural language; it is embedded and compared against the stored vectors to retrieve the most relevant data.

For example, in a recommendation system, a user might issue a query to find products similar to one they recently purchased. The query vector would represent the purchased product, and the database would return a list of products with similar vectors, which are likely candidates to recommend to the user.

Definitions

{
  "name": "llm_search_index_task",
  "taskReferenceName": "llm_search_index_task_ref",
  "inputParameters": {
    "vectorDB": "pineconedb",
    "namespace": "myNewModel",
    "index": "test",
    "llmProvider": "azure_openai",
    "embeddingModel": "text-davinci-003",
    "query": "What is an LLM model?"
  },
  "type": "LLM_SEARCH_INDEX"
}
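
For context, here is a sketch of how this task definition might sit inside a complete workflow definition. Only the task itself comes from the example above; the workflow name, description, and version are illustrative assumptions.

{
  "name": "document_search_workflow",
  "description": "Searches an indexed vector database for the closest matches to a query",
  "version": 1,
  "schemaVersion": 2,
  "tasks": [
    {
      "name": "llm_search_index_task",
      "taskReferenceName": "llm_search_index_task_ref",
      "inputParameters": {
        "vectorDB": "pineconedb",
        "namespace": "myNewModel",
        "index": "test",
        "llmProvider": "azure_openai",
        "embeddingModel": "text-davinci-003",
        "query": "What is an LLM model?"
      },
      "type": "LLM_SEARCH_INDEX"
    }
  ]
}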

Input Parameters

| Parameter | Description |
| --------- | ----------- |
| vectorDB | Choose the required vector database. Note: If you haven't configured the vector database on your Orkes console, navigate to the Integrations tab and configure your required provider. Refer to the documentation on integrating vector databases with the Orkes console. |
| namespace | Choose from the available namespaces configured within the chosen vector database. Namespaces are separate, isolated environments within the database used to manage and organize vector data effectively. Note: The namespace field applies only to the Pinecone integration; it is not applicable to the Weaviate integration. |
| index | Choose the index in your vector database where the indexed text or data is stored. Note: For the Weaviate integration, this field refers to the class name, while for the Pinecone integration, it denotes the index name itself. |
| llmProvider | Choose the required LLM provider you have configured. Note: If you haven't configured your AI/LLM provider on your Orkes console, navigate to the Integrations tab and configure your required provider. Refer to the documentation on integrating LLM providers with the Orkes console. |
| embeddingModel | Choose from the available embedding models configured for the chosen LLM provider. For example, if your LLM provider is Azure OpenAI and text-davinci-003 is configured as the model, you can choose it under this field. |
| query | Provide your search query. A query is typically a question, statement, or request made in natural language that is used to search, retrieve, or manipulate data stored in a database. |
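
Parameter values do not have to be hard-coded. As a sketch, the query (or any other input) can be wired to a workflow input using Conductor's ${...} expression syntax; the snippet below shows the inputParameters block of the task definition with the query taken from a workflow input named query (an assumed input for illustration).

{
  "inputParameters": {
    "vectorDB": "pineconedb",
    "namespace": "myNewModel",
    "index": "test",
    "llmProvider": "azure_openai",
    "embeddingModel": "text-davinci-003",
    "query": "${workflow.input.query}"
  }
}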

Output Parameters

| Attribute | Description |
| --------- | ----------- |
| score | A value quantifying the degree of similarity between a stored item and the query vector, used to rank and order results. Higher scores denote a stronger resemblance or relevance of a data point to the query vector. |
| docId | Displays the docId of the document from which the matching text was retrieved. |
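
The exact output payload is not spelled out here, but as an illustrative sketch (the result wrapper and the sample values are assumptions, not a documented schema), a search might return a list of matches, each carrying the attributes above:

{
  "result": [
    { "score": 0.92, "docId": "doc-001" },
    { "score": 0.87, "docId": "doc-042" }
  ]
}

A downstream task could then reference these values with Conductor's output expression syntax, for example ${llm_search_index_task_ref.output.result[0].docId}, assuming the output field is indeed named result.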

Examples



  1. Add a task of type LLM Search Index.
  2. Choose the vector database and LLM provider.
  3. Provide the search query.

[Image: LLM Search Index Task]