# LLM - Create Prompt Flow

`llm_rag_create_promptflow`

## Overview

This component creates a RAG (Retrieval Augmented Generation) prompt flow from your MLIndex data and your best prompts. The flow looks up your indexed data and produces answers grounded in your own data context. It also supports bulk testing with any built-in or custom evaluation flow.
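To pull this component into your own pipeline, you can fetch it from the shared `azureml` registry with the Azure ML Python SDK. The snippet below is a minimal sketch, assuming the `azure-ai-ml` SDK v2 and `azure-identity` are installed; anything beyond the component name and version shown on this page (credentials, how you wire the component into a pipeline) is illustrative.

```python
# Minimal sketch (assumption): fetch this component from the public "azureml"
# registry using the Azure ML Python SDK v2 (azure-ai-ml).
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient

credential = DefaultAzureCredential()

# Client scoped to the shared "azureml" registry that hosts this component.
registry_client = MLClient(credential=credential, registry_name="azureml")

# Retrieve the component at the version documented on this page.
create_promptflow = registry_client.components.get(
    name="llm_rag_create_promptflow", version="0.0.58"
)

print(create_promptflow.display_name, create_promptflow.version)
```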

Version: 0.0.58

## Tags

Preview

View in Studio: https://ml.azure.com/registries/azureml/components/llm_rag_create_promptflow/version/0.0.58

## Inputs

| Name | Description | Type | Default | Optional | Enum |
| ---- | ----------- | ---- | ------- | -------- | ---- |
| best_prompts | JSON file containing prompt options to create variants from. Must either have a single key 'best_prompt' whose value is a list of prompt strings, or map specific metrics to their best prompts. | uri_file | | True | |
| mlindex_asset_id | Asset ID of the MLIndex file that describes the index to use for document lookup in the prompt flow | uri_file | | | |
| mlindex_name | Name of the MLIndex asset | string | | True | |
| llm_connection_name | Full name of the workspace connection used for completion or chat | string | | True | |
| llm_config | JSON describing the LLM provider and model details to use for completion generation | string | | True | |
| embedding_connection | Full name of the workspace connection used for embedding | string | | True | |
| embeddings_model | The model to use to embed data, e.g. 'hugging_face://model/sentence-transformers/all-mpnet-base-v2' or 'azure_open_ai://deployment/{deployment_name}/model/{model_name}' | string | | True | |
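The sketch below illustrates plausible shapes for the JSON-valued inputs in the table above. The `best_prompt` key and the embeddings_model URI format come from the descriptions on this page; the llm_config field names are hypothetical placeholders, since its exact schema is not documented here.

```python
# Illustrative sketch (assumption): building the JSON inputs described above.
import json

# best_prompts: a uri_file whose content uses the single 'best_prompt' key
# holding a list of prompt strings, as described in the table above.
best_prompts = {
    "best_prompt": [
        "You are an assistant that answers questions using only the provided context.",
        "Answer from the context below; say you don't know if it is not covered.",
    ]
}
with open("best_prompts.json", "w") as f:
    json.dump(best_prompts, f)

# llm_config: a JSON string describing the LLM provider and model details.
# Field names here are hypothetical, not taken from this page.
llm_config = json.dumps({
    "type": "azure_open_ai",
    "model_name": "gpt-35-turbo",
    "deployment_name": "gpt-35-turbo",
    "temperature": 0,
})

# embeddings_model uses the URI format quoted in the table above.
embeddings_model = "azure_open_ai://deployment/{deployment_name}/model/{model_name}"
```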

## Environment

`azureml:llm-rag-embeddings@latest`
