From 90a613db917e422ed665a5c68c61ee257ce55910 Mon Sep 17 00:00:00 2001
From: Colleen McGinnis
Date: Tue, 24 Sep 2024 13:28:17 -0500
Subject: [PATCH] add note about private LLMs and free-tier

---
 docs/en/serverless/ai-assistant/ai-assistant.mdx | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/docs/en/serverless/ai-assistant/ai-assistant.mdx b/docs/en/serverless/ai-assistant/ai-assistant.mdx
index b7ab9893f2..152ae98cbf 100644
--- a/docs/en/serverless/ai-assistant/ai-assistant.mdx
+++ b/docs/en/serverless/ai-assistant/ai-assistant.mdx
@@ -41,6 +41,13 @@ The AI assistant requires the following:
 * AWS Bedrock, specifically the Anthropic Claude models.
 * The knowledge base requires a 4 GB ((ml)) node.
 
+
+The free tier offered by third-party generative AI providers may not be sufficient for the proper functioning of the AI assistant.
+In most cases, a paid subscription to one of the supported providers is required.
+The Observability AI assistant doesn't support connecting to a private LLM.
+Elastic doesn't recommend using private LLMs with the Observability AI assistant.
+
+
 ## Your data and the AI Assistant
 
 Elastic does not use customer data for model training. This includes anything you send the model, such as alert or event data, detection rule configurations, queries, and prompts. However, any data you provide to the AI Assistant will be processed by the third-party provider you chose when setting up the OpenAI connector as part of the assistant setup.
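
For context on the connector setup that the last paragraph of the hunk refers to: the AI Assistant talks to the third-party provider through a generative AI connector configured in Kibana. The sketch below shows one way to create an OpenAI connector through Kibana's Connectors API. It is an illustration, not part of this patch; the deployment URL, the environment variable names, and the exact config and secret fields are assumptions to verify against the Kibana connector documentation for your version.

```python
# Illustrative sketch (not from this patch): create an OpenAI connector for the
# AI Assistant via Kibana's Connectors API. URLs, env var names, and field names
# below are assumptions -- confirm them against the Kibana docs for your version.
import os
import requests

KIBANA_URL = os.environ.get("KIBANA_URL", "https://my-deployment.kb.example.com")  # hypothetical endpoint
API_KEY = os.environ["KIBANA_API_KEY"]  # hypothetical env var holding a Kibana API key

response = requests.post(
    f"{KIBANA_URL}/api/actions/connector",
    headers={
        "kbn-xsrf": "true",                      # required by Kibana HTTP APIs
        "Authorization": f"ApiKey {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "name": "OpenAI connector for AI Assistant",
        "connector_type_id": ".gen-ai",          # OpenAI connector type
        "config": {
            "apiProvider": "OpenAI",
            "apiUrl": "https://api.openai.com/v1/chat/completions",
        },
        # Per the note added in the patch, a paid-tier key is usually needed.
        "secrets": {"apiKey": os.environ["OPENAI_API_KEY"]},
    },
    timeout=30,
)
response.raise_for_status()
print(response.json().get("id"))  # connector ID the AI Assistant will use
```

Whatever provider is configured here (OpenAI, Azure OpenAI, or AWS Bedrock) is the third party that processes the data described in the "Your data and the AI Assistant" section.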