diff --git a/docs/en/observability/observability-ai-assistant.asciidoc b/docs/en/observability/observability-ai-assistant.asciidoc
index daf0eac7d7..b8ab278565 100644
--- a/docs/en/observability/observability-ai-assistant.asciidoc
+++ b/docs/en/observability/observability-ai-assistant.asciidoc
@@ -1,8 +1,6 @@
 [[obs-ai-assistant]]
 = Observability AI Assistant
 
-preview::[]
-
 The AI Assistant uses generative AI, powered by a {kibana-ref}/openai-action-type.html[connector] for OpenAI or Azure OpenAI Service, to provide:
 
 * *Contextual insights* — open prompts throughout {observability} that explain errors and messages and suggest remediation.
@@ -13,8 +11,14 @@ image::images/obs-assistant2.gif[Observability AI assistant preview]
 
 [IMPORTANT]
 ====
-The Observability AI Assistant is in technical preview, and its capabilities are still developing. Users should leverage it sensibly as the reliability of its responses might vary. Always cross-verify any returned advice for accurate threat detection and response, insights, and query generation.
+The AI Assistant is powered by an integration with your large language model (LLM) provider.
+LLMs are known to sometimes present incorrect information as if it's correct.
+Elastic supports configuration and connection to the LLM provider and your knowledge base,
+but is not responsible for the LLM's responses.
+====
+
+[IMPORTANT]
+====
 Also, the data you provide to the Observability AI assistant is _not_ anonymized, and is stored and processed by the third-party AI provider. This includes any data used in conversations for analysis or context, such as alert or event data, detection rule configurations, and queries. Therefore, be careful about sharing any confidential or sensitive details while using this feature.
 ====
@@ -59,13 +63,52 @@ To set up the AI Assistant:
 [[obs-ai-add-data]]
 == Add data to the AI Assistant knowledge base
 
+[IMPORTANT]
+====
+*If you started using the AI Assistant in technical preview*,
+any knowledge base articles you created before 8.12 will have to be reindexed or upgraded before they can be used.
+Knowledge base articles created before 8.12 use ELSER v1.
+In 8.12, knowledge base articles must use ELSER v2.
+You can either:
+
+* Clear all old knowledge base articles manually and reindex them.
+* Upgrade all knowledge base articles indexed with ELSER v1 to ELSER v2 using a https://github.com/elastic/elasticsearch-labs/blob/main/notebooks/model-upgrades/upgrading-index-to-use-elser.ipynb[Python script].
+====
+
 The AI Assistant uses {ml-docs}/ml-nlp-elser.html[ELSER], Elastic's semantic search engine, to recall data from its internal knowledge base index to create retrieval augmented generation (RAG) responses. Adding data such as Runbooks, GitHub issues, internal documentation, and Slack messages to the knowledge base gives the AI Assistant context to provide more specific assistance.
 
 NOTE: Your AI provider may collect telemetry when using the AI Assistant. Contact your AI provider for information on how data is collected.
 
 You can add information to the knowledge base by asking the AI Assistant to remember something while chatting (for example, "remember this for next time"). The assistant will create a summary of the information and add it to the knowledge base.
 
-You can also add external data to the knowledge base by completing the following steps:
+You can also add external data to the knowledge base, either in {kib} using the Stack Management UI or using the {es} Index API.
+
+[discrete]
+[[obs-ai-stack-management]]
+=== Use the UI
+
+To add external data to the knowledge base in {kib}:
+
+. Go to *Stack Management*.
+. In the _Kibana_ section, click *AI Assistants*.
+. Select *Elastic AI Assistant for Observability*.
+. Switch to the *Knowledge base* tab.
+. Click the *New entry* button, and choose either:
++
+** *Single entry*: Write content for a single entry in the UI.
+** *Bulk import*: Upload a newline-delimited JSON (`ndjson`) file containing a list of entries to add to the knowledge base. Each object should conform to the following format:
++
+[source,json]
+----
+{
+  "id": "a_unique_human_readable_id",
+  "text": "Contents of item"
+}
+----
+
+[discrete]
+[[obs-ai-index-api]]
+=== Use the {es} Index API
 
 . Ingest external data (GitHub issues, Markdown files, Jira tickets, text files, etc.) into {es} using the {es} {ref}/docs-index_.html[Index API].
 . Reindex your data into the AI Assistant's knowledge base index by completing the following query in *Management* -> *Dev Tools* in {kib}. Update the following fields before reindexing:
@@ -106,7 +149,11 @@ POST _reindex
 [[obs-ai-interact]]
 == Interact with the AI Assistant
 
-You can chat with the AI Assistant or interact with contextual prompts located throughout {observability}. See the following sections for more on interacting with the AI Assistant.
+You can chat with the AI Assistant or interact with contextual insights located throughout {observability}.
+See the following sections for more on interacting with the AI Assistant.
+
+TIP: After every answer the LLM provides, let us know if the answer was helpful.
+Your feedback helps us improve the AI Assistant!
 
 [discrete]
 [[obs-ai-chat]]
@@ -122,6 +169,12 @@ This opens the AI Assistant flyout, where you can ask the assistant questions ab
 [role="screenshot"]
 image::images/obs-ai-chat.png[Observability AI assistant chat, 60%]
 
+[discrete]
+[[obs-ai-functions]]
+=== AI Assistant functions
+
+beta::[]
+
 The AI Assistant uses functions to include relevant context in the chat conversation through text, data, and visual components. Both you and the AI Assistant can suggest functions. You can also edit the AI Assistant's function suggestions and inspect function responses.
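As a side note on the *Bulk import* option in the knowledge base section above: the upload must be newline-delimited JSON, one `{"id": ..., "text": ...}` object per line with no trailing commas. The following Python sketch is illustrative only (the helper name and sample entries are hypothetical, not part of the product) and shows one way to assemble such a file:

```python
import json


def build_kb_ndjson(entries):
    """Serialize knowledge base entries into the newline-delimited JSON
    (ndjson) shape expected by the Bulk import option: one object per
    line, each with non-empty "id" and "text" fields."""
    lines = []
    for entry in entries:
        if not entry.get("id") or not entry.get("text"):
            raise ValueError("each entry needs non-empty 'id' and 'text' fields")
        # json.dumps guarantees valid JSON per line (no trailing commas).
        lines.append(json.dumps({"id": entry["id"], "text": entry["text"]}))
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    # Hypothetical runbook entries used purely for illustration.
    ndjson = build_kb_ndjson([
        {"id": "runbook_disk_full", "text": "Rotate logs, then expand the volume."},
        {"id": "oncall_escalation", "text": "Escalate paging issues to the SRE channel."},
    ])
    print(ndjson)
```

Writing the returned string to a `.ndjson` file produces an upload in the documented format.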
 The following table lists available functions:
@@ -163,3 +216,15 @@ Clicking a prompt generates a message specific to that log entry:
 image::images/obs-ai-logs.gif[Observability AI assistant example, 75%]
 
 You can continue a conversation from a contextual prompt by clicking *Start chat* to open the AI Assistant chat.
+
+[discrete]
+[[obs-ai-known-issues]]
+== Known issues
+
+[discrete]
+[[obs-ai-token-limits]]
+=== Token limits
+
+Most LLMs have a set number of tokens they can manage in a single conversation.
+When you reach the token limit, the LLM throws an error, and Elastic displays a "Token limit reached" error in {kib}.
+The exact number of tokens that the LLM can support depends on the LLM provider and model you're using.
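Because the token limit described above depends on the provider and model, it can help to budget a conversation's size before sending it. The Python sketch below is only an illustration: it uses the rough rule of thumb of about 4 characters per token for English text (real counts come from the provider's tokenizer), and the function names and the 500-token reply budget are assumptions, not part of the product.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the common ~4 characters per token
    heuristic for English text. Treat this only as a budgeting aid;
    actual counts depend on the provider's tokenizer."""
    return max(1, len(text) // 4)


def fits_in_context(messages, token_limit: int, reply_budget: int = 500) -> bool:
    """Check whether a conversation (a list of message strings) is likely
    to fit in the model's context window, leaving room for the reply."""
    used = sum(estimate_tokens(m) for m in messages)
    return used + reply_budget <= token_limit


if __name__ == "__main__":
    conversation = ["Summarize the error rate spike on service A over the last hour."]
    print(fits_in_context(conversation, token_limit=4096))
```

If the check fails, trimming older messages from the conversation before retrying is one way to stay under the provider's limit.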