AI Assistant docs updates for 8.16 #4326

Merged Oct 16, 2024 · 35 commits (changes shown from 16 commits)

Commits
a34c0cf
initial structure change to add settings and search connectors
eedugon Sep 26, 2024
92a0f9b
enterprise search requirement added
eedugon Oct 1, 2024
3cd0a05
search connectors explained
eedugon Oct 1, 2024
5ea91b9
fixed image link
eedugon Oct 1, 2024
a766679
extra comma removed from json example
eedugon Oct 1, 2024
73905f4
ai settings moved to the end
eedugon Oct 1, 2024
8233af0
tbd content added
eedugon Oct 1, 2024
5c6ed67
Merge branch 'elastic:main' into ai_assistant_updates_816
eedugon Oct 1, 2024
a740003
screenshots deleted
eedugon Oct 2, 2024
0732d65
some you-cans removed
eedugon Oct 2, 2024
aaf16c2
connectors distinction included
eedugon Oct 2, 2024
04fdc59
minor updates
eedugon Oct 2, 2024
89f6518
AI Assistant icon added
eedugon Oct 3, 2024
82d810f
AI Assistant icon added
eedugon Oct 3, 2024
3f7fd0b
AI Assistant icon added
eedugon Oct 3, 2024
0ffac6f
search connectors setup added
eedugon Oct 4, 2024
35d471d
passive voice and to_do_this update
eedugon Oct 7, 2024
038a302
reindex method changes cancelled
eedugon Oct 7, 2024
24d49a6
Merge remote-tracking branch 'origin/main' into ai_assistant_updates_816
eedugon Oct 14, 2024
1e60fd5
override search connector indices list
eedugon Oct 14, 2024
29515e2
reindex method removed
eedugon Oct 14, 2024
af79e4d
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
3905865
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
0ca6296
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
b88954d
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
6935efa
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
8319872
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
8840c5c
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
26282b1
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
fb415da
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
ae67e44
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 15, 2024
873428b
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 16, 2024
110fcd1
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 16, 2024
30eb545
missing link added
eedugon Oct 16, 2024
b8d5f96
Update docs/en/observability/observability-ai-assistant.asciidoc
eedugon Oct 16, 2024
1 change: 1 addition & 0 deletions docs/en/observability/images/icons/ai-assistant-bw.svg
1 change: 1 addition & 0 deletions docs/en/observability/images/icons/ai-assistant.svg
110 changes: 91 additions & 19 deletions docs/en/observability/observability-ai-assistant.asciidoc
@@ -39,6 +39,7 @@ The AI assistant requires the following:
** OpenAI `gpt-4`+.
** Azure OpenAI Service `gpt-4`(0613) or `gpt-4-32k`(0613) with API version `2023-07-01-preview` or more recent.
** AWS Bedrock, specifically the Anthropic Claude models.
* An {enterprise-search-ref}/server.html[Enterprise Search] server if {enterprise-search-ref}/connectors.html[search connectors] are used to populate the knowledge base with external data.
* The knowledge base requires a 4 GB {ml} node.

[IMPORTANT]
@@ -93,7 +94,7 @@ To set up the AI Assistant:
any knowledge base articles you created before 8.12 will have to be reindexed or upgraded before they can be used.
Knowledge base articles created before 8.12 use ELSER v1.
In 8.12, knowledge base articles must use ELSER v2.
You can either:
Options include:

* Clear all old knowledge base articles manually and reindex them.
* Upgrade all knowledge base articles indexed with ELSER v1 to ELSER v2 using a https://github.com/elastic/elasticsearch-labs/blob/main/notebooks/model-upgrades/upgrading-index-to-use-elser.ipynb[Python script].
@@ -103,13 +104,17 @@ The AI Assistant uses {ml-docs}/ml-nlp-elser.html[ELSER], Elastic's semantic sea

NOTE: Your AI provider may collect telemetry when using the AI Assistant. Contact your AI provider for information on how data is collected.

You can add information to the knowledge base by asking the AI Assistant to remember something while chatting (for example, "remember this for next time"). The assistant will create a summary of the information and add it to the knowledge base.
Data can be added to the knowledge base through the following methods:

You can also add external data to the knowledge base either in {kib} using the Stack Management UI or using the {es} Index API.
* <<obs-ai-kb-ui>>, available on the <<obs-ai-settings>> page.
* <<obs-ai-search-connectors>>
* <<obs-ai-reindex-api>> to write to the knowledge base index directly (advanced method).

It's also possible to add information to the knowledge base by asking the AI Assistant to remember something while chatting (for example, "remember this for next time"). The assistant will create a summary of the information and add it to the knowledge base.

[discrete]
[[obs-ai-stack-management]]
=== Use the UI
[[obs-ai-kb-ui]]
=== Use the knowledge base UI

To add external data to the knowledge base in {kib}:

@@ -126,16 +131,71 @@ To add external data to the knowledge base in {kib}:
----
{
"id": "a_unique_human_readable_id",
"text": "Contents of item",
"text": "Contents of item"
}
----

[discrete]
[[obs-ai-index-api]]
=== Use the {es} Index API
[[obs-ai-search-connectors]]
=== Use search connectors

[TIP]
====
{enterprise-search-ref}/connectors.html[Search connectors] differ from the {kibana-ref}/action-types.html[Stack management -> Connectors] configured during the <<obs-ai-set-up, AI Assistant setup>>.
Search connectors are only needed when importing external data into the knowledge base of the AI Assistant, while the stack connector to the LLM is required for the AI Assistant to work.
====

{enterprise-search-ref}/connectors.html[Connectors] allow you to index content from external sources, thereby making it available to the AI Assistant. This can greatly improve the relevance of the AI Assistant’s responses. Data can be integrated from sources such as GitHub, Confluence, Google Drive, Jira, AWS S3, Microsoft Teams, Slack, and more.

These connectors are managed under *Search* -> *Content* -> *Connectors* in {kib}. They live outside the {observability} solution and require an {enterprise-search-ref}/server.html[Enterprise Search] server connected to the Elastic Stack.

By default, the AI Assistant queries all search connector indices. To customize which indices are queried, adjust the *Search connector index pattern* setting on the <<obs-ai-settings>> page. This gives you precise control over which data sources are included in the AI Assistant's knowledge base.
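For example, assuming your connector indices follow the common `search-<connector name>` naming convention, a hypothetical pattern such as `search-github*` would restrict the assistant's knowledge base lookups to the GitHub connector's content.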

To create a connector and make its content available to the AI Assistant's knowledge base, do the following:

. In the {kib} UI, go to *Search* -> *Content* -> *Connectors* and follow the instructions to create a new connector.
+
[NOTE]
====
If your {kib} Space doesn't include the `Search` solution, you will have to create the connector from a different space or change your space's *Solution view* setting to `Classic`.
====
+
For example, if you create a {enterprise-search-ref}/connectors-github.html[GitHub native connector], you have to set a `name`, attach it to a new or existing `index`, add your `personal access token`, and include the `list of repositories` to synchronize.
+
Learn more about configuring connectors on (TBD - links)
+
. Create a pipeline and process the data with ELSER
+
To create the embeddings needed by the AI Assistant (weights and tokens in a sparse vector field), you have to create an *ML Inference Pipeline*:
+
.. Open the previously created connector and select the *Pipelines* tab.
.. Select the *Copy and customize* button in the `Unlock your custom pipelines` box.
.. Select the *Add Inference Pipeline* button in the `Machine Learning Inference Pipelines` box.
.. Select the ELSER ML model to add the necessary embeddings to the data.
.. Select the fields that need to be evaluated as part of the inference pipeline.
.. Test and save the inference pipeline and the overall pipeline.
. Sync the data
+
Once the pipeline is set up, perform a *Full Content Sync* in the connector. The inference pipeline will process the data as follows:
+
* As data comes in, ELSER is applied to the data, and embeddings (weights and tokens in a sparse vector field) are added to capture the semantic meaning and context of the data.
* When you look at the ingested documents, you can see how the weights and tokens are added to the `predicted_value` field, as shown in the query sketch after these steps.
. Check if the AI Assistant can use the index (optional)
+
Ask the AI Assistant something related to the indexed data.
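
For a quick verification that the embeddings were generated, you can also retrieve a document from the connector index in *Management* -> *Dev Tools* and inspect it for the `predicted_value` output. This is a minimal sketch; `search-github` is a hypothetical index name, so replace it with the index attached to your connector:

[source,console]
----
GET search-github/_search
{
  "size": 1
}
----

If the inference pipeline ran during the sync, the returned document contains the ELSER output (the weights and tokens described above) alongside the original fields.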

[discrete]
[[obs-ai-reindex-api]]
=== Use the {es} Reindex API

TBD: quick introduction text about this method - Use the **{es} Reindex API** from existing indices towards the knowledge base index (advanced method).

// EEDUGON note: is this section still worthy when we have the option of search connectors?
// in the dest index below we are pointing directly to an ILM managed index that may not be valid. We should point to an existing ILM rollover alias or a data stream (TBD)
// IMO this is too advanced to appear here. Maybe a different type of guide.

. Ingest external data (GitHub issues, Markdown files, Jira tickets, text files, etc.) into {es} using the {es} {ref}/docs-index_.html[Index API].
. Reindex your data into the AI Assistant's knowledge base index by completing the following query in *Management* -> *Dev Tools* in {kib}. Update the following fields before reindexing:
. Reindex the previously indexed data into the AI Assistant's knowledge base index by completing the following query in *Management* -> *Dev Tools* in {kib}. Update the following fields before reindexing:
** `InternalDocsIndex` — name of the index where your internal documents are stored.
** `text_field` — name of the field containing your internal documents' text.
** `timestamp` — name of the timestamp field in your internal documents.
@@ -173,7 +233,7 @@ POST _reindex
[[obs-ai-interact]]
== Interact with the AI Assistant

You can chat with the AI Assistant or interact with contextual insights located throughout {observability}.
Chat with the AI Assistant or interact with contextual insights located throughout {observability}.
See the following sections for more on interacting with the AI Assistant.

TIP: After every answer the LLM provides, let us know if the answer was helpful.
@@ -183,10 +243,7 @@ Your feedback helps us improve the AI Assistant!
[[obs-ai-chat]]
=== Chat with the assistant

Click *AI Assistant* in the upper-right corner of any {observability} application to start the chat:

[role="screenshot"]
image::images/ai-assistant-button.png[Observability AI assistant preview]
Select the *AI Assistant* icon (image:images/icons/ai-assistant.svg[AI Assistant icon]) in the upper-right corner of any {observability} application to start the chat:

This opens the AI Assistant flyout, where you can ask the assistant questions about your instance:

@@ -208,7 +265,7 @@ beta::[]

The AI Assistant uses functions to include relevant context in the chat conversation through text, data, and visual components. Both you and the AI Assistant can suggest functions. You can also edit the AI Assistant's function suggestions and inspect function responses.

You can suggest the following functions:
Main functions:

[horizontal]
`alerts`:: Get alerts for {observability}.
@@ -250,13 +307,13 @@ Clicking a prompt generates a message specific to that log entry:
[role="screenshot"]
image::images/obs-ai-logs.gif[Observability AI assistant example, 75%]

You can continue a conversation from a contextual prompt by clicking *Start chat* to open the AI Assistant chat.
Continue a conversation from a contextual prompt by clicking *Start chat* to open the AI Assistant chat.

[discrete]
[[obs-ai-connector]]
=== Add the AI Assistant connector to alerting workflows

You can use the {kibana-ref}/obs-ai-assistant-action-type.html[Observability AI Assistant connector] to add AI-generated insights and custom actions to your alerting workflows.
Use the {kibana-ref}/obs-ai-assistant-action-type.html[Observability AI Assistant connector] to add AI-generated insights and custom actions to your alerting workflows as follows:
To do this:

. <<create-alerts-rules,Create (or edit) an alerting rule>> and specify the conditions that must be met for the alert to fire.
@@ -274,7 +331,7 @@ and also include other active alerts that may be related.
As a last step, you can ask the assistant to trigger an action,
such as sending the report (or any other message) to a Slack webhook.

NOTE: Currently you can only send messages to Slack, email, Jira, PagerDuty, or a webhook.
NOTE: Currently only Slack, email, Jira, PagerDuty, or webhook actions are supported.
Additional actions will be added in the future.

When the alert fires, contextual details about the event—such as when the alert fired,
@@ -307,6 +364,21 @@ The Observability AI Assistant connector is called when the alert fires and when

To learn more about alerting, actions, and connectors, refer to <<create-alerts>>.

[discrete]
[[obs-ai-settings]]
== AI Assistant Settings

Access the AI Assistant Settings page in one of the following ways:

* From *{stack-manage-app}* -> *Kibana* -> *AI Assistants* -> *Elastic AI Assistant for Observability*.
* From the *More actions* menu inside the AI Assistant window.

The AI Assistant Settings page contains the following tabs:

* *Settings*: configure the main AI Assistant settings, which are explained directly within the interface.
* *Knowledge base*: <<obs-ai-kb-ui,manage knowledge base entries>>.
* *Search Connectors*: provides a link to the {kib} *Search* -> *Content* -> *Connectors* UI, where you can configure search connectors.

[discrete]
[[obs-ai-known-issues]]
== Known issues
@@ -318,5 +390,5 @@ To learn more about alerting, actions, and connectors, refer to <<create-alerts>
Most LLMs have a set number of tokens they can manage in a single conversation.
When you reach the token limit, the LLM will throw an error, and Elastic will display a "Token limit reached" error in Kibana.
The exact number of tokens that the LLM can support depends on the LLM provider and model you're using.
If you are using an OpenAI connector, you can monitor token usage in **OpenAI Token Usage** dashboard.
If you are using an OpenAI connector, monitor token utilization in the **OpenAI Token Usage** dashboard.
For more information, refer to the {kibana-ref}/openai-action-type.html#openai-connector-token-dashboard[OpenAI Connector documentation].