Commit
Update docs/en/serverless/ai-assistant/ai-assistant.mdx
Co-authored-by: Mike Birnstiehl <[email protected]>
dedemorton and mdbirnstiehl authored Jun 11, 2024
1 parent 9054e87 commit edaff27
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/en/serverless/ai-assistant/ai-assistant.mdx
@@ -301,6 +301,6 @@ Most LLMs have a set number of tokens they can manage in a single conversation.
When you reach the token limit, the LLM will throw an error, and Elastic will display a "Token limit reached" error.
The exact number of tokens that the LLM can support depends on the LLM provider and model you're using.
If you are using an OpenAI connector, you can monitor token usage in the **OpenAI Token Usage** dashboard.
For more information, refer to the [OpenAI Connector documentation](((kibana-ref))/openai-action-type.html#openai-connector-token-dashboard)
For more information, refer to the [OpenAI Connector documentation](((kibana-ref))/openai-action-type.html#openai-connector-token-dashboard).

<span id="hello-world"></span>
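The doc text in the hunk above notes that exceeding an LLM's token limit produces an error. One common client-side mitigation is trimming older conversation turns before each request. A minimal sketch of that idea (the 4-characters-per-token heuristic and the `MAX_TOKENS` figure are illustrative assumptions, not values from the Elastic docs; real tokenizers such as `tiktoken` give exact per-model counts):

```python
# Rough client-side guard against LLM token limits.
# Assumes ~4 characters per token (a coarse heuristic, not exact).

MAX_TOKENS = 4096          # illustrative model limit, not from the docs
CHARS_PER_TOKEN = 4        # coarse heuristic

def estimate_tokens(text: str) -> int:
    """Very rough token estimate based on character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def trim_history(turns: list[str], budget: int = MAX_TOKENS) -> list[str]:
    """Keep the most recent turns whose combined estimate fits the budget."""
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):        # walk newest-first
        cost = estimate_tokens(turn)
        if used + cost > budget:
            break                       # adding this turn would exceed the budget
        kept.append(turn)
        used += cost
    return list(reversed(kept))         # restore chronological order

history = ["a" * 8000, "b" * 8000, "c" * 4000]
trimmed = trim_history(history, budget=2500)
# Only the most recent turns that fit the 2500-token budget survive.
```

This keeps requests under the provider's limit at the cost of dropping older context; exact budgeting requires the provider's real tokenizer.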
