
Create log data set monitoring docs #3960

Merged · 11 commits · Jul 3, 2024
Binary file added docs/en/serverless/images/green-dot-icon.png
Binary file added docs/en/serverless/images/red-dot-icon.png
Binary file added docs/en/serverless/images/yellow-dot-icon.png
7 changes: 7 additions & 0 deletions docs/en/serverless/logging/log-monitoring.mdx
@@ -79,6 +79,13 @@ The following resources provide information on viewing and monitoring your logs:
- <DocLink slug="/serverless/observability/discover-and-explore-logs">Discover and explore</DocLink>: Discover and explore all of the log events flowing in from your servers, virtual machines, and containers in a centralized view.
- <DocLink slug="/serverless/observability/aiops-detect-anomalies">Detect log anomalies</DocLink>: Use ((ml)) to detect log anomalies automatically.

## Monitor datasets

The **Data Set Quality** page provides an overview of your data sets and their quality.
Use this information to get an idea of your overall data set quality, and to find data sets that contain incorrectly parsed documents.

<DocLink id="serverlessObservabilityMonitorDatasets">Monitor datasets</DocLink>

## Application logs

Application logs provide valuable insight into events that have occurred within your services and applications.
51 changes: 51 additions & 0 deletions docs/en/serverless/logging/monitor-datasets.mdx
@@ -0,0 +1,51 @@
---
id: serverlessObservabilityMonitorDatasets
slug: /serverless/observability/monitor-datasets
title: Monitor log data set quality
description: Monitor log data sets to find degraded documents.
tags: [ 'serverless', 'observability', 'how-to' ]
---

<p><DocBadge template="technical preview" /></p>
<p><DocBadge template="beta" /></p>

The **Data Set Quality** page provides an overview of your log data sets.
Use this information to get an idea of your overall log data set quality and find data sets that contain incorrectly parsed documents.
To access the **Data Set Quality** page, go to **Stack Management** → **Data Set Quality** from the main ((kib)) menu.

{/* Update screenshot when changes are done <DocImage size="2" url="../images/logs-dataset-overview.png" alt="Screen capture of the data set overview" /> */}

<DocCallOut title="Requirements">
Users with the `viewer` role can see the Data Set Quality summary. To see the Active Datasets and Estimated Data summaries, users need the `monitor` [index privilege](((ref))/security-privileges.html#privileges-list-indices) for the `logs-*-*` index.
</DocCallOut>
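For reference, a role that grants only that privilege might look like the following sketch. The role name is hypothetical, and depending on your project type you may manage roles through the UI rather than the API:

```console
PUT _security/role/logs_quality_viewer
{
  "indices": [
    {
      "names": [ "logs-*-*" ],
      "privileges": [ "monitor" ]
    }
  ]
}
```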

The quality of your data sets is based on the percentage of degraded documents in each data set.
A degraded document in a data set contains the [`_ignored`](((ref))/mapping-ignored-field.html) property because one or more of its fields were ignored during indexing.
Fields are ignored for a variety of reasons.
For example, when the [`ignore_malformed`](((ref))/ignore-malformed.html) parameter is set to `true`, if a document field contains the wrong data type, the malformed field is ignored and the rest of the document is indexed.
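As an illustration — a minimal sketch using a hypothetical index and field name, not taken from this page — the following requests create a mapping with `ignore_malformed` enabled and then index a document whose `status_code` value is not a number. The document is indexed, but `status_code` is ignored and recorded in the `_ignored` metadata field:

```console
PUT my-logs-example
{
  "mappings": {
    "properties": {
      "status_code": {
        "type": "integer",
        "ignore_malformed": true
      }
    }
  }
}

POST my-logs-example/_doc
{
  "message": "GET /search 200",
  "status_code": "unavailable"
}
```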

The data set table shows information for each data set, such as its namespace, size, when it was last active, and the percentage of degraded documents.
The percentage of degraded documents determines the data set's quality according to the following scale:

* Good (<DocImage flatImage alt="Good icon" url="../images/green-dot-icon.png" />): 0% of the documents in the data set are degraded.
* Degraded (<DocImage flatImage alt="Degraded icon" url="../images/yellow-dot-icon.png" />): Greater than 0% and up to 3% of the documents in the data set are degraded.
* Poor (<DocImage flatImage alt="Poor icon" url="../images/red-dot-icon.png" />): Greater than 3% of the documents in the data set are degraded.
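If you want to spot-check this percentage outside of the UI, degraded documents are the documents that carry the `_ignored` metadata field, so you can count them with an `exists` query. A minimal sketch, run from **Dev Tools** against a hypothetical `logs-myapp-default` data stream:

```console
GET logs-myapp-default/_count
{
  "query": {
    "exists": {
      "field": "_ignored"
    }
  }
}
```

Dividing this count by the data set's total document count gives the same percentage the quality scale is based on.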

Opening the details of a specific data set shows the degraded documents history, a summary for the data set, and other details that can help you determine if you need to investigate any issues.

### Investigate issues in individual data sets
To investigate issues in data sets with poor or degraded quality:

1. Find data sets with degraded documents using the **Degraded Docs** column of the data sets table.
1. Click the percentage in the **Degraded Docs** column to open the data set in Logs Explorer.

The **Documents** table in Logs Explorer is automatically filtered to show documents that were not parsed correctly.
Under the **actions** column, you'll see the degraded document icon (<DocIcon type="indexClose" title="degraded document icon" />).

Now that you know which documents contain ignored fields, examine them more closely to find the origin of the issue:

1. Under the **actions** column, click (<DocIcon type="expand" title="expand icon" />) to open the log details.
1. Select the **JSON** tab.
1. Scroll towards the end of the JSON to find the `ignored_field_values`.

Here, you'll find all of the `_ignored` fields in the document and their values, which should provide some clues as to why the fields were ignored.
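As an illustration, the relevant part of the JSON for a degraded document might look like the following abbreviated sketch (the field name and value are hypothetical):

```json
{
  "_index": ".ds-logs-myapp-default-2024.07.01-000001",
  "_ignored": [
    "status_code"
  ],
  "ignored_field_values": {
    "status_code": [
      "unavailable"
    ]
  }
}
```

The `_ignored` array lists which fields were ignored, and `ignored_field_values` shows the original values that could not be indexed.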
8 changes: 7 additions & 1 deletion docs/en/serverless/logging/view-and-monitor-logs.mdx
@@ -77,4 +77,10 @@ The following actions help you filter and focus on specific fields in the log de
* **Filter for value (<DocIcon type="plusInCircle" title="filter for value icon" />):** Show logs that contain the specific field value.
* **Filter out value (<DocIcon type="minusInCircle" title="filter out value icon" />):** Show logs that do _not_ contain the specific field value.
* **Filter for field present (<DocIcon type="filter" title="filter for present icon" />):** Show logs that contain the specific field.
* **Toggle column in table (<DocIcon type="listAdd" title="toggle column in table icon" />):** Add or remove a column for the field to the main Logs Explorer table.

## View log data set details

From the main ((kib)) menu, go to **Stack Management** → **Data Set Quality** to view more details about your data sets and monitor their overall quality.

Refer to <DocLink id="serverlessObservabilityMonitorDatasets">Monitor datasets</DocLink> for more on monitoring your data sets.
5 changes: 5 additions & 0 deletions docs/en/serverless/serverless-observability.docnav.json
@@ -65,6 +65,11 @@
"classic-sources": ["enObservabilityMonitorLogs"],
"classic-skip": true
},
{
"slug": "/serverless/observability/monitor-datasets",
"classic-sources": ["enObservabilityMonitorDatasets"],
"classic-skip": true
},
{
"slug": "/serverless/observability/run-log-pattern-analysis",
"classic-sources": ["enKibanaRunPatternAnalysisDiscover"]