(NR-320588) Adding some documentation for a new query limit #19118

Merged
merged 7 commits on Oct 30, 2024
38 changes: 36 additions & 2 deletions src/content/docs/alerts/admin/rules-limits-alerts.mdx
@@ -107,6 +107,20 @@
</td>
</tr>

<tr>
<td>
Alerts query scan operations per minute, per account ([learn more](#query-scan-limit))
</td>

<td>
N/A
</td>

<td>
2,500,000,000
</td>
</tr>

<tr>
<td>
[Condition name](/docs/alerts/new-relic-alerts-beta/configuring-alert-policies/define-alert-conditions)
@@ -330,8 +344,8 @@
To understand which conditions are driving the most throughput, you can run a query like this:

```sql
FROM NrAiSignal
SELECT sum(aggregatedDataPointsCount) AS 'alert matched data points'
FACET conditionId
```

@@ -344,3 +358,23 @@
To request a limit increase, talk to your New Relic account representative.

Note that using [sliding windows](/docs/query-your-data/nrql-new-relic-query-language/nrql-query-tutorials/create-smoother-charts-sliding-windows) can significantly increase the number of data points. Consider using a longer sliding window aggregation duration to reduce the number of data points produced.

## Alerts query scan operations per minute [#query-scan-limit]

The `Alerts query scan operations per minute` limit applies to the total rate of query scan operations on ingested events.
A query scan operation is the work the New Relic pipeline performs to match ingested events to the alert queries registered in a New Relic [account](/docs/accounts/accounts-billing/account-structure/new-relic-account-structure).
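
As a rough, hypothetical illustration of how the numbers add up (the exact accounting may differ): if an account ingests about 500,000 events per minute into a single data type and 10 alert conditions each query that data type, the pipeline performs on the order of 500,000 × 10 = 5,000,000 query scan operations per minute for that data type alone, well under the 2,500,000,000 limit.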

If this limit is exceeded, you won't be able to create or update conditions for the impacted account until the rate goes below the limit. Existing alert conditions are **not** affected.

You can see your query scan operations and any limit incidents in the [limits UI](/docs/data-apis/manage-data/view-system-limits).

When the pipeline matches events to alert queries, it must examine every event in the [data type](/docs/nrql/get-started/introduction-nrql-new-relics-query-language/#what-you-can-query) that the query references. Here are a few common ways to reduce the number of events in a given data type, which in turn decreases alert query scan operations (example queries follow this list):

* When alerting on Logs data, use [log partitions](/docs/tutorial-manage-large-log-volume/organize-large-logs/) to limit which logs are being scanned for alert queries.
* When alerting on custom events, break up larger custom event types.
* Use custom events instead of alerting on Transaction events.
* [Create metrics](/docs/data-apis/convert-to-metrics/create-metrics-other-data-types/) when possible.
* Use [metric timeslice queries](/docs/data-apis/understand-data/metric-data/query-apm-metric-timeslice-data-nrql/) when possible instead of alerting on Transaction events.
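
For instance, the sketches below show what narrower queries might look like; the partition name `Log_alerts`, the `level` attribute, and the application name `My Application` are hypothetical placeholders, not values from this doc:

```sql
// Hypothetical sketch: alert on a dedicated log partition (here 'Log_alerts')
// so the condition scans only that partition instead of every Log event.
FROM Log_alerts SELECT count(*) WHERE level = 'ERROR'
```

Similarly, an APM metric timeslice query can replace a condition that scans Transaction events:

```sql
// Hypothetical sketch: count web transactions from metric timeslice data
// instead of scanning the Transaction event type.
FROM Metric SELECT count(newrelic.timeslice.value) AS 'Web transactions'
WHERE appName = 'My Application' AND metricTimesliceName = 'WebTransaction'
```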

Beyond these tips, cleaning up any unused or unneeded alert queries (alert conditions) also decreases the number of query scan operations; the query below shows one way to spot candidates.
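
For example, a variation of the `NrAiSignal` query shown earlier (with an assumed one-week window) can help you find conditions that match little or no data and may be safe to remove:

```sql
FROM NrAiSignal
SELECT sum(aggregatedDataPointsCount) AS 'alert matched data points'
FACET conditionId
SINCE 1 week ago
LIMIT MAX
```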

To request a limit increase, talk to your New Relic account representative.