Merge pull request newrelic#17758 from newrelic/nvidia-nim-ai-monitoring
Update intro AI monitoring copy to include a section about NVIDIA NIM
akristen authored Jun 27, 2024
2 parents bf6adcf + 2bed66c commit 3875a0d
Showing 2 changed files with 9 additions and 3 deletions.
@@ -84,6 +84,12 @@ AI monitoring is compatible with these agent versions and AI libraries:
</tbody>
</table>

## Monitoring at scale with NVIDIA NIM [#deploy-at-scale]

AI monitoring can integrate with and collect data about any model supported by NVIDIA NIM. For example, if you've built a Python or Node.js AI app that uses llama3, mistralai, or one of NVIDIA's proprietary LLMs, you can instrument that app with AI monitoring and view performance data about it.

No additional steps are needed to integrate with NVIDIA NIM: you can follow our [manual procedures for installation](/install/ai-monitoring), or install [directly through the New Relic platform](https://onenr.io/0VRVNLqavRa).
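
To make the flow concrete, here's a minimal sketch (not the documented install procedure linked above) of a Python app instrumented this way. It assumes the `newrelic` package, an `ai_monitoring.enabled = true` setting in `newrelic.ini` (or the equivalent environment variable), and a locally served NIM endpoint; the model name and URL are placeholders, not values from this commit.

```python
# Minimal sketch, assuming the New Relic Python agent (pip install newrelic)
# and AI monitoring enabled in newrelic.ini via `ai_monitoring.enabled = true`.
import newrelic.agent

# Initialize the agent before importing the LLM client so its calls are instrumented.
newrelic.agent.initialize("newrelic.ini")

from openai import OpenAI

# NIM serves an OpenAI-compatible API, so the standard client can point at it.
# The base_url and api_key below are placeholders for your own NIM deployment.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

@newrelic.agent.background_task()
def ask(question: str) -> str:
    # The model name is an illustrative NIM-hosted model, not a required value.
    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("What does AI monitoring capture?"))
```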

## What's next? [#whats-next]

* You can get started by [installing AI monitoring](/install/ai-monitoring).
6 changes: 3 additions & 3 deletions src/content/docs/ai-monitoring/intro-to-ai-monitoring.mdx
@@ -42,9 +42,9 @@ Enabling AI monitoring allows the agent to recognize AI metadata associated with

AI monitoring can help you answer critical questions about AI app performance: are your end users waiting too long for a response? Is there a recent spike in token usage? Are there patterns of negative user feedback around certain topics? With AI monitoring, you can see data specific to the AI layer:

* [Identify errors in specific prompt and response interactions](/docs/ai-monitoring/view-ai-data/#ai-responses) from the response table. If an error occurs, open the [trace waterfall view](/docs/ai-monitoring/view-ai-data/#ai-response-trace-view) to scope to the methods and calls your AI-powered app makes when generating a response.
* If your prompt engineers updated prompt parameters for your AI, you can [track whether token usage spiked or dropped after the update](/docs/ai-monitoring/view-ai-data). Use AI monitoring to help you make decisions that keep costs down.
* Maybe you're fine-tuning your app in development, but you want cost and performance efficiency before it goes to production. If you're using different models in different app environments, you can [compare the cost and performance of your apps before deploying](/docs/ai-monitoring/view-ai-data/#model-comparison).
* [Identify errors in specific prompt and response interactions](/docs/ai-monitoring/explore-ai-data/view-ai-responses) from the response table. If you're looking to make improvements to your AI models, [learn how to analyze your model data in New Relic](/docs/ai-monitoring/explore-ai-data/view-model-data).
* If you're using different models across app environments, you can [compare the cost and performance of your apps before deploying](/docs/ai-monitoring/view-ai-data/#model-comparison).
* Are you concerned about data compliance? [Learn how to create drop filters](/docs/ai-monitoring/drop-sensitive-data) to drop sensitive data before you send it to New Relic.

## Get started with AI monitoring [#get-started]

