A repository demonstrating an end-to-end architecture for Intelligent Video Analytics using NVIDIA hardware with Microsoft Azure.
This project contains a collection of self-paced learning modules which guide the user in developing a custom Intelligent Video Analytics application that can handle a variety of video input sources, leverage a custom object detection model, and provide backing cloud services for analysis and reporting.
- Module 1 - Introduction to NVIDIA DeepStream
- Module 2 - Configure and Deploy "Intelligent Video Analytics" to IoT Edge Runtime on NVIDIA Jetson
- Module 3 - Develop and deploy Custom Object Detection Models with IoT Edge DeepStream SDK Module
- Module 4 - Filtering Telemetry with Azure Stream Analytics at the Edge and Modeling with Azure Time Series Insights
- Module 5 - Visualizing Object Detection Data in Near Real-Time with PowerBI
Each of these modules is accompanied by a live stream that walks through the steps in full detail. You can watch the entire project being built from the ground up in the following 5-part video playlist on YouTube.
The project makes use of the NVIDIA DeepStream SDK running on NVIDIA Jetson Embedded hardware to produce an Intelligent Video Analytics Pipeline.
The solution employs a number of modules that run on the NVIDIA hardware device and are deployed and managed by the Azure IoT Edge runtime. These modules include the Azure Blob Storage on IoT Edge module, which captures object detection training samples via a paired Camera Tagging Module and mirrors them to the cloud. The captured samples are then used to train a custom object detection model with the Custom Vision AI offering from Azure Cognitive Services. Models generated by this service are consumed by the DeepStream SDK module using a Custom Yolo Parser.
As object detections are produced by the DeepStream SDK, they are filtered using an Azure Stream Analytics on Edge Job that transforms the output into summarized detections. These object detection results are then transmitted to an Azure IoT Hub where they can be forwarded to additional cloud services for processing and reporting.
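To make the summarization step concrete, here is a minimal Python sketch of the kind of tumbling-window aggregation the Azure Stream Analytics on Edge job performs. The actual job is written in Stream Analytics SQL; the function and field names below (`summarize_detections`, `timestamp`, `label`) are illustrative assumptions, not the project's schema:

```python
from collections import Counter

def summarize_detections(events, window_seconds=30):
    """Group raw detection events into tumbling windows and count
    detections per object label within each window, roughly mirroring
    what the Stream Analytics on Edge job produces."""
    windows = {}
    for event in events:
        # Assign each event to a tumbling window keyed by window start time.
        window_start = (event["timestamp"] // window_seconds) * window_seconds
        windows.setdefault(window_start, Counter())[event["label"]] += 1
    return [
        {"windowStart": start, "counts": dict(counts)}
        for start, counts in sorted(windows.items())
    ]

events = [
    {"timestamp": 3, "label": "person"},
    {"timestamp": 12, "label": "person"},
    {"timestamp": 31, "label": "car"},
]
print(summarize_detections(events, window_seconds=30))
```

With a 30-second window, the three raw events above collapse into two summarized rows, which is the shape of data the downstream cloud services receive.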
The cloud services employed include Time Series Insights, which is a fully managed event processing service for analyzing data over time. We also demonstrate how to forward object detection data to a PowerBI dataset for live visualization of results within PowerBI Reports and Dashboards.
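Before a summarized window can be routed from IoT Hub to Time Series Insights or a PowerBI dataset, it is serialized as a JSON telemetry message. The sketch below shows one plausible shape for that payload; the field names (`eventTime`, `detections`) are hypothetical and not the project's actual schema:

```python
import json
from datetime import datetime, timezone

def to_telemetry_message(window_start, counts):
    """Serialize one summarized detection window as a JSON telemetry
    message of the kind an IoT Hub route could forward to downstream
    reporting services (field names are illustrative)."""
    return json.dumps({
        # ISO 8601 timestamps let time-series services index events.
        "eventTime": datetime.fromtimestamp(
            window_start, tz=timezone.utc
        ).isoformat(),
        "detections": [
            {"label": label, "count": count}
            for label, count in sorted(counts.items())
        ],
    })

print(to_telemetry_message(0, {"person": 2, "car": 1}))
```

Keeping the payload flat and timestamped like this makes it straightforward for a Time Series Insights environment or a streaming PowerBI dataset to consume each row without further transformation.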
For more details on how this all works under the hood, check out this episode of the IoT Show where we cover these capabilities and associated services in depth:
Hardware:
- NVIDIA Jetson Embedded Device running JetPack 5.0.2
- A cooling fan installed on or pointed at the NVIDIA Jetson Nano device
- RTSP Capable Camera (Optional)
- Note: We recommend the FI9821P from Foscam
- USB Webcam (Optional)
- Note: If using a Jetson Nano, the added power consumption requires configuring the device to use a 5V/4A barrel adapter as mentioned here, along with an OpenCV-compatible camera.
Development Environment:
- Visual Studio Code (VSCode)
- Note: ARM64 builds of VSCode are not officially supported; however, it is possible to install and run the development tools on your NVIDIA Jetson device. This is not recommended on Jetson Nano hardware due to resource limitations. Consult this article on Getting Started with IoT Edge Development on NVIDIA Jetson Devices for more details.
- Visual Studio Code Extensions
- Git tool(s)
  - Git command line
Cloud Services:
- An Active Microsoft Azure Subscription
If you are interested in learning more about building solutions with Azure IoT Services, check out the following free learning resources:
Once you have upskilled as an IoT developer, make it official with the AZ-220 Azure IoT Developer certification.