This inference demo describes the app integration process on the NPU, using the Caffe SqueezeNet model as an example.
- [Introduction](#introduction)
- [Preparation and Getting Started](#preparation-and-getting-started)
- [Supported Environments](#supported-environments)
- [Result](#result)
- [License](#license)
## Introduction

The app integration process on the NPU includes model pre-processing, model loading, model inference, and model post-processing.
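The four stages above can be illustrated with a minimal, self-contained sketch. Only the stage names come from this demo; the class `NpuPipeline`, the method bodies, and the normalization scheme are hypothetical placeholders, and the loading and inference stages (which the HiAI DDK performs at the JNI layer) are omitted.

```java
// Hypothetical sketch of stages 1 and 4 of the integration process.
// Loading and inference (stages 2-3) run on the NPU via the DDK and
// are not reproduced here.
public final class NpuPipeline {

    // Stage 1, pre-processing: normalize 8-bit pixel values to [0, 1].
    static float[] preProcess(int[] pixels) {
        float[] out = new float[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
            out[i] = pixels[i] / 255.0f;
        }
        return out;
    }

    // Stage 4, post-processing: return the index of the top-scoring class.
    static int postProcess(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        float[] input = preProcess(new int[]{0, 128, 255});
        System.out.println(input[2]);                                   // prints 1.0
        System.out.println(postProcess(new float[]{0.1f, 0.7f, 0.2f})); // prints 1
    }
}
```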
## Preparation and Getting Started

- Check that the Android Studio development environment is ready. To build this demo, first import it into Android Studio.
- Obtain the HiAI DDK version to determine whether the NPU is supported.
- Determine whether the offline model can run on the current HiAI version.
- If the offline model is incompatible with the HiAI DDK, try to achieve compatibility by using the mixed-model online building API.
- Before using a model, load it. The DDK supports both single-model and multi-model loading. In sync mode, the app layer loads the model by calling the loadModelSync function at the JNI layer; in async mode, it calls the loadModelAsync function instead.
- After the model is loaded, you can run inference. In sync mode, the app layer starts inference by calling the runModelSync function at the JNI layer; in async mode, it calls the runModelAsync function instead.
- After inference, the result is returned to the app layer. In sync mode, the app layer obtains the result from the runModelSync return value and post-processes it by calling the postProcess function. In async mode, the result arrives through the OnProcessDone callback and is then post-processed by calling the postProcess function.
## Result

- Run the sample on your Android device or emulator.
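The synchronous and asynchronous call flows described in the steps above can be sketched in plain Java. The names loadModelSync, runModelSync, runModelAsync, OnProcessDone, and postProcess come from this demo, but the bodies below are simplified stand-ins for illustration, not the real HiAI DDK JNI implementations.

```java
import java.util.concurrent.CountDownLatch;

public final class InferenceFlow {

    // Callback modeled on the OnProcessDone function named in this demo.
    interface OnProcessDone { void onDone(float[] result); }

    // Stand-in for the JNI loadModelSync call: pretend loading succeeds.
    static boolean loadModelSync(String modelName) { return true; }

    // Stand-in for the JNI runModelSync call: echo the input as "scores".
    static float[] runModelSync(float[] input) { return input; }

    // Stand-in for the JNI runModelAsync call: deliver the result on a
    // worker thread, as the NPU driver would via the OnProcessDone callback.
    static void runModelAsync(float[] input, OnProcessDone callback) {
        new Thread(() -> callback.onDone(input)).start();
    }

    // Post-processing: index of the top-scoring class.
    static int postProcess(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) best = i;
        }
        return best;
    }

    public static void main(String[] args) throws InterruptedException {
        // Sync path: load, run, then post-process the returned result.
        if (loadModelSync("squeezenet")) {
            float[] out = runModelSync(new float[]{0.2f, 0.9f, 0.1f});
            System.out.println("sync top class: " + postProcess(out));     // prints 1
        }
        // Async path: the result arrives through the OnProcessDone callback.
        CountDownLatch done = new CountDownLatch(1);
        runModelAsync(new float[]{0.8f, 0.1f}, result -> {
            System.out.println("async top class: " + postProcess(result)); // prints 0
            done.countDown();
        });
        done.await();  // wait for the callback in this sketch
    }
}
```

The async path hands the result to a callback rather than blocking the caller, which is why the demo's app layer implements OnProcessDone instead of reading a return value.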
## Supported Environments

- Ubuntu 16.04, Windows 10, or macOS, with Android Studio 3.5.3 or later installed
- NDK r14b or later for building the native code
## License

The HUAWEI HiAI Foundation sample is licensed under the Apache License 2.0.