!9322 Add Posenet Demo README

From: @liuxiao78
Reviewed-by: @zhang_xue_tong,@zhanghaibo5
Signed-off-by: @zhang_xue_tong
pull/9322/MERGE
mindspore-ci-bot 5 years ago committed by Gitee
commit 6258f7f19a

@ -1,8 +1,7 @@
## Demo of Image Classification
The following describes how to use the MindSpore Lite C++ APIs (Android JNIs) and MindSpore Lite image classification models to perform on-device inference, classify the content captured by a device camera, and display the most probable classification result on the application's image preview screen.
### Running Dependencies
- Android Studio 3.2 or later (Android Studio 4.0 or later is recommended.)
@ -21,9 +20,7 @@ The following describes how to use the MindSpore Lite C++ APIs (Android JNIs) an
![start_sdk](images/sdk_management.png)
(Optional) If an NDK version issue occurs during installation, manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads) (the sample code uses version 21.3) and specify the NDK location in `Android NDK location` of `Project Structure`. If you encounter any Android Studio configuration problem when trying this demo, refer to item 4 to resolve it.
![project_structure](images/project_structure.png)
2. Connect to an Android device and run the image classification application.
@ -39,13 +36,24 @@ The following describes how to use the MindSpore Lite C++ APIs (Android JNIs) an
![result](images/app_result.jpg)
4. Solutions to Android Studio configuration problems:
| | Warning | Solution |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| 1 | Gradle sync failed: NDK not configured. | Specify the installed NDK directory in `local.properties`: `ndk.dir={NDK installation directory}` |
| 2 | Requested NDK version did not match the version requested by ndk.dir | Manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads) and specify its location in `Project Structure` > `Android NDK location`. You can refer to the figure below. |
| 3 | This version of Android Studio cannot open this project, please retry with Android Studio or newer. | Update Android Studio in `Help` > `Check for Updates`. |
| 4 | SSL peer shut down incorrectly | Run this demo again. |
![project_structure](images/project_structure.png)
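For the first fix in the table above, `local.properties` lives in the project root and takes an absolute path. A minimal sketch (the path below is only an example and will differ per machine):

```text
# local.properties (project root) -- example path only
ndk.dir=/home/user/Android/Sdk/ndk/21.3.6528147
```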
## Detailed Description of the Sample Program
This image classification sample program on the Android device includes a Java layer and a JNI layer. At the Java layer, the Android Camera2 API is used to enable a camera to obtain image frames and process images. At the JNI layer, the model inference process is completed in [Runtime](https://www.mindspore.cn/tutorial/lite/en/master/use/runtime.html).
### Sample Program Structure
```text
app
├── src/main
@ -84,7 +92,7 @@ Note: if the automatic download fails, please manually download the relevant lib
mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.0.1/lite/android_aarch64/mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz)
```text
android{
defaultConfig{
externalNativeBuild{
@ -102,7 +110,7 @@ android{
Create a link to the `.so` library file in the `app/CMakeLists.txt` file:
```text
# ============== Set MindSpore Dependencies. =============
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/third_party/flatbuffers/include)
@ -132,7 +140,7 @@ target_link_libraries(
### Downloading and Deploying a Model File
In this example, the download.gradle configuration automatically downloads `mobilenetv2.ms` and places it in the `app/libs/arm64-v8a` directory.
Note: if the automatic download fails, please manually download the relevant library files and put them in the corresponding location.
@ -146,7 +154,7 @@ The inference code process is as follows. For details about the complete code, s
1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.
- Load a model file. Create and configure the context for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
@ -154,7 +162,7 @@ The inference code process is as follows. For details about the complete code, s
char *modelBuffer = CreateLocalModelBuffer(env, buffer);
```
- Create a session.
```cpp
void **labelEnv = new void *;
@ -171,7 +179,7 @@ The inference code process is as follows. For details about the complete code, s
```
- Load the model file and build a computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, std::string name, mindspore::lite::Context* ctx)
@ -232,7 +240,7 @@ The inference code process is as follows. For details about the complete code, s
3. Perform inference on the input tensor based on the model, obtain the output tensor, and perform post-processing.
- Perform graph execution and on-device inference.
```cpp
// After the model and image tensor data is loaded, run inference.

@ -2,7 +2,6 @@
This sample program demonstrates how to use the MindSpore Lite C++ API (Android JNI) and a MindSpore Lite image classification model to perform on-device inference, classify the content captured by the device camera, and display the most probable classification result on the App's image preview screen.
### Running Dependencies
- Android Studio >= 3.2 (4.0 or later recommended)
@ -21,15 +20,13 @@
![start_sdk](images/sdk_management.png)
(Optional) If an NDK version issue occurs during installation, manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) (the sample code uses NDK 21.3) and specify the NDK location in `Android NDK location` of `Project Structure`. If an Android Studio configuration problem occurs during use, refer to item 4 to resolve it.
![project_structure](images/project_structure.png)
2. Connect to an Android device and run the image classification application.
Connect the Android device via USB for debugging and click `Run 'app'` to run the sample project on your device.
> During compilation, Android Studio automatically downloads MindSpore Lite, model files, and other dependencies; please be patient while the build completes.
![run_app](images/run_app.PNG)
@ -45,6 +42,16 @@
![result](images/app_result.jpg)
4. Solutions to Android Studio configuration problems are listed in the following table:
| | Warning | Solution |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| 1 | Gradle sync failed: NDK not configured. | Specify the installed NDK directory in `local.properties`: `ndk.dir={NDK installation directory}` |
| 2 | Requested NDK version did not match the version requested by ndk.dir | Manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) and specify its location in `Project Structure` > `Android NDK location`. You can refer to the figure below. |
| 3 | This version of Android Studio cannot open this project, please retry with Android Studio or newer. | Update Android Studio in `Help` > `Check for Updates`. |
| 4 | SSL peer shut down incorrectly | Rebuild the project. |
![project_structure](images/project_structure.png)
## Detailed Description of the Sample Program
@ -54,7 +61,7 @@
### Sample Program Structure
```text
app
├── src/main
│   ├── assets # resource files
@ -96,13 +103,13 @@ When the Android JNI layer calls MindSpore C++ APIs, the relevant library files are required. They can
In this sample, the build process automatically downloads the MindSpore Lite release file via the download.gradle script and places it in the `app/src/main/cpp/` directory.
> If the automatic download fails, manually download the relevant library file, decompress it, and place it in the corresponding location:
mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.0.1/lite/android_aarch64/mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz)
Configure CMake compilation support and `arm64-v8a` compilation support in the app's `build.gradle` file, as follows:
```text
android{
defaultConfig{
externalNativeBuild{
@ -120,7 +127,7 @@ android{
Create a link to the `.so` library file in the `app/CMakeLists.txt` file, as shown below.
```text
# ============== Set MindSpore Dependencies. =============
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/third_party/flatbuffers/include)
@ -152,8 +159,7 @@ target_link_libraries(
Download the model file from the MindSpore Model Hub. The on-device image classification model used in this sample is `mobilenetv2.ms`, which is also downloaded automatically by the download.gradle script during the APP build and placed in the `app/src/main/assets` project directory.
> If the download fails, manually download the model file mobilenetv2.ms from this [download link](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/mobilenetv2.ms).
### Compiling On-Device Inference Code
@ -164,6 +170,7 @@ target_link_libraries(
1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.
- Load the model file: create and configure the context for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
jlong bufferLen = env->GetDirectBufferCapacity(buffer);
@ -171,6 +178,7 @@ target_link_libraries(
```
- Create a session.
```cpp
void **labelEnv = new void *;
MSNetWork *labelNet = new MSNetWork;
@ -187,6 +195,7 @@ target_link_libraries(
```
- Load the model file and build a computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, std::string name, mindspore::lite::Context* ctx)
{
@ -254,6 +263,7 @@ target_link_libraries(
```
- Obtain the output data.
```cpp
auto names = mSession->GetOutputTensorNames();
std::unordered_map<std::string,mindspore::tensor::MSTensor *> msOutputs;
@ -266,6 +276,7 @@ target_link_libraries(
```
- Post-process the output data.
```cpp
std::string ProcessRunnetResult(const int RET_CATEGORY_SUM, const char *const labels_name_map[],
std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs) {

@ -20,9 +20,7 @@ The following section describes how to build and execute an on-device object det
![start_home](images/home.png)
Start Android Studio, click `File > Settings > System Settings > Android SDK`, and select the corresponding SDK. As shown in the following figure, select an SDK and click `OK`. Android Studio automatically installs the SDK. If you encounter any Android Studio configuration problem when trying this demo, refer to item 4 to resolve it.
![start_sdk](images/sdk_management.png)
2. Connect to an Android device and run the object detection application.
@ -36,6 +34,16 @@ The following section describes how to build and execute an on-device object det
![result](images/object_detection.png)
4. Solutions to Android Studio configuration problems:
| | Warning | Solution |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| 1 | Gradle sync failed: NDK not configured. | Specify the installed NDK directory in `local.properties`: `ndk.dir={NDK installation directory}` |
| 2 | Requested NDK version did not match the version requested by ndk.dir | Manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads) and specify its location in `Project Structure` > `Android NDK location`. You can refer to the figure below. |
| 3 | This version of Android Studio cannot open this project, please retry with Android Studio or newer. | Update Android Studio in `Help` > `Check for Updates`. |
| 4 | SSL peer shut down incorrectly | Run this demo again. |
![project_structure](images/project_structure.png)
## Detailed Description of the Sample Program
@ -51,7 +59,7 @@ Note: if the automatic download fails, please manually download the relevant lib
mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.0.1/lite/android_aarch64/mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz)
```text
android{
defaultConfig{
externalNativeBuild{
@ -69,7 +77,7 @@ android{
Create a link to the `.so` library file in the `app/CMakeLists.txt` file:
```text
# Set MindSpore Lite Dependencies.
set(MINDSPORELITE_VERSION mindspore-lite-1.0.1-runtime-arm64-cpu)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION})
@ -91,14 +99,12 @@ target_link_libraries(
### Downloading and Deploying a Model File
In this example, the download.gradle configuration automatically downloads `ssd.ms` and places it in the `app/libs/arm64-v8a` directory.
Note: if the automatic download fails, please manually download the relevant library files and put them in the corresponding location.
ssd.ms [ssd.ms](https://download.mindspore.cn/model_zoo/official/lite/ssd_mobilenetv2_lite/ssd.ms)
### Compiling On-Device Inference Code
Call MindSpore Lite C++ APIs at the JNI layer to implement on-device inference.
@ -107,7 +113,7 @@ The inference code process is as follows. For details about the complete code, s
1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.
- Load a model file. Create and configure the context for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
@ -115,7 +121,7 @@ The inference code process is as follows. For details about the complete code, s
char *modelBuffer = CreateLocalModelBuffer(env, buffer);
```
- Create a session.
```cpp
void **labelEnv = new void *;
@ -134,7 +140,7 @@ The inference code process is as follows. For details about the complete code, s
```
- Load the model file and build a computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, std::string name, mindspore::lite::Context* ctx)
@ -546,4 +552,3 @@ The inference code process is as follows. For details about the complete code, s
return result;
}
```

@ -2,7 +2,6 @@
This sample program demonstrates how to use the MindSpore Lite C++ API (Android JNI) and a MindSpore Lite object detection model to perform on-device inference, detect the content captured by the device camera or from the gallery, and display continuous object detection results on the App's image preview screen.
### Running Dependencies
- Android Studio >= 3.2 (4.0 or later recommended)
@ -20,14 +19,12 @@
![start_sdk](images/sdk_management.png)
(Optional) If an NDK version issue occurs during installation, manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) (the sample code uses NDK 21.3) and specify the NDK location in `Android NDK location` of `Project Structure`. If an Android Studio configuration problem occurs during use, refer to item 4 to resolve it.
![project_structure](images/project_structure.png)
2. Connect to an Android device and run the object detection sample application.
Connect the Android device via USB for debugging and click `Run 'app'` to run the sample project on your device.
> During compilation, Android Studio automatically downloads MindSpore Lite, model files, and other dependencies; please be patient while the build completes.
![run_app](images/run_app.PNG)
@ -41,6 +38,16 @@
![result](images/object_detection.png)
4. Solutions to Android Studio configuration problems are listed in the following table:
| | Warning | Solution |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| 1 | Gradle sync failed: NDK not configured. | Specify the installed NDK directory in `local.properties`: `ndk.dir={NDK installation directory}` |
| 2 | Requested NDK version did not match the version requested by ndk.dir | Manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) and specify its location in `Project Structure` > `Android NDK location`. You can refer to the figure below. |
| 3 | This version of Android Studio cannot open this project, please retry with Android Studio or newer. | Update Android Studio in `Help` > `Check for Updates`. |
| 4 | SSL peer shut down incorrectly | Rebuild the project. |
![project_structure](images/project_structure.png)
## Detailed Description of the Sample Program
@ -50,7 +57,7 @@
### Sample Program Structure
```text
app
|
├── libs # library files built from the demo's JNI layer
@ -95,13 +102,13 @@ When the Android JNI layer calls MindSpore C++ APIs, the relevant library files are required. They can
In this sample, the build process automatically downloads the MindSpore Lite release file via the download.gradle script and places it in the `app/src/main/cpp/` directory.
> If the automatic download fails, manually download the relevant library file, decompress it, and place it in the corresponding location:
mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.0.1/lite/android_aarch64/mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz)
Configure CMake compilation support and `arm64-v8a` compilation support in the app's `build.gradle` file, as follows:
```text
android{
defaultConfig{
externalNativeBuild{
@ -119,7 +126,7 @@ android{
Create a link to the `.so` library file in the `app/CMakeLists.txt` file, as shown below.
```text
# Set MindSpore Lite Dependencies.
set(MINDSPORELITE_VERSION mindspore-lite-1.0.1-runtime-arm64-cpu)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION})
@ -143,8 +150,7 @@ target_link_libraries(
Download the model file from the MindSpore Model Hub. The object detection model used in this sample is `ssd.ms`, which is also downloaded automatically by the `download.gradle` script during the APP build and placed in the `app/src/main/assets` project directory.
> If the download fails, manually download the model file ssd.ms from this [download link](https://download.mindspore.cn/model_zoo/official/lite/ssd_mobilenetv2_lite/ssd.ms).
### Compiling On-Device Inference Code
@ -155,6 +161,7 @@ target_link_libraries(
1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.
- Load the model file: create and configure the context for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
jlong bufferLen = env->GetDirectBufferCapacity(buffer);
@ -162,6 +169,7 @@ target_link_libraries(
```
- Create a session.
```cpp
void **labelEnv = new void *;
MSNetWork *labelNet = new MSNetWork;
@ -180,6 +188,7 @@ target_link_libraries(
```
- Load the model file and build a computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, std::string name, mindspore::lite::Context* ctx)
{
@ -273,6 +282,7 @@ target_link_libraries(
```
- Obtain the output data.
```cpp
auto names = mSession->GetOutputTensorNames();
typedef std::unordered_map<std::string,
@ -374,7 +384,7 @@ target_link_libraries(
}
```
- Use non-maximum suppression to filter out the outputs with high object-class confidence.
```cpp
void SSDModelUtil::nonMaximumSuppression(const YXBoxes *const decoded_boxes,
@ -496,5 +506,3 @@ target_link_libraries(
return result;
}
```
