!9322 Add Posenet Demo README

From: @liuxiao78
Reviewed-by: @zhang_xue_tong,@zhanghaibo5
Signed-off-by: @zhang_xue_tong
pull/9322/MERGE
mindspore-ci-bot 4 years ago committed by Gitee
commit 6258f7f19a

@@ -1,13 +1,12 @@
## Demo of Image Classification
The following describes how to use the MindSpore Lite C++ APIs (Android JNIs) and MindSpore Lite image classification models to perform on-device inference, classify the content captured by a device camera, and display the most probable classification result on the application's image preview screen.
### Running Dependencies
- Android Studio 3.2 or later (Android Studio 4.0 or later is recommended)
- Native Development Kit (NDK) 21.3
- [CMake](https://cmake.org/download) 3.10.2
- Android software development kit (SDK) 26 or later
- JDK 1.8 or later
@@ -21,9 +20,7 @@ The following describes how to use the MindSpore Lite C++ APIs (Android JNIs) an
![start_sdk](images/sdk_management.png)
(Optional) If an NDK version issue occurs during the installation, manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads) (the version used in the sample code is 21.3) and specify the NDK location in `Android NDK location` of `Project Structure`. If you encounter any Android Studio configuration problem while trying this demo, refer to the troubleshooting table in item 4 below.
![project_structure](images/project_structure.png)
2. Connect to an Android device and run the image classification application.
@@ -39,13 +36,24 @@ The following describes how to use the MindSpore Lite C++ APIs (Android JNIs) an
![result](images/app_result.jpg)
4. Solutions to Android Studio configuration problems:
| | Warning | Solution |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| 1 | Gradle sync failed: NDK not configured. | Specify the installed NDK directory in `local.properties`: `ndk.dir={NDK installation directory}`. |
| 2 | Requested NDK version did not match the version requested by ndk.dir | Manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads) and specify the NDK directory in `Project Structure` - `Android NDK location`. You can refer to the figure below. |
| 3 | This version of Android Studio cannot open this project, please retry with Android Studio or newer. | Update Android Studio in `Tools` - `Help` - `Check for Updates`. |
| 4 | SSL peer shut down incorrectly | Run this demo again. |
![project_structure](images/project_structure.png)
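For warning 1 above, a minimal `local.properties` entry looks like the following; the path shown is a placeholder for your own NDK installation directory, not a value from this demo:
```text
# local.properties (project root); the path below is an example placeholder.
ndk.dir=/Users/yourname/Library/Android/sdk/ndk/21.3.6528147
```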
## Detailed Description of the Sample Program
This image classification sample program on the Android device includes a Java layer and a JNI layer. At the Java layer, the Android Camera 2 API is used to enable a camera to obtain image frames and process images. At the JNI layer, the model inference process is completed in [Runtime](https://www.mindspore.cn/tutorial/lite/en/master/use/runtime.html).
### Sample Program Structure
```text
app
├── src/main
@@ -58,12 +66,12 @@ app
│ | └── MindSporeNetnative.h # header file
│ |
│ ├── java # application code at the Java layer
│ │ └── com.mindspore.himindsporedemo
│ │ ├── gallery.classify # implementation related to image processing and MindSpore JNI calling
│ │ │ └── ...
│ │ └── widget # implementation related to camera enabling and drawing
│ │ └── ...
│ │
│ ├── res # resource files related to Android
│ └── AndroidManifest.xml # Android configuration file
@@ -84,7 +92,7 @@ Note: if the automatic download fails, please manually download the relevant lib
mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.0.1/lite/android_aarch64/mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz)
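If you do download the package manually, it needs to be decompressed into `app/src/main/cpp/`, the directory the build places the MindSpore Lite files in (as noted later in this README). A minimal sketch of the fallback step, assuming the archive sits in the project root:
```text
# Assumed manual fallback: extract the runtime package into app/src/main/cpp/.
tar -zxvf mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz -C app/src/main/cpp/
```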
```text
android{
    defaultConfig{
        externalNativeBuild{
@@ -93,7 +101,7 @@ android{
            }
        }
        ndk{
            abiFilters 'armeabi-v7a', 'arm64-v8a'
        }
    }
@@ -102,7 +110,7 @@ android{
Create a link to the `.so` library file in the `app/CMakeLists.txt` file:
```text
# ============== Set MindSpore Dependencies. =============
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/third_party/flatbuffers/include)
@@ -120,7 +128,7 @@ set_target_properties(minddata-lite PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/lib/libminddata-lite.so)
# --------------- MindSpore Lite set End. --------------------
# Link target library.
target_link_libraries(
    ...
    # --- mindspore ---
@@ -132,7 +140,7 @@ target_link_libraries(
### Downloading and Deploying a Model File
In this example, the `download.gradle` file is configured to automatically download `mobilenetv2.ms` and place it in the `app/libs/arm64-v8a` directory.
Note: if the automatic download fails, please manually download the model file and put it in the corresponding location.
@@ -142,11 +150,11 @@ mobilenetv2.ms [mobilenetv2.ms]( https://download.mindspore.cn/model_zoo/officia
Call MindSpore Lite C++ APIs at the JNI layer to implement on-device inference.
The inference code process is as follows. For details about the complete code, see `src/cpp/MindSporeNetnative.cpp`.
1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.
- Load a model file. Create and configure the context for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
@@ -154,24 +162,24 @@ The inference code process is as follows. For details about the complete code, s
char *modelBuffer = CreateLocalModelBuffer(env, buffer);
```
- Create a session.
```cpp
void **labelEnv = new void *;
MSNetWork *labelNet = new MSNetWork;
*labelEnv = labelNet;
// Create context.
mindspore::lite::Context *context = new mindspore::lite::Context;
context->thread_num_ = num_thread;
// Create the mindspore session.
labelNet->CreateSessionMS(modelBuffer, bufferLen, "device label", context);
delete(context);
```
- Load the model file and build a computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, std::string name, mindspore::lite::Context* ctx)
@@ -183,7 +191,7 @@ The inference code process is as follows. For details about the complete code, s
}
```
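The body of `CreateSessionMS` is elided in this diff. Based on the fragments visible in this README (the `CompileGraph` call shown further down) and the MindSpore Lite 1.x APIs `LiteSession::CreateSession` and `Model::Import`, a hedged reconstruction of what such a method typically does follows; the error handling is illustrative, not the demo's original code:
```cpp
// A hedged sketch of the elided CreateSessionMS body, assuming MindSpore Lite 1.x
// APIs; error checks are illustrative, not verbatim demo source.
void MSNetWork::CreateSessionMS(char *modelBuffer, size_t bufferLen, std::string name,
                                mindspore::lite::Context *ctx) {
  // Create the inference session from the configured context.
  session = mindspore::session::LiteSession::CreateSession(ctx);
  if (session == nullptr) {
    return;  // Session creation failed.
  }
  // Parse the flatbuffer model held in modelBuffer.
  auto model = mindspore::lite::Model::Import(modelBuffer, bufferLen);
  if (model == nullptr) {
    return;  // Model import failed.
  }
  // Build the computational graph for inference (see the fragment below).
  int ret = session->CompileGraph(model);
  (void)ret;  // The real code would check ret against mindspore::lite::RET_OK.
}
```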
2. Convert the input image into the Tensor format of the MindSpore model.
Convert the image data to be detected into the Tensor format of the MindSpore model.
@@ -230,9 +238,9 @@ The inference code process is as follows. For details about the complete code, s
    inputDims.channel * inputDims.width * inputDims.height * sizeof(float));
```
3. Perform inference on the input tensor based on the model, obtain the output tensor, and perform post-processing.
- Perform graph execution and on-device inference.
```cpp
// After the model and image tensor data is loaded, run inference.
@@ -305,5 +313,5 @@ The inference code process is as follows. For details about the complete code, s
}
return categoryScore;
}
```
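The inference call and output retrieval are only partially visible above. The snippet below reconstructs that flow from fragments shown later in this README (`RunGraph`, `GetOutputTensorNames`, `ProcessRunnetResult`); the loop body is a sketch, not verbatim source:
```cpp
// Run the compiled graph, then gather output tensors by name and post-process.
// Reconstructed from fragments of src/cpp/MindSporeNetnative.cpp; details are assumptions.
mSession->RunGraph();
auto names = mSession->GetOutputTensorNames();
std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs;
for (const auto &name : names) {
  // Look up each output tensor by its name.
  auto temp_dat = mSession->GetOutputByTensorName(name);
  msOutputs.insert(std::pair<std::string, mindspore::tensor::MSTensor *>{name, temp_dat});
}
// Convert raw scores into the result string shown in the app.
std::string resultStr = ProcessRunnetResult(::RET_CATEGORY_SUM, ::labels_name_map, msOutputs);
```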

@@ -2,18 +2,17 @@
This sample program demonstrates how to use the MindSpore Lite C++ APIs (Android JNIs) and a MindSpore Lite image classification model to perform on-device inference, classify the content captured by the device camera, and display the most probable classification result on the application's image preview screen.
### Running Dependencies
- Android Studio >= 3.2 (4.0 or later is recommended)
- NDK 21.3
- [CMake](https://cmake.org/download) 3.10.2
- Android SDK >= 26
- JDK >= 1.8
### Building and Running
1. Load the sample source code into Android Studio and install the corresponding SDK (after the SDK version is specified, Android Studio installs it automatically).
![start_home](images/home.png)
@@ -21,15 +20,13 @@
![start_sdk](images/sdk_management.png)
(Optional) If an NDK version issue occurs during the installation, manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) (the version used in the sample code is 21.3) and specify the NDK location in `Android NDK location` of `Project Structure`. If an Android Studio configuration problem occurs while using this demo, refer to the troubleshooting table in item 4 to resolve it.
![project_structure](images/project_structure.png)
2. Connect to an Android device and run the image classification application.
Connect the Android device via USB for debugging and click `Run 'app'` to run the sample project on your device.
> During the build, Android Studio automatically downloads MindSpore Lite, the model files, and other dependencies. Please be patient during this process.
![run_app](images/run_app.PNG)
@@ -45,6 +42,16 @@
![result](images/app_result.jpg)
4. Solutions to Android Studio configuration problems are listed in the following table:
| | Error | Solution |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| 1 | Gradle sync failed: NDK not configured. | Specify the installed NDK directory in `local.properties`: `ndk.dir={NDK installation directory}`. |
| 2 | Requested NDK version did not match the version requested by ndk.dir | Manually download the corresponding [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) and specify the NDK location in `Project Structure` - `Android NDK location` (see the figure below). |
| 3 | This version of Android Studio cannot open this project, please retry with Android Studio or newer. | Update Android Studio in `Tools` - `Help` - `Check for Updates`. |
| 4 | SSL peer shut down incorrectly | Build the demo again. |
![project_structure](images/project_structure.png)
## Detailed Description of the Sample Program
@@ -54,7 +61,7 @@
### Sample Program Structure
```text
app
├── src/main
│ ├── assets # resource files
@@ -68,12 +75,12 @@ app
| | └── MsNetWork.cpp # MindSpore API wrapper
│ |
│ ├── java # application code at the Java layer
│ │ └── com.mindspore.himindsporedemo
│ │ ├── gallery.classify # implementation related to image processing and MindSpore JNI calling
│ │ │ └── ...
│ │ └── widget # implementation related to camera enabling and drawing
│ │ └── ...
│ │
│ ├── res # resource files related to Android
│ └── AndroidManifest.xml # Android configuration file
@@ -96,13 +103,13 @@ The Android JNI layer requires the related library files when calling MindSpore C++ APIs. They can
In this example, the build process uses the `download.gradle` file to automatically download the MindSpore Lite library files and place them in the `app/src/main/cpp/` directory.
> If the automatic download fails, manually download the relevant library files, decompress them, and place them in the corresponding location:
mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.0.1/lite/android_aarch64/mindspore-lite-1.0.1-runtime-arm64-cpu.tar.gz)
Configure CMake build support and `arm64-v8a` build support in the app's `build.gradle` file, as shown below:
```text
android{
    defaultConfig{
        externalNativeBuild{
@@ -111,7 +118,7 @@ android{
            }
        }
        ndk{
            abiFilters 'arm64-v8a'
        }
    }
@@ -120,7 +127,7 @@ android{
Create links to the `.so` library files in the `app/CMakeLists.txt` file, as shown below.
```text
# ============== Set MindSpore Dependencies. =============
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/third_party/flatbuffers/include)
@@ -138,7 +145,7 @@ set_target_properties(minddata-lite PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/lib/libminddata-lite.so)
# --------------- MindSpore Lite set End. --------------------
# Link target library.
target_link_libraries(
    ...
    # --- mindspore ---
@@ -152,41 +159,43 @@ target_link_libraries(
Download the model file from the MindSpore Model Hub. The on-device image classification model used in this sample program is `mobilenetv2.ms`, which is likewise downloaded automatically by the download.gradle script during the APP build and placed in the `app/src/main/assets` project directory.
> If the download fails, manually download the model file mobilenetv2.ms. [Download link](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/mobilenetv2.ms)
### Writing On-Device Inference Code
Call MindSpore Lite C++ APIs at the JNI layer to implement on-device inference.
The inference code process is as follows. For details about the complete code, see `src/cpp/MindSporeNetnative.cpp`.
1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.
- Load the model file: create and configure the context for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
jlong bufferLen = env->GetDirectBufferCapacity(buffer);
char *modelBuffer = CreateLocalModelBuffer(env, buffer);
```
- Create a session.
```cpp
void **labelEnv = new void *;
MSNetWork *labelNet = new MSNetWork;
*labelEnv = labelNet;
// Create context.
lite::Context *context = new lite::Context;
context->thread_num_ = numThread;  // Specify the number of threads to run inference
// Create the mindspore session.
labelNet->CreateSessionMS(modelBuffer, bufferLen, context);
delete(context);
```
- Load the model file and build a computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, std::string name, mindspore::lite::Context* ctx)
{
@@ -196,11 +205,11 @@ target_link_libraries(
    int ret = session->CompileGraph(model);
}
```
2. Convert the input image into the Tensor format of the MindSpore model.
Convert the image data to be detected into the Tensor format for input to the MindSpore model.
```cpp
if (!BitmapToLiteMat(env, srcBitmap, &lite_mat_bgr)) {
    MS_PRINT("BitmapToLiteMat error");
@@ -243,8 +252,8 @@ target_link_libraries(
memcpy(inTensor->MutableData(), dataHWC,
    inputDims.channel * inputDims.width * inputDims.height * sizeof(float));
```
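For context, the `inTensor` in the `memcpy` above is the model's input tensor. A minimal sketch of how it can be obtained from the session, assuming this model's single-input layout (`GetInputs` is the MindSpore Lite API; the variable names are illustrative):
```cpp
// Fetch the model's input tensors from the session; this model has one input.
// Variable names are illustrative, not verbatim from the demo source.
auto msInputs = mSession->GetInputs();
auto inTensor = msInputs.front();
// The float HWC image data (dataHWC) is then copied into the tensor's buffer:
memcpy(inTensor->MutableData(), dataHWC,
       inputDims.channel * inputDims.width * inputDims.height * sizeof(float));
```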
3. Perform inference on the input tensor based on the model, obtain the output tensor, and perform post-processing.
- Perform graph execution and on-device inference.
@@ -254,6 +263,7 @@ target_link_libraries(
```
- Obtain the output data.
```cpp
auto names = mSession->GetOutputTensorNames();
std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs;
@@ -264,8 +274,9 @@ target_link_libraries(
std::string resultStr = ProcessRunnetResult(::RET_CATEGORY_SUM,
    ::labels_name_map, msOutputs);
```
- Post-process the output data.
```cpp
std::string ProcessRunnetResult(const int RET_CATEGORY_SUM, const char *const labels_name_map[],
    std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs) {
@@ -318,5 +329,5 @@ target_link_libraries(
}
return categoryScore;
}
```
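The body of `ProcessRunnetResult` is largely elided above. As a reading aid, here is a hedged sketch of how such a post-processing function can read per-category scores from an output tensor and build the result string; the output layout and the `label:score;` format are assumptions, not the verbatim demo code:
```cpp
#include <string>
#include <unordered_map>
// MSTensor comes from the MindSpore Lite headers; the exact include path is an assumption.

std::string ProcessRunnetResult(const int RET_CATEGORY_SUM, const char *const labels_name_map[],
                                std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs) {
  // Assume the first output tensor holds one float score per category.
  auto outTensor = msOutputs.begin()->second;
  auto scores = reinterpret_cast<float *>(outTensor->MutableData());
  std::string categoryScore;
  for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
    // Append "label:score;" pairs; the Java layer parses this string.
    categoryScore += labels_name_map[i];
    categoryScore += ":";
    categoryScore += std::to_string(scores[i]);
    categoryScore += ";";
  }
  return categoryScore;
}
```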

(3 additional file diffs suppressed because they are too large.)
(6 binary image files added, not shown; sizes: 39 KiB, 27 KiB, 456 KiB, 36 KiB, 5.0 KiB, 86 KiB.)
