20240103 release (#2)
Signed-off-by: Chen <[email protected]>
Co-authored-by: Chen <[email protected]>
chxin66 and Chen authored Jan 3, 2024
1 parent 6c87f88 commit 71bb092
Showing 33 changed files with 5,024 additions and 3,934 deletions.
2 changes: 1 addition & 1 deletion LICENSE
@@ -1,6 +1,6 @@
/****************************************************************************
*
* Copyright (c) 2023 Vivante Corporation
* Copyright (c) 2024 Vivante Corporation
*
* Permission is hereby granted, free of charge, to any person obtaining a
* copy of this software and associated documentation files (the "Software"),
76 changes: 20 additions & 56 deletions README.md
@@ -1,59 +1,30 @@
# VSI NPU Android Support Library

**NOTE**: Customers should ignore any section marked (VSI internal)

## How to build from distributed customer source package
## 1 How to build

```sh
cmake -B <build_dir> -S <SL_dir> -DCMAKE_TOOLCHAIN_FILE=<ndk_root>/build/cmake/android.toolchain.cmake -DANDROID_ABI=arm64-v8a -DANDROID_PLATFORM=android-34

cd <build_dir>
make tim-vx VsiSupportLibrary
# tim-vx MUST be built before VsiSupportLibrary
```

## How to build from internal repo (VSI internal)

```sh
# Verified with Android NDK r23c for VeriSilicon in-house development
cmake -B <build_dir> -S <SL_dir> -DCMAKE_TOOLCHAIN_FILE=<ndk_root>/build/cmake/android.toolchain.cmake -DANDROID_ABI=arm64-v8a -DANDROID_PLATFORM=android-34 -DSLANG_TARGET_PID=<PID> -DSL_DIST_BUILD=OFF

cd <build_dir>
make tim-vx Slang VsiSupportLibrary
```

Reference for CMake variables in the Android toolchain: <https://developer.android.com/ndk/guides/cmake>

## Common problems

`ld: error: undefined symbol: __android_log_print`

Appending `-DCMAKE_CXX_FLAGS="-llog"` to the cmake options may help.
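A minimal sketch of the configure command from section 1 with the extra flag appended (paths are placeholders):

```sh
cmake -B <build_dir> -S <SL_dir> \
  -DCMAKE_TOOLCHAIN_FILE=<ndk_root>/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=arm64-v8a -DANDROID_PLATFORM=android-34 \
  -DCMAKE_CXX_FLAGS="-llog"
```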

## Switch git url for TIM-VX (VSI internal)

Customers can get the latest tim-vx from GitHub, and this is the default behavior. For internal development, it can be switched back to the internal GitLab by passing
-DPUBLIC_TIM_VX=OFF

## Switch git url for Slang (internal build)

For internal builds, the Slang source can be switched to the internal URL by passing
-DINTERNAL_BUILD=ON
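A minimal sketch combining the two internal-build switches above with the internal configure command (paths and PID are placeholders):

```sh
cmake -B <build_dir> -S <SL_dir> \
  -DCMAKE_TOOLCHAIN_FILE=<ndk_root>/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=arm64-v8a -DANDROID_PLATFORM=android-34 \
  -DSLANG_TARGET_PID=<PID> -DSL_DIST_BUILD=OFF \
  -DPUBLIC_TIM_VX=OFF -DINTERNAL_BUILD=ON
```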

## Integrate with Android
## 2 Integrate with Android

Verified with i.MX 8M Plus and Android 14.

### Precondition
### 2.1 Precondition

Since the Android Support Library is a standalone library that implements the NNAPI spec, on Android we need to
wrap it as a service for applications (Android CTS). In this document, we take the "shell" approach to wrap the support library as a service.

Download the AOSP source tree, or get it from your SoC vendor.
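If you fetch AOSP yourself, a minimal sketch using the standard repo tool (the branch name is only an example; a vendor BSP tree may be organized differently):

```sh
mkdir aosp && cd aosp
repo init -u https://android.googlesource.com/platform/manifest -b android-14.0.0_r1
repo sync -j8
```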

### Apply patch when build shell service
### 2.2 Apply patch when build shell service

Apply patches in our SL `patches/`, if Android 12, use `patches_a12`:
Apply patches in our SL `patches/`, select the corresponding version:

```sh
cd ${AOSP_ROOT}/packages/modules/NeuralNetworks/
```

@@ -66,7 +37,7 @@ Why these patches are needed:
1. Build the shell service executable that can load our support library.
2. Use the NNAPI validation utils to check whether a HAL model conforms to the NNAPI standard before converting it to an SL model, and to check whether the HAL model contains OPs not supported by the SL; if so, the related VTS test cases are skipped.
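The exact apply commands are collapsed in this view; as a minimal sketch (assuming plain `git apply` is used), for an Android 12 tree:

```sh
cd ${AOSP_ROOT}/packages/modules/NeuralNetworks/
git apply <SL_dir>/patches/Android_12/0001-Build-shell-service.patch
git apply <SL_dir>/patches/Android_12/0002-Validate-model-in-shim-driver.patch
```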

### build shell service for VTS and CTS
### 2.3 build shell service for VTS and CTS

```sh
cd ${AOSP_ROOT}/packages/modules/NeuralNetworks/driver/sample_shim
```

@@ -75,14 +46,14 @@ mm -j8

The built shell service executable is located at `${AOSP_ROOT}/out/target/product/evk_8mp/symbols/vendor/bin/hw/android.hardware.neuralnetworks-shell-service-sample`.

### Run test
### 2.4 Run test

push libtim-vx.so libVsiSupportLibrary.so libneuralnetworks.so VtsHalNeuralnetworksTargetTest CtsNNAPITestCases64 android.hardware.neuralnetworks-shell-service-sample to board
push libtim-vx.so libVsiSupportLibrary.so VtsHalNeuralnetworksTargetTest CtsNNAPITestCases64 android.hardware.neuralnetworks-shell-service-sample to board
You can get the Android test suites from <https://source.android.com/>.
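As a minimal sketch of the push step over adb (the library and binary destinations follow the paths used elsewhere in this guide; the test binaries' destination is an assumption):

```sh
adb root && adb remount
adb push libtim-vx.so libVsiSupportLibrary.so /vendor/lib64/
adb push android.hardware.neuralnetworks-shell-service-sample /vendor/bin/hw/
adb push VtsHalNeuralnetworksTargetTest CtsNNAPITestCases64 /data/local/tmp/
```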

#### 1. Delete old service and add shell service to vintf manifest
#### 2.4.1 Delete the old service (optional) and add the shell service to the vintf manifest

delete old service:
If an old service exists, delete it:
cd /vendor/etc/vintf/manifest
rm -f [email protected]

@@ -98,7 +69,7 @@ add following content to `/vendor/etc/vintf/manifest.xml`

Finally, reboot.

#### 2. Add system lib in default link space
#### 2.4.2 Add system lib in default link space

To resolve the link failure in namespace (default): `dlopen failed: library "libandroidfw.so" not found: needed by /vendor/lib64/libandroid.so in namespace (default)`

@@ -107,35 +78,35 @@ In `linkerconfig/ld.config.txt`
```sh
[vendor]
namespace.default.search.paths = /odm/${LIB}
# Add these two lines
# Add these three lines
namespace.default.search.paths += /system/${LIB}
namespace.default.search.paths += /apex/com.android.i18n/${LIB}
namespace.default.search.paths += /apex/com.android.os.statsd/${LIB}
```

### 3. Start service by run android.hardware.neuralnetworks-shell-service-sample on Android board
### 2.5 Start the service by running android.hardware.neuralnetworks-shell-service-sample on the Android board
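A minimal sketch of doing this over adb (assumes the binary was pushed to `/vendor/bin/hw` as above):

```sh
adb root
adb shell
# on the device:
/vendor/bin/hw/android.hardware.neuralnetworks-shell-service-sample &
```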

### 4. run test with VTS
### 2.6 Run test with VTS

```sh
./VtsHalNeuralnetworksTargetTest --gtest_filter=TestGenerated/GeneratedTest.Test/android_hardware_neuralnetworks_IDevice_nnapi_sample_sl_updatable_reshape
```

### 5. run test with CTS
### 2.7 Run test with CTS

```sh
./CtsNNAPITestCases64 --gtest_filter=TestGenerated/QuantizationCouplingTest*
```

## Integrate with TfLite
## 3 Integrate with TfLite

### Get the source code of TensorFlow
### 3.1 Get the source code of TensorFlow

```sh
git clone https://github.com/tensorflow/tensorflow.git
```

### Build benchmark_model
### 3.2 Build benchmark_model

```sh
cd tensorflow/tensorflow/lite
```

@@ -147,15 +118,8 @@ make benchmark_model -j8

Push benchmark_model to the board.
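As a minimal sketch (the destination directory is an assumption):

```sh
adb push benchmark_model /data/local/tmp/
adb shell chmod +x /data/local/tmp/benchmark_model
```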

### Run benchmark_model with support library
### 3.3 Run benchmark_model with support library

```sh
./benchmark_model --graph=mobilenet_v1_1.0_224_quant.tflite --use_nnapi=true --nnapi_support_library_path=/vendor/lib64/libVsiSupportLibrary.so --nnapi_accelerator_name=vsi-device-0
```

## How to pack source for release (VSI internal)

cd into the build directory and run `make tim-vx Slang VsiSupportLibrary && make package_source`; you will get the source code as archived files.

Note: don't create the build folder inside your source code folder, otherwise the package will include the build directory.
2 changes: 1 addition & 1 deletion cmake/module/TimVxConfig.cmake
@@ -1,4 +1,4 @@
# Copyright (c) 2021 Vivante Corporation
# Copyright (c) 2024 Vivante Corporation
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
33 changes: 0 additions & 33 deletions nnapi_status.md
@@ -75,38 +75,5 @@
| ANeuralNetworks_getDeviceCount | Yes | 5 |
| ANeuralNetworks_getMaximumLoopTimeout | Yes | 5 |
| ANeuralNetworks_getRuntimeFeatureLevel | Yes | 5 |
| - | - | - |
| SL_ANeuralNetworksCompilation_setCachingFromFds | Yes | 5 |
| SL_ANeuralNetworksDevice_getNumberOfCacheFilesNeeded | Yes | 5 |
| SL_ANeuralNetworksDevice_getPerformanceInfo | Yes | 5 |
| SL_ANeuralNetworksDevice_forEachOperandPerformanceInfo | Yes | 5 |
| SL_ANeuralNetworksDevice_getVendorExtensionCount | won't support | 5 |
| SL_ANeuralNetworksDevice_getVendorExtensionName | won't support | 5 |
| SL_ANeuralNetworksDevice_forEachVendorExtensionOperandTypeInformation | won't support | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getSessionId | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getNnApiVersion | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getModelArchHash | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getDeviceIds | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getErrorCode | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getInputDataClass | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getOutputDataClass | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_getCompilationTimeNanos | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_isCachingEnabled | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_isControlFlowUsed | No | 5 |
| SL_ANeuralNetworksDiagnosticCompilationInfo_areDynamicTensorsUsed | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getSessionId | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getNnApiVersion | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getModelArchHash | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getDeviceIds | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getExecutionMode | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getInputDataClass | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getOutputDataClass | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getErrorCode | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getRuntimeExecutionTimeNanos | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_getHardwareExecutionTimeNanos | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_isCachingEnabled | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_isControlFlowUsed | No | 5 |
| SL_ANeuralNetworksDiagnosticExecutionInfo_areDynamicTensorsUsed | No | 5 |
| SL_ANeuralNetworksDiagnostic_registerCallbacks | No | 5 |

**NOTE**: test results with i.MX 8M Plus, Android 14
64 changes: 64 additions & 0 deletions patches/Android_12/0001-Build-shell-service.patch
@@ -0,0 +1,64 @@
From 802ec938216e1609df54cd5cf612470c29608a9d Mon Sep 17 00:00:00 2001
From: Xiaoran Weng <[email protected]>
Date: Fri, 29 Dec 2023 10:24:03 +0800
Subject: [PATCH 1/2] Build shell service

---
driver/sample_shim/Android.bp | 8 ++++++++
driver/sample_shim/ShellServiceSample.cpp | 8 +++-----
2 files changed, 11 insertions(+), 5 deletions(-)

diff --git a/driver/sample_shim/Android.bp b/driver/sample_shim/Android.bp
index a4e4d76..e9127b5 100644
--- a/driver/sample_shim/Android.bp
+++ b/driver/sample_shim/Android.bp
@@ -97,3 +97,11 @@ cc_binary {
init_rc: ["config/android.hardware.neuralnetworks-shim-service-sample.rc"],
vintf_fragments: ["config/android.hardware.neuralnetworks-shim-service-sample.xml"],
}
+
+cc_binary {
+ name: "android.hardware.neuralnetworks-shell-service-sample",
+ srcs: ["ShellServiceSample.cpp"],
+ defaults: ["NeuralNetworksShimDriverAidl_server_defaults"],
+ init_rc: ["config/android.hardware.neuralnetworks-shell-service-sample.rc"],
+ vintf_fragments: ["config/android.hardware.neuralnetworks-shell-service-sample.xml"],
+}
diff --git a/driver/sample_shim/ShellServiceSample.cpp b/driver/sample_shim/ShellServiceSample.cpp
index 6c3eda5..1ce61ed 100644
--- a/driver/sample_shim/ShellServiceSample.cpp
+++ b/driver/sample_shim/ShellServiceSample.cpp
@@ -34,8 +34,6 @@
#include <utility>
#include <vector>

-typedef struct NnApiSLDriverImpl NnApiSLDriverImpl;
-
namespace aidl::android::hardware::neuralnetworks {
namespace {

@@ -95,7 +93,7 @@ int registerDevices(const std::string& driverPath, const std::vector<Names>& dev

// The default is 15, use more only if there's more devices exposed.
ANeuralNetworksShimRegistrationParams_setNumberOfListenerThreads(params, 15);
- ANeuralNetworksShimRegistrationParams_registerAsLazyService(params, /*asLazy=*/true);
+ ANeuralNetworksShimRegistrationParams_registerAsLazyService(params, /*asLazy=*/false);
ANeuralNetworksShimRegistrationParams_fallbackToMinimumSupportDevice(params, /*fallback=*/true);

for (const auto& device : devices) {
@@ -123,10 +121,10 @@ using aidl::android::hardware::neuralnetworks::Names;
using aidl::android::hardware::neuralnetworks::registerDevices;

int main() {
- const std::string driverPath = "/vendor/lib64/neuralnetworks_sample_sl_driver_prebuilt.so";
+ const std::string driverPath = "/vendor/lib64/libVsiSupportLibrary.so";

const std::vector<Names> devicesToRegister = {
- {.driverName = "nnapi-sample_sl", .serviceName = "nnapi-sample_sl_updatable"},
+ {.driverName = "vsi-device-0", .serviceName = "nnapi-sample_sl_updatable"},
};

return registerDevices(driverPath, devicesToRegister);
--
2.34.1

60 changes: 60 additions & 0 deletions patches/Android_12/0002-Validate-model-in-shim-driver.patch
@@ -0,0 +1,60 @@
From af587b3554695cb875c560399bc504816b80d086 Mon Sep 17 00:00:00 2001
From: Xiaoran Weng <[email protected]>
Date: Tue, 2 Jan 2024 09:52:40 +0800
Subject: [PATCH 2/2] Validate model in shim driver

---
shim_and_sl/ShimDevice.cpp | 23 +++++++++++++++++++++++
1 file changed, 23 insertions(+)

diff --git a/shim_and_sl/ShimDevice.cpp b/shim_and_sl/ShimDevice.cpp
index eadbbef..cdadea1 100644
--- a/shim_and_sl/ShimDevice.cpp
+++ b/shim_and_sl/ShimDevice.cpp
@@ -475,6 +475,12 @@ ndk::ScopedAStatus ShimDevice::getSupportedExtensions(std::vector<Extension>* ex

ndk::ScopedAStatus ShimDevice::getSupportedOperations(const Model& model,
std::vector<bool>* supportedOperations) {
+ const auto canonicalModel = ::android::nn::convert(model);
+ if (!canonicalModel.has_value()) {
+ LOG(ERROR) << "HAL model is invalid: " << canonicalModel.error().message;
+ return toAStatus(ErrorStatus::INVALID_ARGUMENT, canonicalModel.error().message);
+ }
+
const auto numOperations = model.main.operations.size();
supportedOperations->resize(numOperations);

@@ -546,6 +552,13 @@ ndk::ScopedAStatus ShimDevice::prepareModel(
return toAStatus(ErrorStatus::INVALID_ARGUMENT);
}

+ const auto canonicalModel = ::android::nn::convert(model);
+ if (!canonicalModel.has_value()) {
+ LOG(ERROR) << "HAL model is invalid: " << canonicalModel.error().message;
+ callback->notify(ErrorStatus::INVALID_ARGUMENT, nullptr);
+ return toAStatus(ErrorStatus::INVALID_ARGUMENT, canonicalModel.error().message);
+ }
+
ErrorStatus convertErrorStatus = ErrorStatus::NONE;
std::vector<uint8_t> copiedOperandValues;
auto modelAndMemory =
@@ -556,6 +569,16 @@ ndk::ScopedAStatus ShimDevice::prepareModel(
return toAStatus(convertErrorStatus);
}

+ std::vector<bool> supportedOps;
+ getSupportedOperations(model, &supportedOps);
+ bool allOpsSupported = std::all_of(supportedOps.cbegin(), supportedOps.cend(),
+ [](bool supported) { return supported; });
+
+ if (!allOpsSupported) {
+ callback->notify(ErrorStatus::INVALID_ARGUMENT, nullptr);
+ return ndk::ScopedAStatus::ok();
+ }
+
// b/185976051, past this point we pretend that compilation is asynchronous, and in
/// case of error we return OK status, but communicate the error through the callback.
auto compilation = ::android::nn::sl_wrapper::Compilation::createForDevice(
--
2.34.1

File renamed without changes.