Build TVM Docker Container Environment
Build the TVM Docker container to ensure we have the same environment.
(You can skip this section if you already know how to install the dependencies and tvm4j, and you are familiar with the layout of the TVM source tree.)
Install Docker. https://docs.docker.com/install/
Clone the TVM repo.
$ git clone --depth 1 https://github.com/apache/incubator-tvm.git tvm
Build the Docker image using the Dockerfile Dockerfile.demo_android in the folder tvm/docker.
$ cd tvm/docker/
$ bash ./build.sh demo_android -it bash
Exit the temporary container with Ctrl+D. Then create the TVM Docker container and attach to it.
$ docker run -it --name tvm tvm.demo_android
$ docker start tvm && docker attach tvm
Install tvm4j.
$ apt install maven
$ cd /usr/tvm/
$ make jvmpkg
$ make jvminstall
Test the Model Running Well on TVM
Copy the ONNX model file into the Docker container using docker cp.
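For example, run this on the host (the container name tvm comes from the docker run command above; model.onnx is a hypothetical filename, so adjust both to your setup):

```shell
# Run on the host, outside the container.
# 'tvm' is the container name from `docker run --name tvm`;
# 'model.onnx' is a placeholder for your model file.
docker cp model.onnx tvm:/usr/tvm/model.onnx
```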
Install onnx.
$ pip3 install onnx
Run the script below to test the model.
import onnx
import numpy as np
import tvm
import tvm.relay as relay
from tvm.contrib import graph_runtime

# Change this to match the input of your model.
input = np.ones([1, 3, 256, 256])
# Change this to match the filename of your model.
onnx_model = onnx.load('model.onnx')
# Change this to match the shape of input of your model.
x = np.ones((1, 3, 256, 256))
# Change this to match the input name of your model.
input_name = 'input.1'
target = 'llvm'
shape_dict = {input_name: x.shape}
sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)
ctx = tvm.context(target, 0)
with relay.build_config(opt_level=0):
    intrp = relay.build_module.create_executor('graph', sym, ctx, target)
with relay.build_config(opt_level=2):
    graph, lib, params = relay.build_module.build(sym, target, params=params)
dtype = np.float32
module = graph_runtime.create(graph, lib, ctx)
module.set_input(**params)
module.set_input(input_name, tvm.nd.array(input.astype(dtype)))
module.run()
output = module.get_output(0).asnumpy()
# May change this to match the output type of your model.
print(output)
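The script above feeds an all-ones tensor. If you want to sanity-check with an image-like input instead, a plain-NumPy sketch of typical NCHW preprocessing looks like this (the normalization to [0, 1] is an assumption; use whatever preprocessing your model was trained with):

```python
import numpy as np

# Fake 256x256 RGB image in HWC layout; replace this with a real image
# loaded via PIL or OpenCV.
img = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)

x = img.astype(np.float32) / 255.0  # scale pixel values to [0, 1] (assumed)
x = np.transpose(x, (2, 0, 1))      # HWC -> CHW
x = x[np.newaxis, :]                # add batch dim -> (1, 3, 256, 256)
print(x.shape)
```

The resulting array can be passed to module.set_input in place of the all-ones tensor.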
Cross-compile the Model
Run the script below and you will get three files (model.so, model.json, model.params).
import onnx
import numpy as np
import tvm
import tvm.relay as relay
# Change this to match the filename of your model.
onnx_model = onnx.load('model.onnx')
# Change this to match the shape of input of your model.
x = np.ones((1, 3, 256, 256))
# Change this to match the input name of your model.
input_name = 'input.1'
arch = 'arm64'
target = 'llvm -target=%s-linux-android' % arch
shape_dict = {input_name: x.shape}
sym, params = relay.frontend.from_onnx(onnx_model, shape=shape_dict)
with relay.build_config(opt_level=0):
    intrp = relay.build_module.create_executor('graph', sym, tvm.cpu(0), target)
with relay.build_config(opt_level=2):
    graph, lib, params = relay.build_module.build(sym, target, params=params)
libpath = 'model.so'
# Change the parameter `cc` to match the architecture of your phone.
# You can run `adb shell cat /proc/cpuinfo` to list the info of your CPU.
# This is for Android SDK 28 (Pie) and CPU is aarch64.
lib.export_library(libpath, cc='/opt/android-sdk-linux/ndk-bundle/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android28-clang')
graph_json_path = 'model.json'
with open(graph_json_path, 'w') as fo:
    fo.write(graph)
param_path = 'model.params'
with open(param_path, 'wb') as fo:
    fo.write(relay.save_param_dict(params))
Write the Android Program
In the folder tvm/apps/android_deploy, you will find an example app provided by TVM. You can compile the Android program first to see what each function does, or you can modify the files according to README.md. Moreover, here is an Android program in which I deployed the style transfer models trained by Tony Tseng.
Compile the Android Program
Change the directory to the root of the Android program.
$ cd /usr/tvm/apps/android_deploy
Generate the APK file.
$ gradle clean build --no-daemon
Create the key used to sign the APK if you don't have one.
$ bash ./dev_tools/gen_keystore.sh
Sign the APK file.
$ bash ./dev_tools/sign_apk.sh
The signed APK file will be ./app/build/outputs/apk/release/tvmdemo-release.apk.
Copy the APK file from the Docker container.
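As with copying the model in, docker cp works in the other direction. Run this on the host (container name tvm as before; the APK path combines the android_deploy directory with the build output path above):

```shell
# Run on the host, outside the container.
docker cp tvm:/usr/tvm/apps/android_deploy/app/build/outputs/apk/release/tvmdemo-release.apk .
```

You can then install it on your phone with adb install tvmdemo-release.apk.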