Lite Architecture Benchmark

Benchmark v0.1

MobileNet V1

|                    | 1 thread   | 2 threads  | 4 threads |
|--------------------|------------|------------|-----------|
| Qualcomm 435       | 243.05 ms  | 123.647 ms | 84.673 ms |
| Qualcomm 625       | 183.97 ms  | 94.495 ms  | 54.256 ms |
| Qualcomm 835       | 91.071 ms  | 48.809 ms  | 27.581 ms |
| Qualcomm 845       | 64.3953 ms | 33.816 ms  | 20.131 ms |
| MTK Helio P60      | 237.793 ms | 121.988 ms | 68.516 ms |
| ARM Linux (RK3399) | 120.819 ms | 65.0624 ms | --        |

MobileNet V2

|                    | 1 thread   | 2 threads  | 4 threads |
|--------------------|------------|------------|-----------|
| Qualcomm 435       | 175.27 ms  | 89.214 ms  | 61.344 ms |
| Qualcomm 625       | 137.464 ms | 73.499 ms  | 46.679 ms |
| Qualcomm 835       | 60.135 ms  | 38.197 ms  | 21.134 ms |
| Qualcomm 845       | 42.81 ms   | 23.581 ms  | 14.114 ms |
| MTK Helio P60      | 184.172 ms | 97.292 ms  | 56.531 ms |
| ARM Linux (RK3399) | 95.654 ms  | 51.881 ms  | --        |

Inception V4

|                    | 1 thread   | 2 threads  | 4 threads  |
|--------------------|------------|------------|------------|
| Qualcomm 435       | 2545.87 ms | 1325.26 ms | 925.012 ms |
| Qualcomm 625       | 2021.72 ms | 1042.22 ms | 614.647 ms |
| Qualcomm 835       | 895.008 ms | 527.226 ms | 326.85 ms  |
| Qualcomm 845       | 647.563 ms | 390.266 ms | 259.983 ms |
| MTK Helio P60      | 2393.99 ms | 1284.46 ms | 745.351 ms |
| ARM Linux (RK3399) | 1141.35 ms | 659.916 ms | --         |

ResNet50

|                    | 1 thread   | 2 threads  | 4 threads  |
|--------------------|------------|------------|------------|
| Qualcomm 435       | 1779.96 ms | 907.695 ms | 594.149 ms |
| Qualcomm 625       | 1401.76 ms | 714.77 ms  | 402.68 ms  |
| Qualcomm 835       | 656.405 ms | 374.671 ms | 217.913 ms |
| Qualcomm 845       | 454.95 ms  | 254.511 ms | 170.833 ms |
| MTK Helio P60      | 1696.32 ms | 886 ms     | 485.072 ms |
| ARM Linux (RK3399) | 834.659 ms | 461.797 ms | --         |

To reproduce the benchmark results above, follow these steps:

1. Check out the latest Paddle-Lite mobile code.
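Step 1 does not list a command. A minimal sketch, assuming the code is fetched with git; the repository URL and the branch that hosts the paddle/fluid/lite code are placeholders, substitute whatever currently applies:

git clone <paddle-lite-repo-url> paddle-lite   # URL of the repository hosting paddle/fluid/lite
cd paddle-lite
git checkout <lite-development-branch>         # branch containing the latest lite mobile code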

2. Go to the Paddle-Lite source directory and build the Docker image:

cd <paddle-lite-repo>
docker build --file paddle/fluid/lite/tools/Dockerfile.mobile --tag paddle-lite-mobile:latest .
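If the build succeeds, the image should appear in the local image list; docker images is a standard Docker command, nothing Paddle-Lite specific:

docker images paddle-lite-mobile   # should list the paddle-lite-mobile:latest image built above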
3. Start the Docker container. This starts a container from the image built above and mounts the Paddle-Lite source directory at /paddle-lite inside the container:
docker run -it --name <your_container_name> --net=host --privileged -v $(pwd):/paddle-lite paddle-lite-mobile bash
4. Paddle-Lite uses CMake to build the benchmark program. In the Docker shell, go to the Paddle-Lite source directory and create a build directory:
cd /paddle-lite && mkdir build.armv8 && cd build.armv8
5. Run CMake; this generates the Makefiles for the armv8 target:

cmake .. \
    -DWITH_GPU=OFF \
    -DWITH_LITE=ON \
    -DLITE_WITH_CUDA=OFF \
    -DLITE_WITH_X86=OFF \
    -DLITE_WITH_ARM=ON \
    -DLITE_WITH_LIGHT_WEIGHT_FRAMEWORK=ON \
    -DWITH_TESTING=ON \
    -DWITH_MKL=OFF \
    -DARM_TARGET_OS="android" \
    -DARM_TARGET_ARCH_ABI="arm64-v8a"
6. Run the make command:
make -j 4
7. The built benchmark binary is <paddle-lite-repo>/build.armv8/paddle/fluid/lite/api/test_model_bin. Use adb to push the binary to your Android phone. After connecting the phone to your computer, run the following commands outside the Docker shell, from the Paddle-Lite source directory:
adb push ./build.armv8/paddle/fluid/lite/api/test_model_bin /data/local/tmp/
adb shell chmod +x /data/local/tmp/test_model_bin
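Before pushing, it is worth verifying that adb actually sees the phone; adb devices is standard Android tooling:

adb devices   # the phone should appear with state "device" (not "unauthorized" or "offline")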
8. Now push a test model to your phone. Currently, four models are supported:
| Model        | Download link |
|--------------|---------------|
| MobileNet V1 | http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz |
| MobileNet V2 | http://paddle-inference-dist.bj.bcebos.com/mobilenet_v2_relu.tar.gz |
| Inception V4 | http://paddle-inference-dist.bj.bcebos.com/inception_v4.tar.gz |
| ResNet50     | http://paddle-inference-dist.bj.bcebos.com/resnet50.tar.gz |

Download the models you need; the rest of this walkthrough uses MobileNet V1 as the example.
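The tarball can be fetched with wget (any HTTP client works; the URL is the one from the table above), then extracted and pushed to the phone with the commands that follow:

wget http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz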

tar xzvf mobilenet_v1.tar.gz
adb push mobilenet_v1 /data/local/tmp/mobilenet_v1
9. Start the benchmark program:
adb shell /data/local/tmp/test_model_bin /data/local/tmp/mobilenet_v1 50 1
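To reproduce the 1/2/4-thread columns of the tables at the top of this page, the run has to be repeated with different thread counts. A minimal sketch, assuming the two trailing arguments of test_model_bin are the repeat count and the thread count (the example above ends in "50 1", which matches the 1-thread column; check the test_model_bin sources if your version uses a different argument order):

# Assumption: arguments are <model_dir> <repeat_count> <thread_count>.
for threads in 1 2 4; do
    adb shell /data/local/tmp/test_model_bin /data/local/tmp/mobilenet_v1 50 $threads
done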