
Benchmarking how-to

brianjjones opened this issue on Jun 10 '22 · 3 comments

All the info needed to install and set up the required dependencies and benchmark wasi-nn performance on any supported hardware.

brianjjones · Jun 10 '22 23:06

Installing needed dependencies for Ubuntu 20.04

wasi-nn

git clone https://github.com/bytecodealliance/wasi-nn.git
cd wasi-nn
## The benchmarking work currently lives in this branch
git checkout performance

Wasmtime

curl https://wasmtime.dev/install.sh -sSf | bash
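
The installer puts the wasmtime binary under ~/.wasmtime/bin (the same directory used later in this guide). To confirm the install worked, open a new shell (or source your profile) and run:

# Should print the installed Wasmtime version
wasmtime --version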

Build tools

sudo apt install build-essential
sudo apt install libssl-dev
sudo apt install pkg-config

OpenVINO

wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
echo "deb https://apt.repos.intel.com/openvino/2022 focal main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list
sudo apt update
sudo apt install openvino
sudo ln -s /opt/intel/openvino_2022.1.0.643 /opt/intel/openvino
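
Before building, it's worth confirming that the symlink resolves and that the environment script can be sourced. A quick check (the exact versioned directory name may differ on your system):

# Should resolve to the versioned install directory, e.g. openvino_2022.1.0.643
ls -l /opt/intel/openvino
# Should report that the OpenVINO environment was initialized
source /opt/intel/openvino/setupvars.sh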

Then you'll need to edit wasi-nn/build.sh and change source /opt/intel/openvino/bin/setupvars.sh to source /opt/intel/openvino/setupvars.sh (the 2022 release moved setupvars.sh out of the bin directory).
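
If you'd rather not edit the file by hand, a one-liner along these lines should do it (this assumes build.sh still contains the exact path shown above):

cd wasi-nn
# Replace the old setupvars.sh path with the new one, in place
sed -i 's|/opt/intel/openvino/bin/setupvars.sh|/opt/intel/openvino/setupvars.sh|' build.sh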

TensorFlow

FILENAME=libtensorflow-cpu-linux-x86_64-2.8.0.tar.gz
wget -q --no-check-certificate https://storage.googleapis.com/tensorflow/libtensorflow/${FILENAME}
sudo tar -C /usr/local -xzf ${FILENAME}
sudo ldconfig /usr/local/lib
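
To confirm the shared libraries were extracted and registered with the loader, you can check the ldconfig cache:

# libtensorflow.so (and libtensorflow_framework.so) should show up here
ldconfig -p | grep libtensorflow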

NOTE: You'll need to build Wasmtime yourself for TensorFlow support; see below for details.

Wasmtime w/TensorFlow support

git clone https://github.com/brianjjones/wasmtime.git
cd wasmtime
git checkout tf_runtime_linked
cargo build --release
cp target/release/wasmtime ~/.wasmtime/bin/wasmtime-tf
cd ~/.wasmtime/bin
mv wasmtime wasmtime-orig
ln -s wasmtime-tf wasmtime
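
This swaps the system-wide wasmtime for the TensorFlow-enabled build while keeping the original around. If you want to switch back later, repoint the symlink (same paths as set up above):

cd ~/.wasmtime/bin
# Restore the original runtime (repoint to wasmtime-tf again to switch back)
ln -sf wasmtime-orig wasmtime
wasmtime --version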

brianjjones · Jun 10 '22 23:06

Running the benchmark in wasi-nn

The build.sh script is currently used to run the benchmark. It accepts a number of flags that control how the benchmark is run.

Flags

  • -b Backend to use, either openvino or tensorflow. Defaults to openvino
  • -m Model to use, either mobilenet_v2 or inception_v3. Defaults to mobilenet_v2
  • -l Number of times to run the benchmark. Default is 1
  • -t Build type, either rust or as. Defaults to rust
  • -o Out directory to save *.csv files to. Defaults to rust/examples/classification-example/build/RESULTS
  • -c CPU info, whatever text you want it labeled as. Defaults to UNKNOWN
  • -s Select the beginning CPU to use. Defaults to 0
  • -e Select the ending CPU to use. Defaults to 0
  • -z Thread test batch size. Use this with -j to run the benchmark -z times, upping the threads by -j each time. Default is 1
  • -j Thread jump size. Default is 8

Examples

Run with an increasing number of CPUs. Here the backend is OpenVINO, the model is Inception, each run loops 50 times, the name used for the graph will be "Desktop", and it starts with one CPU (the start and end CPU are the same). The entire test (50 loops) is run 10 times, adding 2 CPUs each time. Once graphed, this will show you which number of CPUs offers the best performance: 1, 3, 5, 7, 9, 11, 13, 15, 17, or 19 CPUs.

./build.sh -b openvino -m inception_v3 -l 50  -c "Desktop" -s 0 -e 0  -z 10 -j 2

Run with a fixed number of CPUs, in this case 32 (CPUs 0 through 31). It will loop 1000 times.

./build.sh -b openvino -m mobilenet_v2 -l 1000  -c "Desktop" -s 0 -e 31
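
If you want a fixed run to use every core on the machine rather than hard-coding 32, one option (assuming -s/-e accept any valid CPU index) is to derive the last CPU index from nproc:

# Use all available CPUs: the end index is the core count minus one
./build.sh -b openvino -m mobilenet_v2 -l 1000 -c "Desktop" -s 0 -e $(( $(nproc) - 1 ))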

brianjjones · Jun 11 '22 00:06

Graphing

Installing dependencies

pip install matplotlib
pip install numpy

Creating graphs

WARNING! Currently these scripts overwrite the existing *.png each time you run them, so if you want to keep a chart, rename it first (see the snippet after the list below).

  • If you ran with an increasing number of CPUs and want a bar chart, run python3 barchart.py. This creates barchart.png
  • To create a line chart using all the results currently in RESULTS, run python3 linechart.py. This creates linechart.png
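
Since the scripts overwrite their output, one simple way to keep a result before the next run is to copy it to a timestamped name, for example:

# Keep a copy of the current chart before regenerating it
cp barchart.png "barchart_$(date +%Y%m%d_%H%M%S).png"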

brianjjones · Jun 11 '22 00:06