wasm-micro-runtime
Request for TensorFlow Lite Workload Update
Dear developers,
I've been working with the TensorFlow Lite workload found in your repository (https://github.com/bytecodealliance/wasm-micro-runtime/blob/main/samples/workload/tensorflow), which currently uses TensorFlow v2.4. Kudos to @wustwn for the original implementation.
Considering the benefits of newer versions of TensorFlow Lite, I wanted to ask whether there are any plans to update this workload to be compatible with a more recent version, either by the WAMR team or directly by @wustwn?
Many thanks!
Cheers, Jämes
Thanks for asking. We'll take a look.
Hi lum1n0us,
Thanks for considering the TF-lite port upgrade. To provide more insights on why we would need this upgrade: we are evaluating multiple workloads for a scientific paper, with TF-lite being our top choice. However, the current version's missing features are a barrier to our benchmarking.
We truly appreciate your continued support and effort on this.
Cheers!
Hi there!
I would also be really interested in an update to a newer TensorFlow version! Some operations are missing, and the version used here is quite old now.
Any idea if it's feasible?
Kind regards, Nils
https://github.com/bytecodealliance/wasm-micro-runtime/pull/2369 is an ongoing PR. We've resolved all compilation problems; now we are working on issues caused by Emscripten's standalone support, e.g. https://github.com/emscripten-core/emscripten/blob/main/system/lib/standalone/standalone.c#L139 and https://github.com/emscripten-core/emscripten/blob/main/system/lib/standalone/standalone.c#L92.
We have a patch for getentropy():
https://github.com/bytecodealliance/wasm-micro-runtime/blob/main/samples/workload/XNNPACK/xnnpack.patch#L64-L71
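For reference, a getentropy() shim over WASI's random_get usually amounts to only a few lines. The sketch below is an illustration of the idea, assuming a wasi-libc style sysroot where <wasi/api.h> is available; the actual patch linked above may differ in its details.

```c
/* Minimal sketch (not the actual patch): route getentropy() to WASI.
 * Assumes <wasi/api.h> from wasi-libc is available in the sysroot. */
#include <errno.h>
#include <stddef.h>
#include <stdint.h>
#include <wasi/api.h>

int getentropy(void *buffer, size_t length)
{
    /* POSIX caps getentropy() requests at 256 bytes. */
    if (length > 256) {
        errno = EIO;
        return -1;
    }

    __wasi_errno_t err = __wasi_random_get((uint8_t *)buffer, length);
    if (err != __WASI_ERRNO_SUCCESS) {
        /* wasi-libc keeps WASI errno values aligned with the libc ones. */
        errno = err;
        return -1;
    }
    return 0;
}
```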
`openat` support is still ongoing; we plan to map it onto the WASI I/O system, which may take some time.
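To illustrate why openat is harder than getentropy(): it resolves a path relative to a directory file descriptor, which in WASI means going through path_open against a preopened directory and its rights. The sketch below is purely hypothetical and is not the approach taken in the PR; it only forwards the cases that plain open() can already serve and reports ENOSYS for genuinely dirfd-relative lookups.

```c
/* Hypothetical stopgap, NOT the PR's solution: a complete mapping would
 * resolve dirfd-relative paths via WASI path_open on a preopened directory.
 * This sketch only forwards the cases that plain open() can handle. */
#include <errno.h>
#include <fcntl.h>
#include <stdarg.h>
#include <sys/types.h>

int openat(int dirfd, const char *path, int flags, ...)
{
    mode_t mode = 0;
    if (flags & O_CREAT) {
        va_list ap;
        va_start(ap, flags);
        mode = (mode_t)va_arg(ap, int);
        va_end(ap);
    }

    /* Absolute paths and AT_FDCWD-relative paths do not need dirfd at all. */
    if (path[0] == '/' || dirfd == AT_FDCWD)
        return open(path, flags, mode);

    /* A real dirfd-relative lookup would need WASI path_open; not done here. */
    errno = ENOSYS;
    return -1;
}
```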
Hello @lum1n0us,
This looks super great! Many thanks for your effort! I'm looking forward to testing it when you think it's ready :)
Cheers
Hello @lum1n0us,
Were you able to resolve the challenges you encountered with Emscripten? I'm eager to try it out when I get the chance. :)
Thanks and Best regards!
@JamesMenetrey There is a workable draft PR. Please take a look and let me know your opinion.
@JamesMenetrey Hey, could you take a look ^? If the fix can be merged, it's one less worry for the WAMR-1.2.4 release.
Hello @lum1n0us and @g0djan,
Thanks for your work!
I have tried to build the tflite workload as described in the README using the remote lum1n0us:tflite_2.13, but executing `cmake --build build` leads to these errors:
❯ cmake --build build
[ 5%] Performing update step for 'tensorflow-download'
[ 11%] No configure step for 'tensorflow-download'
[ 16%] No build step for 'tensorflow-download'
[ 22%] No install step for 'tensorflow-download'
[ 27%] No test step for 'tensorflow-download'
[ 33%] Completed 'tensorflow-download'
[ 50%] Built target tensorflow-download
[ 55%] Generating benchmark_model.wasm ...
INFO: Options provided by the client:
Inherited 'common' options: --isatty=1 --terminal_columns=270
INFO: Reading rc options for 'build' from /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc:
Inherited 'common' options: --experimental_repo_remote_exec
INFO: Reading rc options for 'build' from /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc:
'build' options: --define framework_shared_object=true --define tsl_protobuf_header_only=true --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --define=with_xla_support=true --config=short_logs --config=v2 --define=no_aws_support=true --define=no_hdfs_support=true --experimental_cc_shared_library --experimental_link_static_libraries_once=false --incompatible_enforce_config_setting_visibility
INFO: Reading rc options for 'build' from /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc.user:
'build' options: --incompatible_enable_cc_toolchain_resolution
INFO: Reading rc options for 'build' from /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc:
'build' options: --deleted_packages=tensorflow/compiler/mlir/tfrt,tensorflow/compiler/mlir/tfrt/benchmarks,tensorflow/compiler/mlir/tfrt/jit/python_binding,tensorflow/compiler/mlir/tfrt/jit/transforms,tensorflow/compiler/mlir/tfrt/python_tests,tensorflow/compiler/mlir/tfrt/tests,tensorflow/compiler/mlir/tfrt/tests/ir,tensorflow/compiler/mlir/tfrt/tests/analysis,tensorflow/compiler/mlir/tfrt/tests/jit,tensorflow/compiler/mlir/tfrt/tests/lhlo_to_tfrt,tensorflow/compiler/mlir/tfrt/tests/lhlo_to_jitrt,tensorflow/compiler/mlir/tfrt/tests/tf_to_corert,tensorflow/compiler/mlir/tfrt/tests/tf_to_tfrt_data,tensorflow/compiler/mlir/tfrt/tests/saved_model,tensorflow/compiler/mlir/tfrt/transforms/lhlo_gpu_to_tfrt_gpu,tensorflow/core/runtime_fallback,tensorflow/core/runtime_fallback/conversion,tensorflow/core/runtime_fallback/kernel,tensorflow/core/runtime_fallback/opdefs,tensorflow/core/runtime_fallback/runtime,tensorflow/core/runtime_fallback/util,tensorflow/core/tfrt/eager,tensorflow/core/tfrt/eager/backends/cpu,tensorflow/core/tfrt/eager/backends/gpu,tensorflow/core/tfrt/eager/core_runtime,tensorflow/core/tfrt/eager/cpp_tests/core_runtime,tensorflow/core/tfrt/gpu,tensorflow/core/tfrt/run_handler_thread_pool,tensorflow/core/tfrt/runtime,tensorflow/core/tfrt/saved_model,tensorflow/core/tfrt/graph_executor,tensorflow/core/tfrt/saved_model/tests,tensorflow/core/tfrt/tpu,tensorflow/core/tfrt/utils,tensorflow/core/tfrt/utils/debug
INFO: Found applicable config definition build:short_logs in file /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc: --output_filter=DONT_MATCH_ANYTHING
INFO: Found applicable config definition build:v2 in file /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:wasm in file /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc.user: --copt=-Wno-unused --copt=-Wno-unused-function --copt=-Wno-unused-but-set-variable --cxxopt=-std=c++17 --host_cxxopt=-std=c++17
INFO: Found applicable config definition build:linux in file /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc: --define=build_with_onednn_v2=true --host_copt=-w --copt=-Wno-all --copt=-Wno-extra --copt=-Wno-deprecated --copt=-Wno-deprecated-declarations --copt=-Wno-ignored-attributes --copt=-Wno-array-bounds --copt=-Wunused-result --copt=-Werror=unused-result --copt=-Wswitch --copt=-Werror=switch --copt=-Wno-error=unused-but-set-variable --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --define=PROTOBUF_INCLUDE_PATH=$(PREFIX)/include --cxxopt=-std=c++17 --host_cxxopt=-std=c++17 --config=dynamic_kernels --experimental_guard_against_concurrent_changes
INFO: Found applicable config definition build:dynamic_kernels in file /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/.bazelrc: --define=dynamic_loaded_kernels=true --copt=-DAUTOLOAD_DYNAMIC_KERNELS
WARNING: Download from https://storage.googleapis.com/mirror.tensorflow.org/github.com/llvm/llvm-project/archive/dc275fd03254d67d29cc70a5a0569acf24d2280d.tar.gz failed: class java.io.FileNotFoundException GET returned 404 Not Found
WARNING: Download from https://storage.googleapis.com/mirror.tensorflow.org/github.com/tensorflow/runtime/archive/7d879c8b161085a4374ea481b93a52adb19c0529.tar.gz failed: class java.io.FileNotFoundException GET returned 404 Not Found
ERROR: Skipping '//tensorflow/lite/tools/benchmark:benchmark_model-wasm': no such target '//tensorflow/lite/tools/benchmark:benchmark_model-wasm': target 'benchmark_model-wasm' not declared in package 'tensorflow/lite/tools/benchmark' (did you mean 'benchmark_model_main'?) defined by /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/tensorflow/lite/tools/benchmark/BUILD
WARNING: Target pattern parsing failed.
ERROR: no such target '//tensorflow/lite/tools/benchmark:benchmark_model-wasm': target 'benchmark_model-wasm' not declared in package 'tensorflow/lite/tools/benchmark' (did you mean 'benchmark_model_main'?) defined by /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/tensorflow/lite/tools/benchmark/BUILD
INFO: Elapsed time: 0.121s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
make[2]: *** [CMakeFiles/benchmark_model.dir/build.make:61: benchmark_model.wasm] Error 1
make[1]: *** [CMakeFiles/Makefile2:80: CMakeFiles/benchmark_model.dir/all] Error 2
make: *** [Makefile:84: all] Error 2
Did I miss something for building it properly?
Observations while working on this PR:
- the README mentions the following build steps for `iwasm`:
$ cd <wamr_dir>
$ cd product-mini/platforms/linux
> [!TODO]
> lib-pthread
$ cmake -S . -B build -DWAMR_BUILD_LIBC_EMCC=1 <other cmake options>
$ cmake --build build --target iwasm
It wasn't clear to me whether I needed to build WAMR with pthread enabled. I ended up building `iwasm` with `cmake -S . -B build -DWAMR_BUILD_LIBC_EMCC=1 -DWAMR_BUILD_LIB_PTHREAD=1`.
- Pulling the latest version of bazel (6.4.0) leads to an error indicating that building tensorflow requires Bazel 5.3.0. Hence, I had to install this version explicitly (`apt install bazel-5.3.0`) and create a symbolic link for it (`ln -s /usr/bin/bazel-5.3.0 /usr/bin/bazel`).
Thanks!
It seems `bazel build` didn't work as we expected.
WARNING: Download from https://storage.googleapis.com/mirror.tensorflow.org/github.com/llvm/llvm-project/archive/dc275fd03254d67d29cc70a5a0569acf24d2280d.tar.gz failed: class java.io.FileNotFoundException GET returned 404 Not Found
WARNING: Download from https://storage.googleapis.com/mirror.tensorflow.org/github.com/tensorflow/runtime/archive/7d879c8b161085a4374ea481b93a52adb19c0529.tar.gz failed: class java.io.FileNotFoundException GET returned 404 Not Found
ERROR: Skipping '//tensorflow/lite/tools/benchmark:benchmark_model-wasm': no such target '//tensorflow/lite/tools/benchmark:benchmark_model-wasm': target 'benchmark_model-wasm' not declared in package 'tensorflow/lite/tools/benchmark' (did you mean 'benchmark_model_main'?) defined by /home/ubuntu/dev/test_tflite_with_wamr/samples/workload/tensorflow/tensorflow/tensorflow/lite/tools/benchmark/BUILD
My guesses:
- was the patch applied successfully?
- did bazel download all dependencies successfully?
In order to figure that out:
- go to the tensorflow source directory (it should be /samples/workload/tensorflow/tensorflow) and use `git status` to check the modifications
- use `bazel query --output label_kind "//tensorflow/lite/..." | grep benchmark_model-wasm` to make sure bazel has all the necessary targets
Hello @lum1n0us,
Thanks for your response, and I'm sorry for the delay. I have found the issue that caused the compilation to fail and described it directly in the PR: https://github.com/bytecodealliance/wasm-micro-runtime/pull/2369#pullrequestreview-1909303007.
I suggest we continue to discuss the changes over there.
Cheers