
No available targets are compatible with triple "wasm32-unknown-unknown-wasm"

Open davidar opened this issue 1 year ago • 5 comments

I'm stuck on the following error:

$ python3 build.py --target webgpu --debug-dump
Load cached module from dist/vicuna-7b-v1/mod_cache_before_build.pkl and skip tracing. You can use --use-cache=0 to retrace
Dump mod to dist/vicuna-7b-v1/debug/mod_before_build.py
Dump mod to dist/vicuna-7b-v1/debug/mod_build_stage.py
Traceback (most recent call last):
  File "/workspaces/web-llm/build.py", line 200, in <module>
    build(mod, ARGS)
  File "/workspaces/web-llm/build.py", line 166, in build
    ex = relax.build(mod_deploy, args.target)
  File "/home/vscode/.local/lib/python3.10/site-packages/tvm/relax/vm_build.py", line 325, in build
    return _vmlink(builder, target, tir_mod, ext_libs, params)
  File "/home/vscode/.local/lib/python3.10/site-packages/tvm/relax/vm_build.py", line 239, in _vmlink
    lib = tvm.build(tir_mod, target=target, runtime=_autodetect_system_lib_req(target))
  File "/home/vscode/.local/lib/python3.10/site-packages/tvm/driver/build_module.py", line 281, in build
    rt_mod_host = _driver_ffi.tir_to_runtime(annotated_mods, target_host)
  File "tvm/_ffi/_cython/./packed_func.pxi", line 331, in tvm._ffi._cy3.core.PackedFuncBase.__call__
  File "tvm/_ffi/_cython/./packed_func.pxi", line 262, in tvm._ffi._cy3.core.FuncCall
  File "tvm/_ffi/_cython/./packed_func.pxi", line 251, in tvm._ffi._cy3.core.FuncCall3
  File "tvm/_ffi/_cython/./base.pxi", line 181, in tvm._ffi._cy3.core.CHECK_CALL
tvm._ffi.base.TVMError: Traceback (most recent call last):
  6: TVMFuncCall
  5: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target)>::AssignTypedLambda<tvm::{lambda(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target)#6}>(tvm::{lambda(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target)#6}, std::string)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  4: tvm::TIRToRuntime(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target const&)
  3: tvm::codegen::Build(tvm::IRModule, tvm::Target)
  2: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::IRModule, tvm::Target)>::AssignTypedLambda<tvm::codegen::{lambda(tvm::IRModule, tvm::Target)#1}>(tvm::codegen::{lambda(tvm::IRModule, tvm::Target)#1}, std::string)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  1: tvm::codegen::LLVMModuleNode::Init(tvm::IRModule const&, tvm::Target const&)
  0: tvm::codegen::LLVMTargetInfo::GetOrCreateTargetMachine(bool)
  File "/workspace/tvm/src/target/llvm/llvm_instance.cc", line 302
TVMError: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (target_machine_ != nullptr) is false: No available targets are compatible with triple "wasm32-unknown-unknown-wasm"

I found a reference to this error here, but since I'm using the nightly TVM build recommended in the README, I'm not sure what I can do to fix this.
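For reference, one way to check whether the LLVM that TVM links against was built with the WebAssembly backend is to list LLVM's registered targets via `llvm-config` (a sketch, assuming `llvm-config` is on your PATH; on some distros the binary is versioned, e.g. `llvm-config-15`):

```shell
# List the target backends this LLVM was built with and look for WebAssembly;
# if nothing matches, this LLVM cannot compile for wasm32 triples.
llvm-config --targets-built 2>/dev/null | tr ' ' '\n' | grep -i webassembly \
  || echo "WebAssembly backend missing"
```

Note this only inspects the system `llvm-config`; the nightly TVM wheel bundles its own LLVM, which is why rebuilding TVM against a wasm-capable LLVM is the suggested fix below.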

davidar avatar Apr 23 '23 05:04 davidar

I got a similar error; I tested on both WSL2 and Ubuntu 22.04.

LiliumSancta avatar Apr 28 '23 14:04 LiliumSancta

I have the same problem. It occurs on CentOS.

takfate avatar Apr 29 '23 19:04 takfate

Same issue. Really need some help over here.

Identical error.

dan9070 avatar Apr 30 '23 22:04 dan9070

This likely has to do with the bundled LLVM not shipping with the wasm backend. You can try swapping out LLVM and building from source from a branch, e.g. here: https://github.com/mlc-ai/relax

tqchen avatar Apr 30 '23 22:04 tqchen

I encountered the same issue on my Ubuntu 22.04 instance. Building relax from source resolved it for me; you can refer to the guide at https://tvm.apache.org/docs/install/from_source.html#developers-get-source-from-github. I also followed the steps in the script https://github.com/mlc-ai/relax/blob/mlc/tests/scripts/task_web_wasm.sh to build the web wasm target.
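A rough sketch of that from-source build, assuming the mlc-ai/relax branch follows mainline TVM's cmake layout (the llvm-config path and version are assumptions; adjust for your system):

```shell
# Sketch: build TVM (mlc-ai/relax branch) against an LLVM that includes
# the WebAssembly backend.
git clone --recursive https://github.com/mlc-ai/relax tvm-unity
cd tvm-unity && mkdir -p build && cd build
cp ../cmake/config.cmake .
# USE_LLVM must point at an llvm-config whose LLVM was configured with
# a target list that includes WebAssembly (assumed path below).
echo 'set(USE_LLVM /usr/lib/llvm-15/bin/llvm-config)' >> config.cmake
cmake .. && make -j"$(nproc)"
```

After the build finishes, the usual TVM from-source step applies: point `PYTHONPATH` at `tvm-unity/python` so `build.py` picks up the new build instead of the nightly wheel.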

yongwww avatar May 10 '23 22:05 yongwww