
[Bug] Execute debug_compare.py Failed with Error InternalError: Check failed: (code == RPCCode::kReturn) is false: code=kShutdown

Msyu1020 opened this issue 1 year ago · 0 comments

🐛 Bug

I want to profile operators on mobile devices. I installed the TVM RPC app on the device and successfully ran \tvm-unity\apps\android_rpc\tests\android_rpc_test.py. However, when I try to run debug_compare.py, it fails with an InternalError.

Besides, debug_compare.py requires the model lib file to end with '.so' when --cmp-device is android, yet the prebuilt Android libs end with '.tar': https://github.com/mlc-ai/binary-mlc-llm-libs/tree/main/Llama-2-7b-chat-hf
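As a side note, the two formats are easy to tell apart programmatically. Here is a small, hypothetical helper (not part of debug_compare.py) that distinguishes a tar archive of object files from an ELF shared library by inspecting the file rather than its extension:

```python
import tarfile

def classify_model_lib(path):
    """Heuristic: is this model lib a tar archive of object files
    (as the prebuilt Android libs appear to be) or an ELF shared
    library loadable via load_module?"""
    if tarfile.is_tarfile(path):
        return "tar-archive"
    with open(path, "rb") as f:
        magic = f.read(4)  # ELF files start with the bytes 0x7f 'E' 'L' 'F'
    return "shared-library" if magic == b"\x7fELF" else "unknown"
```

A check like this would make the mismatch obvious before the RPC call is even attempted.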

To Reproduce

Steps to reproduce the behavior:

  1. Run python -m tvm.exec.rpc_tracker --host=192.168.3.250 --port=9191 on the PC; it reports INFO bind to 192.168.3.250:9191
  2. Set the TVM RPC app config on the device to the same host=192.168.3.250 and port=9191
  3. Run python debug_compare.py "hello" --generate-len 32 --model .\Llama-2-7b-chat-hf-q4f16_1-MLC --model-lib .\Llama-2-7b-chat-hf-q4f16_1-vulkan.so --debug-dir D:\output --cmp-device android --cmp-lib-path .\Llama-2-7b-chat-hf-q4f16_1-android.tar --time-eval
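Before step 3, it can help to confirm the device actually registered with the tracker. A minimal sketch, assuming the RPC key configured in the app is "android" (the connect function is injected as a parameter so the helper is easy to exercise without hardware; in practice you would pass tvm.rpc.connect_tracker):

```python
def check_tracker(connect_tracker, host="192.168.3.250", port=9191, key="android"):
    """Return True if the tracker's text summary lists a server
    registered under `key`. `connect_tracker` is expected to behave
    like tvm.rpc.connect_tracker; host/port match step 1 above."""
    tracker = connect_tracker(host, port)
    return key in tracker.text_summary()
```

If `check_tracker(tvm.rpc.connect_tracker)` returns False, the app never reached the tracker and debug_compare.py cannot get a session.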

Expected behavior

debug_compare.py should load the compare lib on the remote Android device and run the comparison. Instead, the command in step 3 fails with the following traceback:
Traceback (most recent call last):
  File "D:\condaFiles\mlc-llm\python\mlc_llm\testing\debug_compare.py", line 255, in <module>
    main()
  File "D:\condaFiles\mlc-llm\python\mlc_llm\testing\debug_compare.py", line 236, in main
    instrument = get_instrument(parsed)
                 ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\condaFiles\mlc-llm\python\mlc_llm\testing\debug_compare.py", line 169, in get_instrument
    lib = sess.load_module(os.path.basename(args.cmp_lib_path))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\miniconda\envs\mlc-chat-new\Lib\site-packages\tvm\rpc\client.py", line 178, in load_module
    return _ffi_api.LoadRemoteModule(self._sess, path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\miniconda\envs\mlc-chat-new\Lib\site-packages\tvm\_ffi\_ctypes\packed_func.py", line 240, in __call__
    raise_last_ffi_error()
  File "D:\miniconda\envs\mlc-chat-new\Lib\site-packages\tvm\_ffi\base.py", line 481, in raise_last_ffi_error
    raise py_err
tvm._ffi.base.TVMError: Traceback (most recent call last):
  File "D:\a\package\package\tvm\src\runtime\rpc\rpc_endpoint.cc", line 871
InternalError: Check failed: (code == RPCCode::kReturn) is false: code=kShutdown
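For context, the failing call is the load_module step of the usual TVM RPC pattern: the lib is first uploaded to the server's temp directory and then loaded by its basename. A stripped-down sketch of that pairing (`sess` would be a tvm.rpc session; the kShutdown code suggests the remote server dropped the connection before replying to the load request):

```python
import os

def remote_load(sess, lib_path):
    """Upload lib_path to the RPC server, then load it by basename --
    the same sequence the traceback above is executing when it fails.
    `sess` is expected to behave like a tvm.rpc.RPCSession."""
    sess.upload(lib_path)  # copy the file into the device's temp dir
    return sess.load_module(os.path.basename(lib_path))
```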

Environment

  • Platform (e.g. WebGPU/Vulkan/IOS/Android/CUDA): Android
  • Operating system (e.g. Ubuntu/Windows/MacOS/...): Windows
  • Device (e.g. iPhone 12 Pro, PC+RTX 3090, ...): vivo Pad 3 pro
  • How you installed MLC-LLM (conda, source): conda
  • How you installed TVM-Unity (pip, source): pip
  • Python version (e.g. 3.10): 3.11
  • GPU driver version (if applicable):
  • CUDA/cuDNN version (if applicable):
  • TVM Unity Hash Tag (python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))", applicable if you compile models):
  • Any other relevant information:

Additional context

Output of executing \tvm-unity\apps\android_rpc\tests\android_rpc_test.py:

Run CPU test ...
4.33231e-05 secs/op

Run GPU(OpenCL Flavor) test ...
1.62521e-05 secs/op
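The secs/op figures come from TVM's time_evaluator; a rough sketch of the pattern that produces them (mirroring, not quoting, android_rpc_test.py; `func` is a module loaded over RPC and `dev` a remote device):

```python
def mean_secs_per_op(func, dev, args, number=10):
    """Run the module's entry function `number` times on remote
    device `dev` and return the mean seconds per op, the figure
    printed in the output above."""
    timer = func.time_evaluator(func.entry_name, dev, number=number)
    return timer(*args).mean
```

The fact that this worked for android_rpc_test.py shows the RPC session itself is healthy, which points the suspicion at the lib-loading step in debug_compare.py.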

Msyu1020 avatar Aug 18 '24 15:08 Msyu1020