mlc-llm
tvm::runtime::InternalError relax/src/runtime/relax_vm/lm_support.cc:247 Check failed: uniform_sample <= data[0].first (0.0715982 vs. nan)
I am trying to build the iOS app from source, and everything is OK except for running the app on the iPhone: the app shows "ready to chat", but after sending a message the app crashes, and Xcode shows:
libc++abi: terminating due to uncaught exception of type tvm::runtime::InternalError: [15:39:04] /Users/relax/src/runtime/relax_vm/lm_support.cc:247:
An error occurred during the execution of TVM. For more information, please see: https://tvm.apache.org/docs/errors.html
Check failed: uniform_sample <= data[0].first (0.0715982 vs. nan) :
Stack trace:
[bt] (0) 1 MLCChat 0x0000000104c9f094 tvm::runtime::detail::LogFatal::Entry::Finalize() + 116
[bt] (1) 2 MLCChat 0x0000000104c9f020 tvm::runtime::detail::LogFatal::Entry::Finalize() + 0
[bt] (2) 3 MLCChat 0x0000000104c9e51c __clang_call_terminate + 0
[bt] (3) 4 MLCChat 0x0000000104d4f8a8 tvm::runtime::relax_vm::SampleTopPFromLogits(tvm::runtime::NDArray, double, double, double) + 1544
[bt] (4) 5 MLCChat 0x0000000104d557dc void tvm::runtime::TypedPackedFunc<int (tvm::runtime::NDArray, double, double, double)>::AssignTypedLambda<int ()(tvm::runtime::NDArray, double, double, double)>(int ()(tvm::runtime::NDArray, double, double, double), std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator>)::'lambda'(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const + 232
[bt] (5) 6 MLCChat 0x0000000104cc35ec mlc::llm::LLMChatModule::SampleFromLogitsOnCPU() + 348
[bt] (6) 7 MLCChat 0x0000000104cc1e60 mlc::llm::LLMChatModule::EncodeStep(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator>) + 504
[bt] (7) 8 MLCChat 0x0000000104cc1b54 mlc::llm::LLMChatModule::GetFunction(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, tvm::runtime::ObjectPtrtvm::runtime::Object const&)::'lambda1'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const + 124
[bt] (8) 9 MLCChat 0x0000000104cc1acc tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<mlc::llm::LLMChatModule::GetFunction(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, tvm::runtime::ObjectPtrtvm::runtime::Object const&)::'lambda1'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)>>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) + 40
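For context on what that check means (my reading of it, not an official explanation): SampleTopPFromLogits converts the logits into probabilities, sorts them, and walks the cumulative distribution until it passes a uniform random sample; the CHECK asserts that the sample does not exceed the accumulated probability mass. If the logits coming back from the GPU contain NaN (for example, because the compiled kernels are broken), every probability becomes NaN, any comparison against NaN evaluates to false, and the CHECK fires exactly as shown in the log. A minimal, self-contained C++ sketch of that failure mode (hypothetical names, not the actual mlc-llm code):

#include <cstdio>
#include <limits>

// Reduced illustration of the failing check: when the accumulated
// "probability" is NaN, uniform_sample <= cum_prob is false for every
// possible sample, so the runtime CHECK aborts.
int main() {
  double uniform_sample = 0.0715982;  // the value reported in the log above
  double cum_prob = std::numeric_limits<double>::quiet_NaN();  // stand-in for data[0].first
  if (!(uniform_sample <= cum_prob)) {
    std::printf("Check failed: uniform_sample <= data[0].first (%g vs. %g)\n",
                uniform_sample, cum_prob);
  }
  return 0;
}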
It occurs only when the Metal binary is not properly built. Would you like to double-check?
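If it helps with debugging, one way to confirm that the bad numbers come from the model library (e.g. miscompiled Metal kernels) rather than from the sampler itself is to scan the logits for NaN/Inf on the CPU right before sampling. The helper below is only an illustrative sketch, not a function that exists in mlc-llm:

#include <cmath>
#include <cstddef>

// Hypothetical diagnostic: returns true if any logit is NaN or infinite.
// If this fires, the sampler crash above is just a symptom and the model
// library producing the logits is the thing to rebuild.
bool HasBadLogits(const float* logits, std::size_t vocab_size) {
  for (std::size_t i = 0; i < vocab_size; ++i) {
    if (!std::isfinite(logits[i])) return true;
  }
  return false;
}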
Hey, how did you solve the problem? Can you please share?
Metal binary
Could you please explain which build step the "Metal binary" is produced in?
Hey, how did you solve the problem? Can you please share?
Sorry, I have not solved this problem yet.
It occurs only when the Metal binary is not properly built. Would you like to double-check?
Thanks for replying, really appreciate it. Could you be more specific about which build step causes the Metal binary to not be properly built? Thanks.
We've fixed several related issues in the past month. Could you double-check whether the issue persists? If so, please open a new issue with detailed information so that I can help!