
❓ [Question] C++ Windows runtime error

Open zsef123 opened this issue 5 months ago • 2 comments

❓ Question

How can I fix this error?

Unknown type name '__torch__.torch.classes.tensorrt.Engine':
  File "code/__torch__/torch_tensorrt/dynamo/runtime/_TorchTensorRTModule.py", line 6
  training : bool
  _is_full_backward_hook : Optional[bool]
  engine : __torch__.torch.classes.tensorrt.Engine
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
  def forward(self: __torch__.torch_tensorrt.dynamo.runtime._TorchTensorRTModule.TorchTensorRTModule,
    x: Tensor) -> Tensor:

Loading code

torch::jit::Module trt_ts_mod;
try {
    // Deserialize the ScriptModule from a file using torch::jit::load().
    std::cout << "Loading TRT engine from: " << trt_ts_module_path << std::endl;
    trt_ts_mod = torch::jit::load(trt_ts_module_path);
    std::cout << "TRT engine loaded successfully." << std::endl;
}
catch (const c10::Error& e) {
    std::cerr << "c10::Error loading the model from " << trt_ts_module_path
              << ": " << e.what() << std::endl;
    return -1;
}
catch (const std::exception& e) {
    std::cerr << "std::exception occurred while loading the model: " << e.what() << std::endl;
    return -1;
}

Environment

CMakeLists.txt

cmake_minimum_required(VERSION 3.17)
project(torchtrt_runtime_example LANGUAGES CXX)

find_package(Torch REQUIRED)
find_package(torchtrt REQUIRED)

set(SRCS
    main.cpp
)

include_directories("${PRJ_ROOT}/TensorRT/out/install/x64-Release/include")

add_executable(${CMAKE_PROJECT_NAME} ${SRCS})
target_link_libraries(${CMAKE_PROJECT_NAME} PRIVATE torch "-Wl,--no-as-needed" torchtrt_runtime "-Wl,--as-needed")
target_compile_features(${CMAKE_PROJECT_NAME} PRIVATE cxx_std_17)

I built both TensorRT and Torch-TensorRT myself.

  • PyTorch Version: libtorch-win-shared-with-deps-2.8.0+cu126
  • CPU Architecture: Ryzen 2700
  • OS: Windows 11
  • Python version: 3.12
  • CUDA version: 12.6
  • GPU models and configuration: RTX 3070

zsef123 avatar Aug 08 '25 07:08 zsef123

Are you able to verify that torchtrt_runtime is actually being linked? I see "-Wl,--no-as-needed" torchtrt_runtime "-Wl,--as-needed" in your CMakeLists.txt, but I'm not sure whether Windows needs a different mechanism for this.
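One way to verify this at runtime on Windows is to check whether the runtime DLL is actually mapped into the process right before calling torch::jit::load. A minimal sketch, assuming the library is named torchtrt_runtime.dll (adjust to whatever your build actually produces):

#include <iostream>
#include <windows.h>

// Call this just before torch::jit::load(...). If the linker dropped the
// import library because no symbols from it are referenced, the DLL is never
// mapped into the process, and the custom class it registers
// (__torch__.torch.classes.tensorrt.Engine) is missing at deserialization time.
static bool is_torchtrt_runtime_loaded() {
    HMODULE runtime = GetModuleHandleA("torchtrt_runtime.dll");
    std::cout << "torchtrt_runtime.dll loaded: "
              << (runtime != nullptr ? "yes" : "no") << std::endl;
    return runtime != nullptr;
}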

Almost every case of Unknown type name '__torch__.torch.classes.tensorrt.Engine' is due to the runtime library being optimized out and not actually linked, so the class registration it performs never runs.
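If the check above reports that the DLL is not in the process, note that if you are building with MSVC its linker does not understand the GNU-ld --no-as-needed/--as-needed flags, and it will drop an import library whose symbols are never referenced. One workaround to try is loading the runtime explicitly before deserializing, so that its static initializers can register the tensorrt.Engine class. A sketch, with the DLL name and module path again being assumptions:

#include <iostream>

#define NOMINMAX  // avoid min/max macro clashes between windows.h and libtorch headers
#include <windows.h>

#include <torch/script.h>

int main() {
    // Force the Torch-TensorRT runtime into the process; its static
    // initializers register __torch__.torch.classes.tensorrt.Engine.
    if (LoadLibraryA("torchtrt_runtime.dll") == nullptr) {
        std::cerr << "Could not load torchtrt_runtime.dll (error "
                  << GetLastError() << ")" << std::endl;
        return -1;
    }

    // With the class registered, the serialized module should now resolve.
    torch::jit::Module trt_ts_mod = torch::jit::load("trt_ts_module.ts");  // placeholder path
    std::cout << "TRT engine loaded successfully." << std::endl;
    return 0;
}

Explicitly loading the DLL is a diagnostic/workaround rather than a fix for the link step itself, but if it makes the error go away it confirms that the runtime library simply was not being pulled in.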

narendasan avatar Aug 08 '25 19:08 narendasan

Here is a related case https://github.com/pytorch/TensorRT/issues/3401#issuecomment-2674036551

narendasan avatar Aug 15 '25 14:08 narendasan