
onnx2trt command not found

Open wangzh63 opened this issue 3 years ago • 4 comments

Description

I tried to build the library inside the Docker container, but the `onnx2trt` command is not found after I run make.

Steps to reproduce

./docker/launch.sh --tag tensorrt-ubuntu18.04-cuda11.6 --gpus all
sudo apt-get install libprotobuf-dev protobuf-compiler
sudo git clone --recurse-submodules https://github.com/onnx/onnx-tensorrt.git
cd onnx-tensorrt
sudo mkdir build && cd build
sudo cmake .. -DTENSORRT_ROOT="/workspace/TensorRT" && make -j8
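After the steps above, one quick way to see what the build actually produced is to try loading the installed parser library. This is a sketch, not part of the original report: the `/usr/local` prefix is an assumption taken from the `CMAKE_INSTALL_PREFIX` shown in the CMake output, and `parser_installed` is a hypothetical helper name.

```python
import ctypes
import os

def parser_installed(prefix="/usr/local"):
    """Return True if the libnvonnxparser.so named in the install log loads.

    The prefix defaults to the CMAKE_INSTALL_PREFIX from the build output;
    only the shared library is checked, since no onnx2trt binary is installed.
    """
    path = os.path.join(prefix, "lib", "libnvonnxparser.so")
    if not os.path.exists(path):
        return False
    try:
        ctypes.CDLL(path)
    except OSError:
        return False
    return True

print("parser library installed:", parser_installed())
```

If this prints `True` while `onnx2trt` is still missing, the build itself succeeded and only the executable is absent.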

Compile process

--   CMake version             : 3.14.4
--   CMake command             : /usr/local/bin/cmake
--   System                    : Linux
--   C++ compiler              : /usr/bin/c++
--   C++ compiler version      : 7.5.0
--   CXX flags                 :  -Wall -Wno-deprecated-declarations -Wno-unused-function -Wnon-virtual-dtor
--   Build type                : Release
--   Compile definitions       : SOURCE_LENGTH=25;ONNX_NAMESPACE=onnx2trt_onnx;__STDC_FORMAT_MACROS
--   CMAKE_PREFIX_PATH         : 
--   CMAKE_INSTALL_PREFIX      : /usr/local
--   CMAKE_MODULE_PATH         : 
-- 
--   ONNX version              : 1.12.0
--   ONNX NAMESPACE            : onnx2trt_onnx
--   ONNX_USE_LITE_PROTO       : OFF
--   USE_PROTOBUF_SHARED_LIBS  : OFF
--   Protobuf_USE_STATIC_LIBS  : ON
--   ONNX_DISABLE_EXCEPTIONS   : OFF
--   ONNX_WERROR               : OFF
--   ONNX_BUILD_TESTS          : OFF
--   ONNX_BUILD_BENCHMARKS     : OFF
--   ONNXIFI_DUMMY_BACKEND     : OFF
--   ONNXIFI_ENABLE_EXT        : OFF
-- 
--   Protobuf compiler         : /usr/bin/protoc
--   Protobuf includes         : /usr/include
--   Protobuf libraries        : /usr/lib/x86_64-linux-gnu/libprotobuf.so;-lpthread
--   BUILD_ONNX_PYTHON         : OFF
-- Found CUDA headers at /usr/local/cuda/include
-- Found TensorRT headers at /workspace/TensorRT/include
-- Find TensorRT libs at /usr/lib/x86_64-linux-gnu/libnvinfer.so;/usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so
-- Found TENSORRT: /workspace/TensorRT/include  
-- Configuring done
-- Generating done
-- Build files have been written to: /workspace/onnx-tensorrt/build

[  4%] Built target gen_onnx_proto
[  9%] Built target gen_onnx_operators_proto
[ 14%] Built target gen_onnx_data_proto
[ 38%] Built target onnx_proto
[ 69%] Built target nvonnxparser_static
[100%] Built target nvonnxparser
Install the project...
-- Install configuration: "Release"
-- Installing: /usr/local/lib/libnvonnxparser.so.8.4.3
-- Installing: /usr/local/lib/libnvonnxparser.so.8
-- Installing: /usr/local/lib/libnvonnxparser.so
-- Installing: /usr/local/lib/libnvonnxparser_static.a
root@7a80da9ff08a:/workspace/onnx-tensorrt/build# onnx2trt
bash: onnx2trt: command not found
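Note that the install log above only produces the `libnvonnxparser` libraries; no `onnx2trt` target is built at all, which suggests the executable is simply not part of the 8.x branches rather than a failed install. As a workaround (an assumption, not something stated in this thread), TensorRT's own `trtexec` tool covers the common onnx2trt use case of converting an ONNX model to an engine, provided it is on the container's PATH; `model.onnx` and `model.trt` are placeholder file names:

```shell
# Convert an ONNX model to a serialized TensorRT engine with trtexec,
# which ships with TensorRT itself.
trtexec --onnx=model.onnx --saveEngine=model.trt
```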

wangzh63 avatar Aug 25 '22 08:08 wangzh63

Same problem:

[  4%] Built target gen_onnx_proto
[  8%] Built target gen_onnx_data_proto
[ 13%] Built target gen_onnx_operators_proto
Consolidate compiler generated dependencies of target onnx_proto
[ 35%] Built target onnx_proto
Consolidate compiler generated dependencies of target nvonnxparser
[ 64%] Built target nvonnxparser
Consolidate compiler generated dependencies of target nvonnxparser_static
[ 93%] Built target nvonnxparser_static
Consolidate compiler generated dependencies of target getSupportedAPITest
[100%] Built target getSupportedAPITest
Install the project...
-- Install configuration: "Release"
-- Installing: /usr/local/lib/libnvonnxparser.so.8.5.1
-- Up-to-date: /usr/local/lib/libnvonnxparser.so.8
-- Set runtime path of "/usr/local/lib/libnvonnxparser.so.8.5.1" to ""
-- Up-to-date: /usr/local/lib/libnvonnxparser.so
-- Installing: /usr/local/lib/libnvonnxparser_static.a

lpj0711 avatar Jan 18 '23 09:01 lpj0711

I solved it; it was a version problem. This combination works: onnx-tensorrt-onnxrt, protobuf-3.12.x, TensorRT-7.2.2.3, cuda11.1

lpj0711 avatar Jan 19 '23 03:01 lpj0711

[image attachment]

yangelaboy avatar Dec 14 '23 01:12 yangelaboy

Could you tell me how to solve this problem? I've hit the same issue with onnx-tensorrt 8.6 GA.

xuyuxiu83 avatar Jan 02 '24 04:01 xuyuxiu83