noman-anjum-retro
@narendasan please find the compiled module [here](https://drive.google.com/file/d/1GrgS-ct1hCGI7DIzAnVnZmr1wJI8ge7K/view?usp=sharing). This was compiled with TensorRT 8.4.3.
Any success with it, @LukeRoss00?
Hey, thanks for replying. Where can I find libtorchtrt_runtime.so? I searched for it in the libtorch repo and the torch_tensorrt repo. Also, will this method run on Windows?
I got libtorchtrt_runtime.so from the Linux package (directory where I found the file: python3.8/site-packages/torch_tensorrt/lib/libtorchtrt_runtime.so), but it's not getting loaded and throws the following error: ` File "\video_play.py", line 189,...
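For what it's worth, you can probe whether the runtime library is loadable on your platform before calling `torch.jit.load`. A minimal sketch (the install path is an assumption, and note a Linux `.so` will never load on Windows, which would match the error above):

```python
import ctypes

def load_trt_runtime(path: str) -> bool:
    """Try to load the Torch-TensorRT runtime shared library.

    RTLD_GLOBAL exposes the registered TRT engine ops so that a
    compiled module can be deserialized afterwards. Returns True on
    success, False if the library cannot be loaded on this platform.
    """
    try:
        ctypes.CDLL(path, mode=ctypes.RTLD_GLOBAL)
        return True
    except OSError:
        # A Linux .so fails here on Windows: the runtime has to be
        # built per-platform.
        return False

# Hypothetical install path (assumption; adjust to your environment):
if load_trt_runtime(
    "python3.8/site-packages/torch_tensorrt/lib/libtorchtrt_runtime.so"
):
    print("runtime loaded")
else:
    print("runtime not loadable on this platform")
```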
Hello @andreabonvini, I set up the code and fixed all compile-time errors, but now I'm getting strange linker errors for every use of torch_tensorrt:: in my code. Can you please...
Hello @andreabonvini. I started with this basic CMakeLists.txt:
```cmake
cmake_minimum_required(VERSION 3.0 FATAL_ERROR)
project(custom_ops)

find_package(Torch REQUIRED)

add_executable(example-app example-app.cpp)
target_link_libraries(example-app "${TORCH_LIBRARIES}")
set_property(TARGET example-app PROPERTY CXX_STANDARD 14)
```
I run CMake with the command...
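For reference, undefined references to torch_tensorrt:: symbols at link time usually mean that only libtorch is being linked, not libtorchtrt itself. A sketch of how that CMakeLists could be extended (the `TORCHTRT_DIR` path and the `torchtrt` library name are assumptions based on a pip install layout; adjust to your environment):

```cmake
cmake_minimum_required(VERSION 3.0 FATAL_ERROR)
project(custom_ops)

find_package(Torch REQUIRED)

# Assumption: point this at your torch_tensorrt install,
# e.g. .../site-packages/torch_tensorrt
set(TORCHTRT_DIR "/path/to/torch_tensorrt")

# Locate libtorchtrt alongside the headers.
find_library(TORCHTRT_LIB torchtrt HINTS "${TORCHTRT_DIR}/lib")

add_executable(example-app example-app.cpp)
target_include_directories(example-app PRIVATE "${TORCHTRT_DIR}/include")

# Linking libtorchtrt in addition to the Torch libraries is what
# resolves the torch_tensorrt:: symbols.
target_link_libraries(example-app "${TORCH_LIBRARIES}" "${TORCHTRT_LIB}")
set_property(TARGET example-app PROPERTY CXX_STANDARD 14)
```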
Thanks, I'll take a look at it.
How did you build torch_tensorrt on Windows? Can you please explain it? When I run a bazel build for the default WORKSPACE file, it throws an error: `bazel build //:libtorchtrt --compilation_mode opt...
Alright, that makes sense now. Thanks a lot!
Hey @andreabonvini, I tried compiling with the above-mentioned steps. The code runs well with this script:
```cpp
int main() {
  std::string model_path = "path/to/trt_script_module.pt";
  const torch::Device device = torch::Device(torch::kCUDA, 0);
  torch::jit::script::Module model;
  try...
```