
Inference in C++

dattanand opened this issue 5 years ago · 1 comment

Hi, I want to load the trained model and run inference in C++, and I am trying to figure out how to do this. What I have figured out so far is that I need to build TF with GPU support and generate a C++ library to link against. What I don't know is: how do I make sure the loaded graph will actually run on the GPU? (It does when run in Python.) I am also unsure about the custom ops and whether I need to do anything with them when inferring in C++. This might sound a little off-topic, but if you have any pointers, please guide me.
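For context, the "build TF with GPU support and generate a C++ library" step described above is usually done by building TensorFlow from source with Bazel. A sketch, assuming a TF 1.x source tree (exact flags and targets vary by version):

```shell
# From the TensorFlow source tree: run the interactive configure
# script and answer "yes" when asked about CUDA support.
./configure

# Build the C++ shared library; --config=cuda pulls in the GPU kernels.
bazel build --config=opt --config=cuda //tensorflow:libtensorflow_cc.so

# The resulting library to link the inference binary against ends up at:
# bazel-bin/tensorflow/libtensorflow_cc.so
```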

thanks

dattanand avatar May 06 '19 13:05 dattanand

Hi, TF has a C++ interface, and the custom TF ops are written in CUDA. So one way may be to write the testing code in C++ with the TF C++ API and call the CUDA code. I have no experience with this, though; it is only my guess.
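A minimal sketch of what that could look like with the TF 1.x C++ API. The file name `model.pb`, the tensor names `input`/`output`, and the input shape are all placeholders, not PU-Net's actual node names. Setting `log_device_placement` makes the runtime log which device each op is assigned to, which answers the "is it really running on the GPU?" question:

```cpp
// Sketch: load a frozen graph and run one step with the TF 1.x C++ API.
#include "tensorflow/core/public/session.h"
#include "tensorflow/core/platform/env.h"

#include <iostream>
#include <memory>
#include <vector>

int main() {
  using namespace tensorflow;

  // Read the frozen GraphDef from disk ("model.pb" is a placeholder path).
  GraphDef graph_def;
  Status s = ReadBinaryProto(Env::Default(), "model.pb", &graph_def);
  if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

  // log_device_placement makes TF print the device chosen for every op,
  // so you can confirm the graph actually runs on the GPU.
  SessionOptions options;
  options.config.set_log_device_placement(true);
  options.config.mutable_gpu_options()->set_allow_growth(true);

  std::unique_ptr<Session> session(NewSession(options));
  s = session->Create(graph_def);
  if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

  // Feed a dummy input point cloud (the shape here is a placeholder).
  Tensor input(DT_FLOAT, TensorShape({1, 1024, 3}));
  std::vector<Tensor> outputs;
  s = session->Run({{"input", input}}, {"output"}, {}, &outputs);
  if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

  std::cout << "output shape: " << outputs[0].shape().DebugString() << "\n";
  return 0;
}
```

On the custom ops: TF ops self-register via static initializers (`REGISTER_OP` / `REGISTER_KERNEL_BUILDER`) when their shared library is loaded, so one approach is to link the compiled custom-op `.so` files into the C++ binary (or load them at runtime) so the ops are registered before `Session::Create` is called; otherwise graph loading fails with an "op not found" error.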

yulequan avatar May 09 '19 02:05 yulequan