
Added inference engine dnn model support

Open cansik opened this issue 4 years ago • 7 comments

As discussed in https://github.com/bytedeco/javacv/issues/1344, I have added support for Intel's Deep Learning Inference Engine. compile.sh has been extended following the guide in the wiki:

https://github.com/opencv/opencv/wiki/Intel's-Deep-Learning-Inference-Engine-backend#build-opencv-from-source

There is still an error (with a workaround) on macOS, but this seems to be OS-specific and a problem of the new System Integrity Protection: https://github.com/bytedeco/javacv/issues/1344#issuecomment-559827548

cansik avatar Nov 29 '19 15:11 cansik

It's not a problem with security or anything, it's just Mac being annoying. We can apply these kinds of workarounds after the build, like this: https://github.com/bytedeco/javacpp-presets/blob/master/mxnet/cppbuild.sh#L184
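The mxnet script linked above rewrites library install names and rpaths with `install_name_tool` after the build. Here is a minimal dry-run sketch of that kind of fix-up; the library names are hypothetical and the commands are only echoed, not executed:

```shell
# Hypothetical post-build fix-up: print the install_name_tool calls that
# would set each library's install name to @rpath/<name> and add an rpath
# pointing at the library's own directory (@loader_path/.).
LIB_DIR="/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64"
for name in libMKLDNNPlugin.dylib libinference_engine.dylib; do
  echo install_name_tool -id "@rpath/$name" \
       -add_rpath "@loader_path/." "$LIB_DIR/$name"
done
```

Dropping the `echo` would apply the changes for real on macOS, which is roughly what the mxnet cppbuild.sh does after its build step.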

saudet avatar Nov 29 '19 15:11 saudet

It looks like those flags don't do anything unless OpenVINO is already installed. I can merge this, but the distributed binaries are not going to contain any inference engine. Is that alright with you?

saudet avatar Nov 30 '19 11:11 saudet

OK, maybe I need a bit more help here. The library structure is the following:

  • /opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib
  • /opt/intel/openvino/deployment_tools/inference_engine/external/mkltiny_mac/lib/libmkl_tiny_tbb.dylib

Now libMKLDNNPlugin wants to load @rpath/libmkl_tiny_tbb.dylib, which means we have to specify this rpath later in the main executable (which would be opencv), right?

Following your example, would I have to specify the rpath relative to libMKLDNNPlugin, or should I pass it as an absolute path?

And what does @loader_path/. stand for in this command?

install_name_tool -add_rpath @loader_path/. -id @rpath/libmkl_tiny_tbb.dylib ../../external/mkltiny_mac/lib/libmkl_tiny_tbb.dylib

cansik avatar Nov 30 '19 15:11 cansik

BTW, there's plenty of information about install_name_tool online, for example:

  • https://medium.com/@donblas/fun-with-rpath-otool-and-install-name-tool-e3e41ae86172
  • https://blogs.oracle.com/dipol/dynamic-libraries,-rpath,-and-mac-os
  • https://blog.krzyzanowskim.com/2018/12/05/rpath-what/
  • https://developer.apple.com/library/archive/documentation/DeveloperTools/Conceptual/DynamicLibraries/100-Articles/RunpathDependentLibraries.html
  • http://log.zyxar.com/blog/2012/03/10/install-name-on-os-x/
  • https://gitlab.kitware.com/cmake/community/wikis/doc/cmake/RPATH-handling

And more
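Regarding the @loader_path/. question above: @loader_path expands at load time to the directory containing the binary that issued the load command (here the plugin itself, not the main executable), so @loader_path/. simply means "this library's own directory". A small bash sketch simulating that substitution, using the paths from the comment above (no real libraries are touched):

```shell
# Simulate how dyld would expand an @loader_path rpath entry recorded
# in libMKLDNNPlugin.dylib.
loader="/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib"
loader_dir=$(dirname "$loader")
rpath_entry="@loader_path/."
resolved="${rpath_entry/@loader_path/$loader_dir}"
# An @rpath/... reference would then be searched for under $resolved,
# among any other rpath entries recorded in the loading binary.
echo "$resolved"
```

This is why adding @loader_path-based rpaths is usually preferred over absolute paths: the result stays correct when the whole directory tree is relocated.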

saudet avatar Dec 16 '19 02:12 saudet

BTW, "nGraph has moved to OpenVINO", so once we get presets for OpenVINO in place, we can get rid of the old ones for nGraph. /cc @EmergentOrder

saudet avatar Nov 08 '20 00:11 saudet

@saudet Well, with the flag it worked when the inference engine was installed locally. That's at least something, but yeah, of course, shipping the IE would be way better.

Do you think it would make more sense to create a preset for the inference engine itself? Have you already looked into that?

cansik avatar Mar 09 '21 11:03 cansik

Haven't looked into it, but the high-level API looks simple enough and shouldn't be too hard to map: https://github.com/openvinotoolkit/openvino/blob/master/inference-engine/samples/hello_classification/main.cpp

saudet avatar Mar 09 '21 11:03 saudet