
ReadNetwork API is failing while loading models from memory.

Open KashyapSalini opened this issue 2 years ago • 8 comments

System information

  • OpenVINO => 2021.4.752
  • Operating System / Platform => Ubuntu 20.04 LTS

Detailed Description:

I am using OpenVINO 2021.4 in my project. I want to use ReadNetwork() to load models from memory and get a CNNNetwork in return. I am able to create the TensorDesc and blob, but ReadNetwork() throws an error:

unknown file: Failure C++ exception with description "base_inference plugin intitialization failed" thrown in the test body.

I have some ambiguity about the blob and tensors I am creating; I think that is the only reason it is throwing this error. One thing: when I check the size of the tensor and blob I created (using sizeof()), it is not the same as the size of the bin vector.

I tried to call ReadNetwork in several ways; I will share the code snippets showing how I am calling it. I referred to https://docs.openvino.ai/2021.4/openvino_docs_IE_DG_protecting_model_guide.html for my use case.

Can you please help me resolve this bug?

```
// I will get these vectors after decrypting the encrypted model files
std::vector<uint8_t> weights;
std::vector<uint8_t> model;
// weights and model are not empty here

std::string strModel(model.begin(), model.end());

InferenceEngine::Core ie;

InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel,
    InferenceEngine::make_shared_blob<uint8_t>({InferenceEngine::Precision::U8,
        {weights.size()}, InferenceEngine::Layout::C}, weights.data()));
```

AND

```
InferenceEngine::TensorDesc O_tensor(InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::Layout::ANY);
std::cout << "size of tensor: " << sizeof(O_tensor) << std::endl;
InferenceEngine::TBlob<uint8_t>::Ptr wei_blob = InferenceEngine::make_shared_blob<uint8_t>(O_tensor, &weights[0]);
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel, wei_blob);
```

AND

```
InferenceEngine::Blob::Ptr blobWts = InferenceEngine::make_shared_blob<uint8_t>({InferenceEngine::Precision::U8,
    {weights.size()}, InferenceEngine::Layout::C});
blobWts->allocate();
std::memcpy(blobWts->buffer(), weights.data(), weights.size());
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel, blobWts);
```

AND

```
InferenceEngine::TensorDesc bindesc(InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::Layout::C);
InferenceEngine::Blob::Ptr blobWts = InferenceEngine::make_shared_blob<uint8_t>(bindesc, weights.data());
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel, blobWts);
```

AND

```
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel,
    InferenceEngine::make_shared_blob<uint8_t>({InferenceEngine::Precision::U8,
        {weights.size()}, InferenceEngine::Layout::C}, weights.data()));
```


KashyapSalini avatar Sep 12 '22 05:09 KashyapSalini

@Thunder-29 From your description, you have tried four ways to read the network and all of them failed — is my understanding right? Have you tried non-encrypted models; do they all reproduce the same issue? I don't have encrypted models, so let me first try to reproduce with non-encrypted ones.

@jgespino is there any reason this issue is related to the C API? I cannot see it — could you clarify?

riverlijunjie avatar Sep 13 '22 00:09 riverlijunjie

@riverlijunjie Yeah, I tried five ways but all of them are failing. I also tried with a non-encrypted model, and it failed too. Earlier it was throwing:

unknown file: Failure C++ exception with description "Failed to construct OpenVINOImageInference" thrown in the test body.

Now I am getting:

unknown file: Failure C++ exception with description "base_inference plugin intitialization failed" thrown in the test body.

I have observed the above two errors.

KashyapSalini avatar Sep 13 '22 03:09 KashyapSalini

@Thunder-29 Hi, I did some tests on the simple sample "hello_classification" with OpenVINO 2021.4 and cannot reproduce this issue. The following is my process:

  1. Code change:
        // 1) read the .xml file into memory (text mode)
        std::vector<uint8_t> model_xml;
        std::ifstream is_xml("/home/odt/xuejun/test-models/alexnet.xml", std::ifstream::in);
        if (is_xml) {
            is_xml.seekg(0, std::ifstream::end);
            model_xml.resize(is_xml.tellg());
            if (model_xml.size() > 0) {
                is_xml.seekg(0, std::ifstream::beg);
                is_xml.read(reinterpret_cast<char *>(&model_xml[0]), model_xml.size());
            }
        }
        // 2) read the .bin file into memory (binary mode)
        std::vector<float> model_bin;
        std::ifstream is_blob("/home/odt/xuejun/test-models/alexnet.bin", std::ifstream::binary | std::ifstream::in);
        if (is_blob) {
            is_blob.seekg(0, std::ifstream::end);
            model_bin.resize(is_blob.tellg());
            if (model_bin.size() > 0) {
                is_blob.seekg(0, std::ifstream::beg);
                is_blob.read(reinterpret_cast<char *>(&model_bin[0]), model_bin.size());
            }
        }
        // 3) create a blob tensor from the .bin data just read
        TensorDesc tensor(Precision::FP32, {model_bin.size()}, Layout::C);
        auto bin_blob = make_shared_blob<float>(tensor, model_bin.data(), model_bin.size());
        // 4) use ReadNetwork to get the CNNNetwork from memory
        CNNNetwork network = ie.ReadNetwork(std::string(reinterpret_cast<const char *>(model_xml.data()),
            reinterpret_cast<const char *>(model_xml.data() + model_xml.size())), bin_blob);
        // 5) the original file-based ReadNetwork call is disabled
        // CNNNetwork network = ie.ReadNetwork("/home/odt/xuejun/test-models/alexnet.xml");

  2. Command run:

    ./intel64/Release/hello_classification ~/xuejun/test-models/alexnet.xml ~/xuejun/images/00001.jpg CPU

This gives the same result as the old (file-based) version.

Top 10 results:

Image /home/odt/xuejun/images/00001.jpg

classid probability
------- -----------
584     0.5264589
783     0.2139807
902     0.0591708
464     0.0444336
677     0.0171803
488     0.0169180
855     0.0151906
68      0.0090135
66      0.0086330
36      0.0052825

The attached file is the code changes. read_from_memory_hello_classification.txt

zhaixuejun1993 avatar Sep 14 '22 01:09 zhaixuejun1993

@Thunder-29 any new update?

riverlijunjie avatar Sep 15 '22 08:09 riverlijunjie

@riverlijunjie @zhaixuejun1993 I am really sorry for the delay. I tried the above solution with the "hello_classification" sample; it gave the error "Cannot load library '/opt/intel/openvino_2021/deployment_tools/inference_engine/lib/intel64/libinference_engine_onnx_reader.so': libprotobuf.so.3.7.1.0: cannot open shared object file: No such file or directory"

A few days ago, when I was calling ReadNetwork to load models from memory from my application, I got this libprotobuf error:

    [libprotobuf FATAL external/com_google_protobuf/src/google/protobuf/stubs/common.cc:86] This program was compiled against version 3.7.1 of the Protocol Buffer runtime library, which is not compatible with the installed version (3.9.2). Contact the program author for an update. If you compiled the program yourself, make sure that your headers are from the same version of Protocol Buffers as your link-time library. (Version verification failed in "/home/jenkins/agent/workspace/private-ci/ie/build-linux-ubuntu20/b/build/dldt/_deps/ext_onnx-build/onnx/onnx_ngraph_onnx-ml.pb.cc".)
    terminate called after throwing an instance of 'google::protobuf::FatalException'
      what():  This program was compiled against version 3.7.1 of the Protocol Buffer runtime library, which is not compatible with the installed version (3.9.2). Contact the program author for an update. If you compiled the program yourself, make sure that your headers are from the same version of Protocol Buffers as your link-time library. (Version verification failed in "/home/jenkins/agent/workspace/private-ci/ie/build-linux-ubuntu20/b/build/dldt/_deps/ext_onnx-build/onnx/onnx_ngraph_onnx-ml.pb.cc".)
    Aborted (core dumped)

To solve this error I downloaded protobuf 3.9.2, installed it on my system, and replaced the libprotobuf.so.3.7.1.0 file in the /opt/intel/openvino_2021/deployment_tools/ngraph/lib folder with the 3.9.2 version.

Then I put libprotobuf.so.3.7.1.0 back into /opt/intel/openvino_2021/deployment_tools/ngraph/lib; this directory now contains libprotobuf.so, libprotobuf.so.20, libprotobuf.so.20.0.2 and libprotobuf.so.3.7.1.0. The OpenVINO sample hello_classification works fine now.

But when I call ReadNetwork to load models from memory from my application, it throws the same libprotobuf error:

    [libprotobuf FATAL external/com_google_protobuf/src/google/protobuf/stubs/common.cc:86] This program was compiled against version 3.7.1 of the Protocol Buffer runtime library, which is not compatible with the installed version (3.9.2). Contact the program author for an update. If you compiled the program yourself, make sure that your headers are from the same version of Protocol Buffers as your link-time library. (Version verification failed in "/home/jenkins/agent/workspace/private-ci/ie/build-linux-ubuntu20/b/build/dldt/_deps/ext_onnx-build/onnx/onnx_ngraph_onnx-ml.pb.cc".)
    terminate called after throwing an instance of 'google::protobuf::FatalException'
      what():  This program was compiled against version 3.7.1 of the Protocol Buffer runtime library, which is not compatible with the installed version (3.9.2). Contact the program author for an update. If you compiled the program yourself, make sure that your headers are from the same version of Protocol Buffers as your link-time library. (Version verification failed in "/home/jenkins/agent/workspace/private-ci/ie/build-linux-ubuntu20/b/build/dldt/_deps/ext_onnx-build/onnx/onnx_ngraph_onnx-ml.pb.cc".)
    Aborted (core dumped)

Thank you

KashyapSalini avatar Sep 20 '22 10:09 KashyapSalini

@Thunder-29 It seems it cannot find the correct protobuf path — does your application dynamically link another, different libprotobuf?

riverlijunjie avatar Sep 20 '22 11:09 riverlijunjie

Can you check the linked libraries via e.g.

ldd ./my_application

If your application loads dynamic libraries at runtime "manually" (e.g. with dlopen()) you might want to try this:

LD_DEBUG=libs ./my_application

It will log which libraries were requested, where the dynamic linker searched for them, and which libraries were finally loaded and initialized.

brmarkus avatar Sep 21 '22 05:09 brmarkus

If one application dynamically links two different versions of libprotobuf, there will be a conflict, so we should use a single libprotobuf version in a single application.

riverlijunjie avatar Sep 21 '22 12:09 riverlijunjie

@Thunder-29 could you try using the same protobuf version as OpenVINO?

riverlijunjie avatar Sep 27 '22 07:09 riverlijunjie

@riverlijunjie @brmarkus My application loads dynamic libraries at runtime. I checked the linked libraries but did not find libprotobuf there; I have attached the log file of ldd ./application. One thing I want to mention: I am using libtensorflow-cpu-linux-x86_64-2.7.0, and before running the test case I export the TensorFlow library path. I read somewhere that TensorFlow requires libprotobuf.

  1. OpenVINO 2021.4 requires libprotobuf.so.3.7.1.0
  2. Tensorflow 2.7.0 from my application requires libprotobuf.so.3.9.2 (Not sure)

I checked the libprotobuf version installed on my system using protoc --version; it was 3.6.1. I changed it to 3.7.1.0 (I thought the application would pick up the system libprotobuf) and rebuilt my application. But it still throws the same error.

I also tried keeping protobuf 3.9.2 everywhere: installed on the system, and with the libprotobuf.so file inside OpenVINO replaced. After rebuilding the application, this time I get: unknown file: Failure C++ exception with description "base_inference plugin intitialization failed" thrown in the test body.

Also, after changing OpenVINO's protobuf version to 3.9.2, I am unable to run the sample application hello_classification; it throws the error "Cannot load library '/opt/intel/openvino_2021/deployment_tools/inference_engine/lib/intel64/libinference_engine_onnx_reader.so': libprotobuf.so.3.7.1.0: cannot open shared object file: No such file or directory"

logs.txt

KashyapSalini avatar Sep 27 '22 08:09 KashyapSalini

Maybe you can try exporting the protobuf path in LD_LIBRARY_PATH.

riverlijunjie avatar Sep 27 '22 11:09 riverlijunjie

@riverlijunjie I tried exporting the protobuf path in LD_LIBRARY_PATH, but it still fails. I uninstalled the protobuf libraries I had installed earlier, but it still throws the same error. I then installed protobuf 3.7.1 on my system with these steps: downloaded the 3.7.1 tarball from https://github.com/protocolbuffers/protobuf/releases, unpacked it, then ran cd protobuf-3.7.1, ./configure, make, make check, and sudo make install.

Now I get: protoc --version reports libprotoc 3.7.1, and which protoc reports /usr/local/bin/protoc.

I did not find any libprotobuf.so file in the /usr/lib/x86_64-linux-gnu directory.

For both pip show protobuf and pip3 show protobuf I get:

    Name: protobuf
    Version: 4.21.7
    Summary:
    Home-page: https://developers.google.com/protocol-buffers/
    Author: [email protected]
    Author-email: [email protected]
    License: 3-Clause BSD License
    Location: /home/intel/.local/lib/python3.8/site-packages
    Requires:
    Required-by: grpcio-tools, onnx, tensorboard, tensorflow

From my application side, I am unable to replace protobuf 3.9.2 with 3.7.1 to fix it.

I tried executing ReadNetwork for loading models from memory inside the hello_classification sample in different scenarios. In /opt/intel/openvino_2021/deployment_tools/ngraph/lib (the OpenVINO installation directory) I found libprotobuf.so and libprotobuf.so.3.7.1.0 alongside libngraph.so, libonnx_importer.so and libonnx_proto.so.

I found that the ReadNetwork API for loading models from memory uses the libonnx_importer.so and libonnx_proto.so libraries, among others:

    ldd ./libonnx_importer.so
        linux-vdso.so.1 (0x00007ffea6f4f000)
        libngraph.so => /opt/intel/openvino_2021/deployment_tools/ngraph/lib/libngraph.so (0x00007f15ccb00000)
        libonnx_proto.so => /opt/intel/openvino_2021/deployment_tools/ngraph/lib/libonnx_proto.so (0x00007f15ccaa1000)
        libprotobuf.so.3.7.1.0 => /opt/intel/openvino_2021/deployment_tools/ngraph/lib/libprotobuf.so.3.7.1.0 (0x00007f15cc79b000)
        libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f15cc5a0000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f15cc44f000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f15cc434000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f15cc411000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f15cc21f000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f15cd31c000)

    ldd ./libonnx_proto.so
        linux-vdso.so.1 (0x00007ffe903e7000)
        libprotobuf.so.3.7.1.0 => /opt/intel/openvino_2021/deployment_tools/ngraph/lib/libprotobuf.so.3.7.1.0 (0x00007f9fff6a2000)
        libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f9fff4a7000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f9fff48c000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f9fff29a000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f9fff275000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f9fffa09000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f9fff126000)

Replacing libprotobuf.so.3.7.1.0 with 3.9.2 throws an error. I need help with how to regenerate these two libraries against protobuf 3.9.2, or any other solution for this situation.

Thank you

KashyapSalini avatar Oct 12 '22 11:10 KashyapSalini

Hi @Thunder-29, you have these problems because one component has dependencies on two different protobuf versions. First of all, let me ask: can you migrate to the latest OpenVINO? In the latest version we link protobuf statically, which means you shouldn't have such problems in your application.

ilyachur avatar Oct 13 '22 06:10 ilyachur

Hi @ilyachur, for now it is not possible to move to the latest OpenVINO. I tried this option but it didn't work for me.

KashyapSalini avatar Oct 13 '22 15:10 KashyapSalini

Hi @Thunder-29, in that case it looks like the issue can only be solved by changing the protobuf version and rebuilding, or by moving to the new OV release.

In any case, can you try running your application in a clean environment? Does that help?

ilyachur avatar Oct 14 '22 07:10 ilyachur

@Thunder-29 Can you manually rebuild OpenVINO 2021.4 with a custom protobuf version? 2021.4 has this protobuf limitation; you can either use the 2022 release, where we changed the way protobuf is used, or manually rebuild OpenVINO 2021.4 with the updated protobuf version. I am not sure we will upgrade the protobuf version in 2021.4 releases, because it may affect other users. Also, can you try running the OpenVINO application in a clean environment where you don't have TF?

ilyachur avatar Oct 14 '22 08:10 ilyachur

@ilyachur Yeah, I want to rebuild OpenVINO 2021.4 with a custom protobuf version and need help with this. I installed OpenVINO on my machine from the .tgz file, following the steps on the official webpage.

TF is needed by my application, so I cannot proceed without it.

KashyapSalini avatar Oct 14 '22 08:10 KashyapSalini

@Thunder-29 you can follow this wiki to build OpenVINO: https://github.com/openvinotoolkit/openvino/wiki#how-to-build, but I'm not sure whether it still works for 2021.4. If you hit any build issue, please ping me.

riverlijunjie avatar Oct 14 '22 09:10 riverlijunjie

Thanks @riverlijunjie I will do this and update here.

KashyapSalini avatar Oct 14 '22 09:10 KashyapSalini

@Thunder-29 BTW, you can also use the legacy IE API with 2022 OpenVINO, meaning that migrating to a newer OpenVINO version shouldn't require changing your application. I want to highlight this because I don't know why you cannot migrate to the latest OpenVINO — if the reason is the migration to the new API, I just want to say that updating the application to the OpenVINO 2.0 API is not mandatory.

ilyachur avatar Oct 14 '22 09:10 ilyachur

Okay @ilyachur I will try this option as well.

KashyapSalini avatar Oct 14 '22 10:10 KashyapSalini