
[Feature Request] c++ inference?

Open antithing opened this issue 2 years ago • 7 comments

Hi! Thanks for making this code available.

It would be great to be able to run inference for segmentation and matting in c++, is this possible? Or planned for the future?

antithing · Jun 18 '22 16:06

Ah! I found this.

https://github.com/PaddlePaddle/PaddleSeg/tree/457af2edb51a3c85ac9fd019e564c4099b1ee8bd/deploy/onnxruntime_cpp

Will this work with the matting model?

Thanks!

antithing · Jun 18 '22 17:06

Yes, we plan to support C++ inference in the future.

The URL cannot be used directly; you can make changes as needed.

wuyefeilin · Jun 20 '22 02:06

Thank you! I have the demo running fine, but now I need to run the matting model.

I am using the tutorial here:

https://github.com/PaddlePaddle/PaddleSeg/blob/release/2.5/docs/deployment/inference/cpp_inference.md

and the downloaded model pp-humanmatting-resnet34_vd, which has the files:

image

When I run the tutorial code and pass it the matting model, using the GPU I get:

ExternalError: CUDA error(719), unspecified launch failure.
  [Hint: Please search for the error code(719) on website (https://docs.nvidia.com/cuda/cuda-runtime-api/group__CUDART__TYPES.html#group__CUDART__TYPES_1g3f51e3575c2178246db0a94a430e0038) to get Nvidia's official solution and advice about CUDA Error.] (at ..\paddle\phi\backends\gpu\gpu_context.cc:624)

When I run it with the CPU, I get:

----------------------
Error Message Summary:
----------------------
InvalidArgumentError: The type of data we are trying to retrieve does not match the type of data currently contained in the container. (at ..\paddle\phi\core\dense_tensor.cc:148)

What do I need to adjust in the cpp_inference code to run the matting model or the PP-HumanSeg-Server model? Thank you!

antithing · Jun 20 '22 09:06

You can refer to the Python deployment of matting to adjust the cpp_inference code.
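One concrete difference to port from the Python matting deployment is input normalization. A minimal sketch, assuming the common mean/std of 0.5 per channel — these values are placeholders, so check the exported model's deploy.yaml for the real ones:

```cpp
#include <cstdint>

// Hypothetical normalization step mirroring the Python matting
// preprocessing: scale a pixel to [0, 1], then apply (x - mean) / std.
// mean = std = 0.5 is an assumption; read the actual values from the
// model's deploy.yaml.
float normalize_pixel(uint8_t v, float mean = 0.5f, float stddev = 0.5f) {
    return (static_cast<float>(v) / 255.0f - mean) / stddev;
}
```

With these assumed values, pixel 0 maps to -1.0 and pixel 255 maps to +1.0, which is the usual input range for models normalized this way.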

wuyefeilin · Jun 21 '22 02:06

Thank you, would you be able to point me at the code to change? It looks like this line:

output_t->CopyToCpu(out_data.data());

is causing the issue. Where do I need to make a change?
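A likely cause of the CPU-side InvalidArgumentError is a buffer-type mismatch: the segmentation demo copies the output into an integer label buffer, while a matting model's output tensor holds float alpha values, and CopyToCpu requires the destination element type to match the tensor's dtype. A sketch of sizing a float buffer, reusing the std::accumulate idiom from the demo (the shape below is a hypothetical NCHW example):

```cpp
#include <functional>
#include <numeric>
#include <vector>

// Number of elements in a tensor with the given shape, e.g. the
// NCHW output shape returned by output_t->shape().
int element_count(const std::vector<int>& shape) {
    return std::accumulate(shape.begin(), shape.end(), 1,
                           std::multiplies<int>());
}

// Sketch of the intended usage (assumed, not from the demo):
//   std::vector<float> out_data(element_count(output_t->shape()));
//   output_t->CopyToCpu(out_data.data());  // float buffer, not int64_t
```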

antithing · Jun 21 '22 09:06

Leaving this here in case anyone else has the same issue. Making the following changes allows you to use the matting models in C++.

// Get output
auto output_names = predictor->GetOutputNames();
auto output_t = predictor->GetOutputHandle(output_names[0]);
std::vector<int> output_shape = output_t->shape();
int out_num = std::accumulate(output_shape.begin(), output_shape.end(), 1,
                              std::multiplies<int>());
std::vector<float> out_data(out_num);
output_t->CopyToCpu(out_data.data());

// Get pseudo image (note: this cv::Mat wraps out_data_u8's buffer without
// copying, so clone it if it must outlive the vector)
std::vector<uint8_t> out_data_u8(out_num);
for (int i = 0; i < out_num; i++) {
    out_data_u8[i] = static_cast<uint8_t>(out_data[i]);
}
cv::Mat out_gray_img(output_shape[2], output_shape[3], CV_8UC1, out_data_u8.data());

antithing · Jun 22 '22 13:06

...although I think something is still wrong here: my matte looks incorrect and lacks detail.

From this image:

image

The above code returns:

image

Changing to float type and using the data from CopyToCpu directly, like this:

cv::Mat out_gray_img2(output_shape[2], output_shape[3], CV_32FC1, out_data.data());

gives a better result, but it still seems strange:

image
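One likely explanation for the lack of detail (an assumption, not confirmed in this thread): matting models generally emit an alpha matte in [0.0, 1.0], so a straight static_cast<uint8_t> truncates nearly every value to 0 or leaves only hard 0/1 edges. Scaling by 255 before the cast preserves the gradations. A sketch:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Convert a float alpha matte in [0.0, 1.0] to 8-bit grayscale.
// Scaling by 255 before the cast keeps the soft transitions that a
// plain static_cast<uint8_t> would truncate away.
std::vector<uint8_t> alpha_to_u8(const std::vector<float>& alpha) {
    std::vector<uint8_t> out(alpha.size());
    for (size_t i = 0; i < alpha.size(); ++i) {
        float v = std::clamp(alpha[i], 0.0f, 1.0f);  // guard stray values
        out[i] = static_cast<uint8_t>(std::lround(v * 255.0f));
    }
    return out;
}
```

Equivalently, on the CV_32FC1 Mat you could call out_gray_img2.convertTo(dst, CV_8UC1, 255.0), which applies the same scale-then-saturate conversion.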

@wuyefeilin, would you be able to let me know what I have missed? Thank you very much!

antithing · Jun 22 '22 16:06