
How to postprocess data from the product-detection-100 OpenVINO model while using OVMS

NeethuES-intel opened this issue 7 months ago · 0 comments

Describe the bug
I followed the vehicle analysis pipeline example below to set up a pipeline through OVMS using the OpenVINO product-detection-100 model: https://github.com/openvinotoolkit/model_server/blob/main/demos/vehicle_analysis_pipeline/python/vehicles_analysis_pipeline.py

I was able to preprocess the data and pass it to inference; however, I am unable to postprocess the output. Can I get some help with postprocessing?

Following is the output that I am receiving (tensor_content truncated):

```
outputs { key: "868" value { dtype: DT_FLOAT tensor_shape { dim { size: 1 } dim { size: 1 } dim { size: 200 } dim { size: 7 } } tensor_content: "\000\000\000\000\000\000\000@=\370\223=\314l\371=\334\334\322=^\177\215>~{\n?\000\000\000\000\000\000@@\346\247u?X\2429?\320Q\312=\366\333_?h\006\370>\000\000\000\000\000\0000A\332\277\177?\0366\267>\036\253\314=\346\216\035?\016\365\025?\000\000\000\000\000\0000A\323\215c=\346\3169?t\263\334=l/?\373S\372>\000\000\000\000\000\000PAE\306u?\274\264\372=N\241\304=\031\014\216>\352A\n?\000\000\200\277\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\0
```
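For reference, the shape [1, 1, 200, 7] matches the common OpenVINO SSD-style detection output, where each of the 200 rows is [image_id, label, confidence, x_min, y_min, x_max, y_max] and the box coordinates are normalized to [0, 1]. Below is a minimal client-side sketch of how such a response could be parsed; the output name "868" comes from the dump above, while the confidence threshold, image dimensions, and function name are illustrative assumptions, so adjust them to your pipeline.

```python
import numpy as np

def postprocess_detections(response, output_name="868", conf_threshold=0.5,
                           image_width=640, image_height=480):
    """Parse a [1, 1, 200, 7] detection output from a gRPC PredictResponse.

    Assumes each row follows the usual OpenVINO detection layout:
    [image_id, label, confidence, x_min, y_min, x_max, y_max],
    with box coordinates normalized to [0, 1].
    """
    proto = response.outputs[output_name]
    # Recover the shape reported by the server, e.g. (1, 1, 200, 7)
    shape = [d.size for d in proto.tensor_shape.dim]
    detections = np.frombuffer(proto.tensor_content, dtype=np.float32).reshape(shape)

    results = []
    for det in detections[0, 0]:
        image_id, label, confidence, x_min, y_min, x_max, y_max = det
        if image_id < 0:
            # an image_id of -1 marks the end of valid detections
            break
        if confidence < conf_threshold:
            continue
        results.append({
            "label": int(label),
            "confidence": float(confidence),
            # scale normalized coordinates back to pixel values
            "box": (int(x_min * image_width), int(y_min * image_height),
                    int(x_max * image_width), int(y_max * image_height)),
        })
    return results
```

If you are already using the TensorFlow Serving client libraries, tf.make_ndarray(response.outputs["868"]) is an alternative way to obtain the numpy array before applying the same filtering.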

To Reproduce
Steps to reproduce the behavior:

  1. Steps to prepare models repository '...'
  2. OVMS launch command '....'
  3. Client command (additionally client code if not using official client or demo) '....'
  4. See error

Expected behavior
A clear and concise description of what you expected to happen.

Logs
Logs from OVMS, ideally with --log_level DEBUG. Logs from the client.

Configuration

  1. OVMS version
  2. OVMS config.json file
  3. CPU and accelerator versions, if applicable
  4. Model repository directory structure
  5. Model or publicly available similar model that reproduces the issue

Additional context
Add any other context about the problem here.

NeethuES-intel, Jun 27 '24 16:06