
can't find models

Sparkxxx opened this issue 3 years ago · 0 comments

Hello there,

I'm getting started with AI and computer vision and am trying to test the app, but I'm having a hard time finding suitable models and configuring the application, for both TensorFlow and YOLO.

What I've done so far, besides building and running the container:

  1. Downloaded models and pbtxt files from https://github.com/opencv/opencv/wiki/TensorFlow-Object-Detection-API, namely Inception-SSD v2 and MobileNet-SSD v3.
  2. Removed all other files from the archives and renamed the remaining ones to frozen_inference_graph.pb and object-detection.pbtxt.
  3. Created config.json (not Config.json as stated in the docs) with the following content:

     ```json
     {
       "inference_engine_name": "tensorflow_detection",
       "confidence": 60,
       "predictions": 15,
       "number_of_classes": 2,
       "framework": "tensorflow",
       "type": "detection",
       "network": "mobilenet"
     }
     ```

  4. Ran the container; when trying to load the models I get this error (see the sketch right after this list):

     ```
     lab_tensorflow | a bytes-like object is required, not 'str'
     lab_tensorflow | Error loading model
     lab_tensorflow | INFO: 10.xx.0.xx:63174 - "GET /load HTTP/1.1" 200 OK
     ```
Questions:

  1. What pretrained models are suitable, and where can I find them?
  2. What is the purpose of models_hash?
  3. Where do I find the right config.json "network" value, and is mine correct?
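For reference, this is roughly how I'm triggering the load, matching the `GET /load` line in the log above (the host/port mapping is my own assumption from my docker run command, not something from the docs):

```python
# Sketch of how I hit the endpoint (the BASE_URL port is my docker run mapping,
# not something from the docs, so adjust it to your own setup).
import requests

BASE_URL = "http://localhost:4343"  # assumed host:port of the running container

resp = requests.get(f"{BASE_URL}/load")
print(resp.status_code, resp.text)
```

The endpoint returns 200 OK even though the model fails to load, so the error only shows up in the container logs.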

Thank you.

Sparkxxx · May 26 '21 10:05