BMW-TensorFlow-Inference-API-CPU
can't find models
Hello there,
I'm getting started with AI and computer vision, and I'm trying to test the app, but I'm having a hard time finding suitable models and configuring the application (both TensorFlow and YOLO).
What I've done so far, besides building and running the container:
- Downloaded models and .pbtxt files from https://github.com/opencv/opencv/wiki/TensorFlow-Object-Detection-API, namely Inception-SSD v2 and MobileNet-SSD v3.
- Removed all other files from the archives and renamed the remaining ones to frozen_inference_graph.pb and object-detection.pbtxt.
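For reference, this is the folder layout I ended up with. This is a sketch of my own setup, not something mandated by the docs; the folder name "mobilenet" is my choice, and I'm assuming the API scans a mounted models/ directory with one subfolder per model:

```python
from pathlib import Path

# Recreate my model folder layout (the files here are just empty
# placeholders for illustration; the real ones come from the archives).
model_dir = Path("models") / "mobilenet"
model_dir.mkdir(parents=True, exist_ok=True)
for name in ("frozen_inference_graph.pb",
             "object-detection.pbtxt",
             "config.json"):
    (model_dir / name).touch()

print(sorted(p.name for p in model_dir.iterdir()))
# ['config.json', 'frozen_inference_graph.pb', 'object-detection.pbtxt']
```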
- Created config.json (not Config.json as stated in the docs) with the following content:
{
    "inference_engine_name": "tensorflow_detection",
    "confidence": 60,
    "predictions": 15,
    "number_of_classes": 2,
    "framework": "tensorflow",
    "type": "detection",
    "network": "mobilenet"
}
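To rule out a malformed file, I also wrote the config out programmatically and read it back (this is just my own sanity check, not part of the API):

```python
import json

# The exact config I'm using; dumping and reloading it confirms
# the file on disk is valid JSON before the container reads it.
config = {
    "inference_engine_name": "tensorflow_detection",
    "confidence": 60,
    "predictions": 15,
    "number_of_classes": 2,
    "framework": "tensorflow",
    "type": "detection",
    "network": "mobilenet",
}
with open("config.json", "w") as f:
    json.dump(config, f, indent=4)

with open("config.json") as f:
    loaded = json.load(f)

print(loaded["network"])  # mobilenet
```

It round-trips cleanly, so the JSON itself looks fine to me.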
- Ran the container, and when trying to load the models I get this error:
lab_tensorflow | a bytes-like object is required, not 'str'
lab_tensorflow | Error loading model
lab_tensorflow | INFO: 10.xx.0.xx:63174 - "GET /load HTTP/1.1" 200 OK
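My guess is that somewhere a binary file is being read in text mode, since that's the usual source of this exact message. Here's my minimal repro of the Python behaviour (nothing below is from the API's own code):

```python
import os
import tempfile

# A .pb file is a binary protobuf, so it must be opened in "rb" mode.
# Opening it in text mode ("r") yields a str instead of bytes, and a
# parser expecting bytes (e.g. protobuf's ParseFromString) then raises
# "a bytes-like object is required, not 'str'".
fd, path = tempfile.mkstemp(suffix=".pb")
os.close(fd)
with open(path, "wb") as f:
    f.write(b"\x0a\x04test")      # arbitrary binary payload

with open(path, "rb") as f:       # correct: returns bytes
    data = f.read()

with open(path, "r") as f:        # wrong for .pb: returns str
    text = f.read()

print(type(data).__name__, type(text).__name__)  # bytes str
os.remove(path)
```

So the error may not be about my files at all, but I can't tell whether it's my setup or the loader that triggers it.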
Questions:
- What pretrained models are suitable, and where can I find them?
- What is the purpose of models_hash?
- What is the right config.json "network" value, and where is it documented? Is mine correct?
Thank you.