mobile-deeplab-v3-plus
Using DeepLab in an iOS application
@nolanliou Hi, I have trained DeepLab on my custom dataset (200*150 images) with 224 as the crop size, and at test time it detects fine with a 224 crop. Now I need to integrate my model into an iOS application. I was able to successfully convert the model to TFLite, but on the phone it doesn't detect anything, and I don't understand what the problem is: when I convert the pretrained DeepLab MobileNet model that you mentioned, it works for me on mobile, but my own model does not. However, I have tested my model (the .pb model) with Python code and it does detect. Do you have any idea?
- Did the conversion to the TFLite model go all right?
- Make sure the pre-processing and post-processing code is right (see the sketch below for what the model typically expects).
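For reference, here is a minimal sketch of the pre/post-processing a float DeepLab TFLite export like this usually expects, assuming a 224x224 input normalized with mean 128 and std 127 (matching the conversion flags further down); the file names are placeholders.

```python
# Minimal sketch: run a float DeepLab .tflite model with the usual
# pre/post-processing. Assumes mean=128 / std=127 normalization and a
# 224x224 crop; "frozen-224.tflite" and "test.jpg" are placeholders.
import numpy as np
import tensorflow as tf
from PIL import Image

interpreter = tf.lite.Interpreter(model_path="frozen-224.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Pre-processing: resize to the crop size and normalize to roughly [-1, 1].
img = Image.open("test.jpg").convert("RGB").resize((224, 224))
x = (np.asarray(img, dtype=np.float32) - 128.0) / 127.0
interpreter.set_tensor(inp["index"], x[np.newaxis, ...])
interpreter.invoke()

# Post-processing: the output is per-pixel class logits; argmax gives the mask.
logits = interpreter.get_tensor(out["index"])[0]   # (H, W, num_classes)
mask = np.argmax(logits, axis=-1).astype(np.uint8)
print("classes present:", np.unique(mask))
```

If the .pb model detects in Python but the TFLite model returns an empty mask, a mismatch in this normalization or in how the output logits are reduced to a mask is the usual suspect.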
@nolanliou
1) Converting to the TFLite format (the sanity-check sketch after this list shows one way to verify the result):
tflite_convert --output_format=TFLITE --inference_type=FLOAT --inference_input_type=FLOAT --input_arrays=sub_2 --input_shapes=1,224,224,3 --output_arrays=ResizeBilinear_2 --output_file=/Users/Karizma/Downloads/deeplabv3_mnv2_pascal_trainvall/frozen-224.tflite --graph_def=/Users/Karizma/Downloads/deeplabv3_mnv2_pascal_trainvall/frozen-224.pb --mean_values=128 --std_dev_values=127 --allow_custom_ops --post_training_quantize
2) Pre-processing and post-processing code: I referred to this project:
https://github.com/toniz/deeplab-on-ios
Excuse my ignorance, I'm new to iOS stuff.
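One way to rule out the conversion itself is to run the same input through the frozen graph and through the converted .tflite file and compare the outputs. A rough sketch, written against the TF 1.x API that the tflite_convert call above belongs to; the paths and node names simply follow the command above.

```python
# Sanity check: does the converted .tflite model agree with the frozen .pb
# graph on the same (already normalized) input? Targets TF 1.x.
import numpy as np
import tensorflow as tf

x = np.random.uniform(-1.0, 1.0, size=(1, 224, 224, 3)).astype(np.float32)

# Frozen graph, fed directly at the converter's input node (sub_2).
graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen-224.pb", "rb") as f:
    graph_def.ParseFromString(f.read())
with tf.Graph().as_default() as g:
    tf.import_graph_def(graph_def, name="")
    with tf.Session(graph=g) as sess:
        ref = sess.run("ResizeBilinear_2:0", feed_dict={"sub_2:0": x})

# Converted TFLite model on the same input.
interpreter = tf.lite.Interpreter(model_path="frozen-224.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
lite = interpreter.get_tensor(out["index"])

print("max abs diff :", np.abs(ref - lite).max())
print("argmax agrees:", (ref.argmax(-1) == lite.argmax(-1)).mean())
```

If the difference turns out to be large, it may be worth re-running the conversion without --post_training_quantize to see whether the weight quantization is what hurts the custom model.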
What I don't get is that:
1) The same Swift code works with another model, e.g. https://storage.googleapis.com/download.tensorflow.org/models/tflite/gpu/deeplabv3_257_mv_gpu.tflite, but when I plug in my own model it doesn't work (the comparison sketch after this list prints what each model expects).
2) My model detects fine in the Python inference code.
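Since the reference deeplabv3_257_mv_gpu.tflite model takes a 257x257 input while the custom export uses 224x224, one thing worth checking is whether the Swift code from deeplab-on-ios hard-codes the tensor sizes and class count. A small sketch, assuming both files are available locally (the file names are placeholders), that prints what each model actually expects:

```python
# Compare input/output tensor details of the working reference model and the
# custom export, to see what the iOS code would need to match.
import tensorflow as tf

for path in ["deeplabv3_257_mv_gpu.tflite", "frozen-224.tflite"]:
    interpreter = tf.lite.Interpreter(model_path=path)
    interpreter.allocate_tensors()
    print(path)
    for d in interpreter.get_input_details():
        print("  input ", d["name"], d["shape"], d["dtype"])
    for d in interpreter.get_output_details():
        print("  output", d["name"], d["shape"], d["dtype"])
```

If the shapes or the number of output classes differ between the two models, the hard-coded width, height, and label count in the iOS pre/post-processing would need to be updated to match the custom model.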