
save and load model on GPU

Open mops1112 opened this issue 3 years ago • 5 comments

Hi, TF 2.3.0 works on both CPU and GPU, but detection with yolov4-tiny-416 is slow. So I moved to TF 2.7 instead; it works on CPU, but on GPU it only detects objects in the first frame, and subsequent frames produce no detections. I suspect the problem is in save_model.py.

I load a model by this

import tensorflow as tf
from tensorflow.python.saved_model import tag_constants

# Raw string so the backslashes are not interpreted as escape sequences on Windows
yolo4_weight = r'.\checkpoints\yolov4-tiny-416'
saved_model_loaded = tf.saved_model.load(yolo4_weight, tags=[tag_constants.SERVING])
infer = saved_model_loaded.signatures['serving_default']

mops1112 avatar Jan 13 '22 03:01 mops1112

Same issue here. Any solutions?

PuneethBC avatar Feb 10 '22 10:02 PuneethBC

I found a solution. If you run save_model.py on a CPU-only machine, the generated model fails when run on video in GPU mode: it gives valid output only for the first frame, so you would essentially have to reload the model every frame. If save_model.py is instead run on a GPU machine with CUDA devices made visible, the resulting model runs on both CPU and GPU without any issues. I have tested this on both Windows and Ubuntu, and it works.
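For anyone trying to reproduce the workaround, a sketch of running the conversion with the GPU visible. The save_model.py flags below are assumptions based on typical usage of this repo, not taken from this thread; verify them against the README of your checkout:

```shell
# An env-var prefix applies CUDA_VISIBLE_DEVICES to that one process only,
# and here it is the conversion process that must see the GPU:
CUDA_VISIBLE_DEVICES=0 python3 -c "import os; print(os.environ['CUDA_VISIBLE_DEVICES'])"

# Hypothetical invocation -- flag names are assumptions, check the repo README:
# CUDA_VISIBLE_DEVICES=0 python save_model.py --weights ./data/yolov4-tiny.weights \
#     --output ./checkpoints/yolov4-tiny-416 --input_size 416 --model yolov4 --tiny
```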

PuneethBC avatar Feb 10 '22 13:02 PuneethBC

PuneethBC

How did you do it? I added os.environ["CUDA_VISIBLE_DEVICES"] = '0' in save_model.py, but it doesn't work.
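One thing worth checking (a guess, not verified against this repo): CUDA_VISIBLE_DEVICES is read once, when TensorFlow first initializes CUDA, so setting it after TensorFlow has already been imported (directly, or indirectly via another module) has no effect. A minimal ordering guard, assuming it is placed at the very top of save_model.py:

```python
import os
import sys

# The variable only takes effect if set BEFORE TensorFlow is imported
# anywhere in this process, so fail loudly if that has already happened.
if "tensorflow" in sys.modules:
    raise RuntimeError("Set CUDA_VISIBLE_DEVICES before TensorFlow is imported")
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# import tensorflow as tf  # only import TF after the variable is set
```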

larry3425527 avatar Jun 24 '22 11:06 larry3425527

It did not work for me either; the second inference is always invalid.

Is there a solution for this? I tried TensorFlow 2.4 through 2.9, with no joy.

lifgren avatar Jul 27 '22 22:07 lifgren

Reading 'https://githubmemory.com/repo/google/automl/issues/896', there is a known issue with TF 2.2.0 that was fixed in 2.3.1.

This worked for me: pip install tensorflow-gpu==2.3.1

lifgren avatar Jul 27 '22 22:07 lifgren