keras-io
RetinaNet
Hi, I found my dream in this repo! A tf.keras implementation of RetinaNet in a single Jupyter notebook! But I ran into a couple of issues:
- The input pipeline seems to be very CPU-intensive, so GPU utilization stays low.
- My custom dataset uses this simple CSV format, and I don't know how to feed it into RetinaNet:
filename,width,height,class,xmin,ymin,xmax,ymax
/home/Dataset/17021.jpg,800,800,class2,402,358,446,418
/home/Dataset/17021.jpg,800,800,class2,349,348,375,410
/home/Dataset/17021.jpg,800,800,class2,352,438,383,513
/home/Dataset/17021.jpg,800,800,class2,399,408,462,485
/home/Dataset/17021.jpg,800,800,class2,361,566,389,641
/home/Dataset/22103.jpg,800,800,class2,81,337,116,355
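A first step for feeding a per-row CSV like this into a detection pipeline is grouping the rows into per-image records, since each image can have several boxes. Below is a minimal pure-Python sketch (the helper name `load_annotations` is hypothetical, not part of the keras-io example); it assumes the exact header shown above. The resulting records could then be fed to `tf.data.Dataset.from_generator`, after mapping class names to integer ids and matching whatever box format the example's preprocessing expects.

```python
import csv
from collections import defaultdict

def load_annotations(csv_path):
    """Group per-row CSV annotations into per-image records.

    Assumes the header: filename,width,height,class,xmin,ymin,xmax,ymax
    (the format shown above). Returns a dict mapping each filename to a
    dict with 'boxes' ([[xmin, ymin, xmax, ymax], ...] in pixels) and
    'classes' (the class name of each box, in the same order).
    """
    records = defaultdict(lambda: {"boxes": [], "classes": []})
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            rec = records[row["filename"]]
            rec["boxes"].append(
                [float(row["xmin"]), float(row["ymin"]),
                 float(row["xmax"]), float(row["ymax"])]
            )
            rec["classes"].append(row["class"])
    return dict(records)
```

Note that the keras-io example builds its dataset from TFDS, so the box format it expects downstream (corner vs. center coordinates, normalized vs. pixel) should be checked against its preprocessing functions before wiring this in.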
@srihari-humbarwadi
+1 I would also like to see how a custom dataset can be used with the RetinaNet example.
Thanks @srihari-humbarwadi
I tried implementing the pipeline using tf.data.Dataset.from_generator with my own dataset, whose annotations are in JSON format, but the loss becomes NaN shortly after training starts.
I tried both OpenCV for loading the images (converting them to RGB and float32) and tf.image.decode_image, but no luck.
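One common cause of NaN loss in detection training (alongside a too-high learning rate) is degenerate or out-of-range boxes in the annotations, e.g. zero-width boxes, which can produce log(0) or divisions by zero in the box-regression targets. As a sketch under that assumption, here is a small pure-Python sanity check (`find_invalid_boxes` is a hypothetical helper, not part of the example) one could run over the annotations before training:

```python
def find_invalid_boxes(boxes, img_width, img_height):
    """Return indices of boxes likely to produce NaN/Inf loss values.

    Flags boxes with non-positive width or height, and boxes whose
    corners fall outside the image. `boxes` is a list of
    [xmin, ymin, xmax, ymax] coordinates in pixels.
    """
    bad = []
    for i, (xmin, ymin, xmax, ymax) in enumerate(boxes):
        degenerate = xmax <= xmin or ymax <= ymin
        out_of_bounds = (
            xmin < 0 or ymin < 0 or xmax > img_width or ymax > img_height
        )
        if degenerate or out_of_bounds:
            bad.append(i)
    return bad
```

If this flags nothing, the next things to check would be the class-id range (it must stay within the number of classes the model was built with) and lowering the learning rate for the first few epochs.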
Hello @vijaygill, did you manage to find a solution for the NaN loss? I am facing the same problem.
Hi @adavradou, unfortunately no, and I stopped trying after I posted my message above.
@elsa-1920 did you find out how to load datasets in the CSV format?
@adavradou did you find a solution? I also tried loading the data with tf.data.Dataset.from_generator but got a NaN loss immediately. @srihari-humbarwadi any solution for this?
Hello @nikeshdevkota! Unfortunately, I found no solution either.
Hi @elsa-1920,
To load your own custom dataset, you may use the tf.keras.utils.image_dataset_from_directory API, which generates a tf.data.Dataset from image files in a directory.
Please check this API and confirm whether it is useful for your case. Thanks!
This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.
This issue was closed because it has been inactive for 28 days. Please reopen if you'd like to work on this further.