Semantic-Segmentation-Tensorflow-Eager
Move to Dataset API
Replace the custom loader you are using with the Dataset API. Keep the same functionality, such as data augmentation, and change the rest of the code to integrate it.

https://www.tensorflow.org/tutorials/eager/eager_basics#datasets
https://www.tensorflow.org/performance/datasets_performance
https://colab.research.google.com/github/tensorflow/tensorflow/blob/master/tensorflow/contrib/eager/python/examples/generative_examples/image_captioning_with_attention.ipynb
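
A minimal sketch of what the switch could look like, assuming hypothetical `image_paths` / `label_paths` lists of matching PNG file names; the loader and the pipeline parameters here are illustrative, not the repo's actual code:

```python
import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x; eager is the default in TF 2.0

def load_pair(image_path, label_path):
    # Decode an RGB image and its single-channel segmentation mask.
    image = tf.image.decode_png(tf.read_file(image_path), channels=3)
    label = tf.image.decode_png(tf.read_file(label_path), channels=1)
    return image, label

# image_paths / label_paths are hypothetical lists of matching file names.
dataset = (tf.data.Dataset.from_tensor_slices((image_paths, label_paths))
           .shuffle(buffer_size=1000)
           .map(load_pair, num_parallel_calls=4)
           .batch(8)
           .prefetch(1))  # overlap preprocessing with training

for images, labels in dataset:  # plain Python iteration under eager execution
    pass  # training step goes here
```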
Once the Dataset API is integrated, move to the TFRecords data format: http://warmspringwinds.github.io/tensorflow/tf-slim/2016/12/21/tfrecords-guide/
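
A hedged sketch of the TFRecords round trip in the style of the linked guide, assuming `image` and `label` are pre-loaded uint8 NumPy arrays; the file name and feature keys are illustrative:

```python
import tensorflow as tf

def _bytes_feature(value):
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))

# Writing: serialize one tf.train.Example per image/label pair.
with tf.python_io.TFRecordWriter('train.tfrecords') as writer:
    example = tf.train.Example(features=tf.train.Features(feature={
        'image': _bytes_feature(image.tobytes()),
        'label': _bytes_feature(label.tobytes()),
    }))
    writer.write(example.SerializeToString())

# Reading: TFRecordDataset plugs straight into the Dataset API pipeline.
def parse(serialized):
    features = tf.parse_single_example(serialized, features={
        'image': tf.FixedLenFeature([], tf.string),
        'label': tf.FixedLenFeature([], tf.string),
    })
    image = tf.decode_raw(features['image'], tf.uint8)
    label = tf.decode_raw(features['label'], tf.uint8)
    return image, label

dataset = tf.data.TFRecordDataset('train.tfrecords').map(parse)
```

Note that raw bytes lose the array shapes, so in practice you would also store height/width as int64 features and reshape after tf.decode_raw.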
Remember that pre-processed file storage is a double-edged sword. It is generally faster, but realtime augmentation, if performant enough, lets you change your augmentation hypotheses on the fly without having to pre-process a large dataset all over again.
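
For example, keeping the random ops inside the `map` stage (a sketch; this `augment` function is hypothetical, not the repo's) means a new augmentation hypothesis is just an edit and a re-run:

```python
import tensorflow as tf

def augment(image, label):
    # Apply the same random horizontal flip to the image and its mask.
    flip = tf.random_uniform([]) < 0.5
    image = tf.cond(flip, lambda: tf.image.flip_left_right(image), lambda: image)
    label = tf.cond(flip, lambda: tf.image.flip_left_right(label), lambda: label)
    # Photometric noise goes on the image only, never on the mask.
    image = tf.image.random_brightness(image, max_delta=0.1)
    return image, label

# Plugs into the dataset from the sketch above, before batching.
dataset = dataset.map(augment, num_parallel_calls=4)
```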
Starting to code it: ef739ba2a6f042506aa7e873c9d6aaaac9a75037. I want to learn about these APIs (I hope TensorFlow 2.0 will keep them), so I will take my time to do it well, understand it and code it :)