
Updates to VAE (and ops) to enable variable batch sizes

Open dribnet opened this issue 6 years ago • 0 comments

This is a great collection of generative models for TensorFlow, all nicely wrapped in a common class interface. I'd like to use it as a basis for ongoing work I'm migrating to TensorFlow. I'm interested in this code not only for testing MNIST models, but also as a way of generating a series of reference models on several other datasets that can be reused and shared.

So as a first proposed change, I'd like to decouple batch_size from the model definition and make it a runtime variable by using a placeholder. This allows:

  1. a trained model can later be opened without knowing the batch_size used at training time
  2. the encoder/decoder can be called on x/z inputs of variable length
  3. a trained model can later be refined via transfer learning at a different batch_size
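To illustrate the idea (this is a generic sketch, not the repo's actual code; the tensor and variable names here are hypothetical), the change amounts to giving the input placeholder a `None` leading dimension so any batch size can be fed at runtime:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style graph mode, as used in this repo

tf.disable_eager_execution()

# Before: inputs = tf.placeholder(tf.float32, [batch_size, 784])
# After: leave the leading (batch) dimension as None
inputs = tf.placeholder(tf.float32, [None, 784], name="x")

# A toy layer standing in for the encoder; nothing here depends on a
# static batch_size constant, so the graph works for any batch.
w = tf.get_variable("w", [784, 64],
                    initializer=tf.glorot_uniform_initializer())
z = tf.matmul(inputs, w)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # The same graph handles two different batch sizes at run time.
    out_small = sess.run(z, {inputs: np.zeros((3, 784), np.float32)})
    out_large = sess.run(z, {inputs: np.zeros((100, 784), np.float32)})
    print(out_small.shape, out_large.shape)  # (3, 64) (100, 64)
```

Any op whose shape previously referenced the static batch_size would instead use `tf.shape(inputs)[0]`, which is resolved per feed.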

I've done a quick version of this for the VAE model and verified that the model still works (at least on the latest TensorFlow) and that it enables (1) and (2) above. If you are open to the spirit of this change, I'm happy to rework the implementation if you'd like it cleaned up further.
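For the VAE specifically, the one place a static batch_size typically sneaks in is the reparameterization step, where the noise tensor is sampled with an explicit shape. A hedged sketch of the fix (variable names here are illustrative, not necessarily the repo's):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style graph mode

tf.disable_eager_execution()

# Encoder outputs with a variable batch dimension.
mu = tf.placeholder(tf.float32, [None, 2], name="mu")
log_sigma2 = tf.placeholder(tf.float32, [None, 2], name="log_sigma2")

# Before: eps = tf.random_normal([batch_size, z_dim])
# After: take the shape from the tensor itself, resolved at run time.
eps = tf.random_normal(tf.shape(mu))
z = mu + tf.exp(0.5 * log_sigma2) * eps  # z = mu + sigma * eps

with tf.Session() as sess:
    sample = sess.run(z, {mu: np.zeros((5, 2), np.float32),
                          log_sigma2: np.zeros((5, 2), np.float32)})
    print(sample.shape)  # (5, 2)
```

With this change the sampling op adapts to whatever batch is fed in, which is what makes (1) and (2) work.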

dribnet · Apr 01 '18 12:04