`allow_growth` doesn't seem to work
The Python documentation mentions an option to prevent TensorFlow from allocating all the GPU memory at once (https://www.tensorflow.org/tutorials/using_gpu#allowing_gpu_memory_growth).
The equivalent Julia code doesn't seem to have any effect; the following still allocates the whole GPU according to nvidia-smi:
using TensorFlow

config = TensorFlow.tensorflow.ConfigProto()
config.gpu_options = TensorFlow.tensorflow.GPUOptions()
config.gpu_options.allow_growth = true
session = Session(Graph(), config)
Any idea what is going on? Could it be that I am just misunderstanding the intent of allow_growth?
You are right about the intended usage, and it should work. I'll check it out.
Since Session(allow_growth=true) works, here is how that method does it:
using TensorFlow
import ProtoBuf

config = TensorFlow.tensorflow.ConfigProto()
gpu_options = TensorFlow.tensorflow.GPUOptions()
# set_field! actually records the value in the protobuf message
ProtoBuf.set_field!(gpu_options, :allow_growth, true)
ProtoBuf.set_field!(config, :gpu_options, gpu_options)
Session(Graph(), config)
The big difference is the ProtoBuf.set_field! call instead of plain field assignment.
Oh yes, that's a limitation of ProtoBuf.jl and ultimately of the lack of dot-overloading in Julia: a plain field assignment doesn't register the value the way ProtoBuf.set_field! does. Now I remember why I wrote the convenience allow_growth keyword option in the first place.
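For completeness, the short route looks like this (a minimal sketch; it just uses the Session(allow_growth=true) form mentioned above, which goes through set_field! internally):

using TensorFlow

# Convenience keyword from above; it sets the protobuf field via
# ProtoBuf.set_field! under the hood, as shown in the previous snippet.
session = Session(allow_growth=true)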
Personally I think this workaround is "good enough". Feel free to close this issue.
Let's document it somewhere before closing.