
2 comments by skeletonli

Thank you for your reply. I had tried setting batch_size to 64 and even 32, but it still gets an error. I found that the problem appears in the code...
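
As a minimal sketch of the batch-size workaround mentioned above (the original training script is not shown, so the data shape, `iterate_batches` helper, and loop below are hypothetical illustrations of the general technique):

```python
import numpy as np

# Hypothetical stand-in for the real dataset, which the comment does not show.
data = np.random.rand(1024, 224, 224, 3).astype(np.float32)

def iterate_batches(array, batch_size=32):
    """Yield fixed-size batches; a smaller batch_size lowers peak GPU memory."""
    for start in range(0, len(array), batch_size):
        yield array[start:start + batch_size]

# Per the comment, batch sizes of 64 and 32 were both tried.
for batch in iterate_batches(data, batch_size=32):
    pass  # feed each batch to the model here
```

Shrinking the batch size reduces the memory needed for activations per step, which is the usual first fix for an out-of-memory error, though it does not help if a single example or the model weights already exceed available GPU memory.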

I tried to use: `config = tf.ConfigProto()`, `config.gpu_options.per_process_gpu_memory_fraction = 0.9`, `config.gpu_options.allow_growth = True`, `config.log_device_placement = True`. Although the GPU memory usage is lower, it still crashes with an OOM error when running eval. Thu...
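
Assembled into a runnable sketch (assuming TensorFlow 1.x, where `tf.ConfigProto` and `tf.Session` exist; the placeholder graph at the end is illustrative, since the real eval graph is not shown):

```python
import tensorflow as tf

# Session options from the comment above.
config = tf.ConfigProto()
# Cap this process at 90% of the GPU's total memory.
config.gpu_options.per_process_gpu_memory_fraction = 0.9
# Allocate GPU memory incrementally instead of reserving it all upfront.
config.gpu_options.allow_growth = True
# Log which device (CPU/GPU) each op is placed on.
config.log_device_placement = True

# Hypothetical placeholder graph, only to make the sketch self-contained.
x = tf.placeholder(tf.float32, shape=[None, 10])
y = tf.reduce_mean(x)

with tf.Session(config=config) as sess:
    print(sess.run(y, feed_dict={x: [[1.0] * 10]}))
```

Note that the two memory options overlap in intent: `allow_growth=True` starts small and expands as needed, while `per_process_gpu_memory_fraction` sets a hard ceiling; either alone is usually sufficient, and neither helps if eval genuinely needs more memory than the GPU has.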