
Memory leak


Thanks for sharing your code. When I train the model on my own dataset, I found a memory leak when running this code:

training_batch = sess.run(tf.map_fn(lambda img: tf.image.per_image_standardization(img), training_batch))
groundtruth_batch = sess.run(tf.map_fn(lambda img: tf.image.per_image_standardization(img), groundtruth_batch))

After some iterations, when saving checkpoints, the GraphDef grows larger than 2GB and the program crashes. Has anyone else met this issue, and how can it be solved?

SugarShine avatar May 17 '18 03:05 SugarShine
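
Note: the leak is graph growth, not a data problem. Each call to tf.map_fn(...) inside sess.run builds fresh ops in the default graph, so the graph gets bigger every iteration until the Saver tries to serialize a GraphDef past the 2GB protobuf limit. A minimal sketch of the fix, assuming a TF1-style session loop — build the op once, run it many times (the placeholder name, batch shape, step count, and dummy batches below are illustrative, not from this repo):

```python
import numpy as np
import tensorflow as tf

# Build the standardization op ONCE, before the training loop.
image_ph = tf.placeholder(tf.float32, shape=[None, 64, 64, 3])
standardize_op = tf.map_fn(tf.image.per_image_standardization, image_ph)

with tf.Session() as sess:
    for step in range(100):
        # Dummy batches stand in for your own data loader here.
        raw_training_batch = np.random.rand(8, 64, 64, 3).astype(np.float32)
        raw_groundtruth_batch = np.random.rand(8, 64, 64, 3).astype(np.float32)
        # Running a pre-built op adds no new nodes to the graph.
        training_batch = sess.run(
            standardize_op, feed_dict={image_ph: raw_training_batch})
        groundtruth_batch = sess.run(
            standardize_op, feed_dict={image_ph: raw_groundtruth_batch})
```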

Yes, there is a memory leak. I'm actually working on a new version, but it's not quite ready yet. In the meantime, you can try replacing the above code with your own normalization function.

manumathewthomas avatar May 17 '18 05:05 manumathewthomas
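
If you'd rather drop the TensorFlow op entirely, a plain NumPy replacement is easy to write. This is a sketch of an equivalent of tf.image.per_image_standardization (the function name is mine, not the repo's); like the TF op, it floors the stddev at 1/sqrt(num_pixels) so constant images don't divide by zero:

```python
import numpy as np

def standardize_batch(batch):
    """NumPy stand-in for tf.image.per_image_standardization over a batch."""
    out = np.empty(batch.shape, dtype=np.float32)
    for i, img in enumerate(batch):
        img = img.astype(np.float32)
        # Same floor as the TF op: max(stddev, 1/sqrt(num_pixels)).
        adjusted_std = max(img.std(), 1.0 / np.sqrt(img.size))
        out[i] = (img - img.mean()) / adjusted_std
    return out
```

The two sess.run lines above then become plain function calls, e.g. training_batch = standardize_batch(training_batch), with no TensorFlow ops created inside the loop.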

@manumathewthomas Thanks so much for your reply; I have solved the memory leak issue. But when I train the model, the loss is always NaN. Could you give me some advice, or share the new version?

SugarShine avatar May 25 '18 02:05 SugarShine

Try reducing the learning rate.

manumathewthomas avatar May 25 '18 05:05 manumathewthomas
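
For concreteness, a sketch of what that looks like in a TF1 training script; the 1e-4 value, the loss variable, and the beta1 choice are illustrative (a factor of 10 below your current rate is a reasonable first try, and beta1=0.5 is a common GAN setting):

```python
# If you were training at 1e-3, drop to 1e-4 and see whether
# the loss still blows up. Names here are illustrative.
learning_rate = 1e-4
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate, beta1=0.5)
train_op = optimizer.minimize(loss)
```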

How did you solve this problem? Can you share the fix?

houguanqun avatar Jul 06 '18 09:07 houguanqun

I also get loss = NaN when training on my dataset, even after reducing my learning rate. Can you give any advice?

codaibk avatar Aug 06 '18 16:08 codaibk
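
If a smaller learning rate doesn't fix it, NaNs in GAN training often come from log(0) in the adversarial loss. A sketch of two things worth trying; eps, d_real, d_fake, and train_op are illustrative names for the discriminator outputs and training op, not this repo's:

```python
# 1) Guard log-based GAN losses against log(0).
eps = 1e-8
d_loss = -tf.reduce_mean(tf.log(d_real + eps) + tf.log(1.0 - d_fake + eps))
g_loss = -tf.reduce_mean(tf.log(d_fake + eps))

# 2) Ask TF1 to report the first op that produces a NaN or Inf:
#    add_check_numerics_ops() instruments every float op in the graph.
check_op = tf.add_check_numerics_ops()
sess.run([train_op, check_op])
```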

> (quoting SugarShine's original report from the top of the thread)

Hi, when I train the model on CPU, I encounter the same problem: "GraphDef cannot be larger than 2GB." How did you solve it?
Also, how do you replace the dataset with your own? I don't know how to process the data.
Thank you very much. I'm a beginner and there is a lot I don't understand yet; I hope I'm not disturbing you. If you're Chinese, can I add you on WeChat?

wish829 avatar Dec 27 '18 14:12 wish829
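
Running on CPU vs. GPU makes no difference here; the 2GB GraphDef error is the same graph-growth problem discussed at the top of the thread. One way to confirm where the growth happens, as a sketch (train_op and the step count are illustrative): finalize the graph before the loop, and TensorFlow will raise an exception at the exact line that tries to add new ops.

```python
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # After finalize(), any graph construction inside the loop raises
    # a RuntimeError ("Graph is finalized and cannot be modified"),
    # pointing at the code that keeps growing the graph.
    sess.graph.finalize()
    for step in range(10000):
        sess.run(train_op)
```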