
Training on DEM SRTM dataset

Open htn274 opened this issue 4 years ago • 7 comments

Thank you for your great work. I am now applying your research to my own problem. Here is the problem:

A DEM (Digital Elevation Model) is a numerical matrix in which each pixel represents the elevation of a corresponding location.

SRTM is a global DEM dataset. However, over a reservoir or a lake, SRTM recorded the elevation of the water surface at the time of collection (in 2000). I want to recover the DEM below the water surface of a lake or reservoir.

Here is my step-by-step process:

  1. My training set is 15k DEM images.

  2. I generated a mask corresponding to each input image, so every input image has its own specific mask. Each mask is a connected area whose elevation is lower than a random number, which I think can describe a reservoir or a lake. Some of my masks with their inputs:

image
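In pseudo-form, my mask step looks roughly like this (a simplified sketch, not my exact code; `make_lake_mask`, the flood-fill seeding, and the threshold range are all illustrative):

```python
import numpy as np
from collections import deque

def make_lake_mask(dem, rng=None):
    """Flood-fill one connected region whose elevation is below a random threshold."""
    rng = rng or np.random.default_rng(0)
    thr = rng.uniform(dem.min(), dem.mean())
    h, w = dem.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    # Seed at the lowest pixel; it belongs to any below-threshold region.
    seed = np.unravel_index(np.argmin(dem), dem.shape)
    queue = deque([seed])
    mask[seed] = 1
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc] and dem[nr, nc] < thr:
                mask[nr, nc] = 1
                queue.append((nr, nc))
    return mask

dem = np.array([[5.0, 1.0, 5.0],
                [5.0, 2.0, 5.0],
                [5.0, 5.0, 5.0]])
mask = make_lake_mask(dem)  # a connected low-elevation area around the 1.0 pixel
```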

So I customized your code to read my input:

image

image

  3. I also customized data_from_fnames in the neuralgym toolkit to read '.tif' files. I then min-max normalized each DEM image to 0-255 before passing it to your model.
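The normalization is just a per-tile min-max rescale; a minimal sketch (the function name is illustrative, and actually reading the '.tif' would use a raster library such as rasterio, which I omit here):

```python
import numpy as np

def normalize_dem(dem):
    """Min-max rescale a float DEM tile to the 0-255 range the model expects."""
    lo, hi = float(dem.min()), float(dem.max())
    if hi == lo:  # flat tile: avoid division by zero
        return np.zeros_like(dem, dtype=np.float32)
    return ((dem - lo) / (hi - lo) * 255.0).astype(np.float32)

tile = np.array([[0.0, 10.0], [5.0, 10.0]])
scaled = normalize_dem(tile)  # values become 0.0, 255.0, 127.5, 255.0
```

One caveat of per-tile min-max scaling is that absolute elevation is lost, so the model only ever sees relative relief within each tile.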

  4. The first few rows of my train.flist:

image

  5. Because of my GPU's limited memory, I set the input image shape to (128, 128, 1) and the batch size to 16, following your advice in other issues:
```yaml
# training
train_spe: 1000
max_iters: 1000000
viz_max_out: 10
val_psteps: 500

static_view_size: 30
img_shapes: [128, 128, 1]
height: 128
width: 128
max_delta_height: 32
max_delta_width: 32
batch_size: 16
vertical_margin: 0
horizontal_margin: 0
```
  6. I finally trained for more than 70 epochs and got bad results:

My losses are not converging

image

image

Some of generated validation images:

image

Questions

  1. Are all my customizations correct?
  2. I saw that your sample flist is shuffled, and mine isn't. Does this affect the training result?
  3. Could you give me some suggestions and your views on my problem?

Thanks a lot! <3

htn274 avatar Jun 29 '20 02:06 htn274

Hi htn274, this seems like a nice idea. How did it work out for you? I am trying to implement a similar idea with non-random masks, and it would be very helpful to learn from your experience.

Thanks in advance

Efrat-Taig avatar Jul 13 '20 11:07 Efrat-Taig

@htn274 Thanks for your detailed feedback; that's exactly what our community needs! I will keep this issue on the front page.

Regarding your question:

  1. I think you are right, and the results already look good.
  2. Shuffling should never make results worse.
  3. A larger dataset could help.
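For example, an flist can be shuffled once offline with GNU coreutils (the file names here are just illustrative):

```shell
# Build a tiny example flist, then shuffle it in place.
printf 'a.tif\nb.tif\nc.tif\n' > train.flist
shuf train.flist -o train.flist   # GNU shuf reads all input before writing the output
wc -l < train.flist               # still 3 lines, now in random order
```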

JiahuiYu avatar Aug 13 '20 12:08 JiahuiYu

Hi there, I'm wondering exactly how you modified the inpaint network to work with this. As I currently see it, the network only works with a mask of shape (1, height, width, 1), but with your custom masks you're feeding it (batch_size, height, width, 1). I'm trying to do the same thing but getting a "Dimensions must be equal" ValueError. Any guidance?
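For context, a minimal numpy illustration of the shape problem (the shapes are hypothetical, not the repo's actual tensors): a (1, H, W, 1) mask broadcasts over any batch size, while a (B, H, W, 1) mask must match the batch dimension exactly.

```python
import numpy as np

B, H, W = 4, 8, 8
x = np.ones((B, H, W, 1))

shared_mask = np.zeros((1, H, W, 1))     # one mask for the whole batch
per_image_mask = np.zeros((B, H, W, 1))  # a different mask per image

out1 = x * (1 - shared_mask)     # OK: batch dim 1 broadcasts against 4
out2 = x * (1 - per_image_mask)  # OK: batch dims match exactly

try:
    x * np.zeros((3, H, W, 1))   # 4 vs 3: analogous to "Dimensions must be equal"
except ValueError:
    print("shape mismatch")
```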

Nico-Adamo avatar Oct 07 '20 17:10 Nico-Adamo

Ah, never mind, I figured it out! For future reference, there are quite a few more modifications needed than those shown to get custom masks working, but it's not too difficult. My code is a bit messy right now, but if anyone needs an explanation, feel free to ping me.

Nico-Adamo avatar Oct 07 '20 20:10 Nico-Adamo

Hey! I will be happy to hear your explanation. How can I contact you? Efrat ([email protected])

Efrat-Taig avatar Oct 08 '20 14:10 Efrat-Taig

How exciting! This is almost exactly what I am trying to do!

@htn274 @Nico-Adamo @Efrat-Taig I am eager to hear how it went for you and to exchange results and other experiences! You can PM me via Twitter, for example.

SvSz avatar Feb 06 '21 13:02 SvSz

As described in #444, you will have to change the batch size to 1, since one mask always serves the whole batch. You will also have to change the call in build_graph_with_losses to:

  x1, x2, offset_flow = self.build_inpaint_net(
      xin, mask, reuse=tf.AUTO_REUSE, training=training,
      padding=FLAGS.padding)

Baumflaum avatar Jan 05 '22 19:01 Baumflaum