
Pre-Train Model

Open jacobbieker opened this issue 3 years ago • 9 comments

This GAN is a bit tricky to train, and I've had trouble so far getting it to learn much; the loss just seems to stay very high. This could be because of the more limited compute I've trained it on compared to DeepMind, but having a pretrained model publicly available would be very helpful.

jacobbieker avatar Sep 06 '21 13:09 jacobbieker

We can also download their data from the repo, so we can first try their weights on their data and see if it works.

jacobbieker avatar Nov 11 '21 12:11 jacobbieker

The first pretrained weights are available! They come from a model trained for 100 epochs on the sample dataset; weights from models trained on the full UK dataset and US dataset are planned for later.

jacobbieker avatar Jun 20 '22 07:06 jacobbieker

Hello, I'm sorry to bother you. During training, my discriminator loss stays at 2. Did you encounter this situation while training this network?

clearlyzero avatar Nov 11 '22 07:11 clearlyzero

It's been a while since I did the training, but if I remember correctly, yes, somewhat. The loss should still bounce around, but it stays relatively high.

jacobbieker avatar Nov 11 '22 08:11 jacobbieker

> It's been a while since I did the training, but if I remember correctly, yes, somewhat. The loss should still bounce around, but it stays relatively high.

Thanks. Does this mean that the minimum value of the discriminator loss is 2?

clearlyzero avatar Nov 11 '22 08:11 clearlyzero

No, but it shouldn't drop to 0 or near 0; otherwise the discriminator is essentially too good at discriminating, and the generator won't get better. Unfortunately, the best way to see how well the network is training seems to be to plot some outputs every once in a while.
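A minimal sketch of that kind of monitoring, assuming a hand-rolled training loop. Everything here (function names, the window size, the threshold) is hypothetical and not from the skillful_nowcasting codebase; it just illustrates flagging the failure mode where the discriminator loss collapses toward zero:

```python
# Hypothetical helper: track recent discriminator losses and warn when
# their running average collapses toward zero, i.e. the discriminator is
# winning too easily for the generator to keep improving.
from collections import deque


def make_collapse_monitor(window=50, threshold=0.05):
    """Return a callable that records discriminator losses and reports
    True once the mean of the last `window` losses falls below
    `threshold`. Window and threshold are illustrative choices."""
    recent = deque(maxlen=window)

    def check(d_loss):
        recent.append(d_loss)
        if len(recent) < window:
            return False  # not enough history to judge yet
        return sum(recent) / len(recent) < threshold

    return check


# Healthy-looking losses: bouncing around well above zero.
monitor = make_collapse_monitor(window=3, threshold=0.1)
for loss in (2.1, 2.0, 1.9):
    collapsed_healthy = monitor(loss)
print(collapsed_healthy)  # False

# Collapsed discriminator: losses pinned near zero.
monitor = make_collapse_monitor(window=3, threshold=0.1)
for loss in (0.01, 0.02, 0.03):
    collapsed_bad = monitor(loss)
print(collapsed_bad)  # True
```

In a real loop you would call the check after each discriminator step and, when it fires (or simply every N steps), dump a few generated nowcasts to images for visual inspection, since the raw loss number alone doesn't tell you much.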

jacobbieker avatar Nov 11 '22 08:11 jacobbieker

Thank you very much for your reply. Will the generator loss increase at first and then balance out as training progresses?

clearlyzero avatar Nov 11 '22 08:11 clearlyzero

Ideally, I think both the generator loss and the discriminator loss will go up and down during training. The combined loss should go down a bit overall, but it won't get close to zero.
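One way to see that "down a bit overall" trend through the oscillation is to smooth the logged losses with an exponential moving average. This is a generic sketch, not code from the repo; the decay value and the example loss sequence are illustrative:

```python
# Hypothetical sketch: exponentially smooth a noisy loss curve so the
# overall trend is visible despite step-to-step oscillation.
def ema(values, decay=0.9):
    """Return the exponential moving average of a loss sequence.
    Higher decay = smoother curve, slower to react."""
    smoothed = []
    avg = values[0]  # seed with the first value to avoid startup bias
    for v in values:
        avg = decay * avg + (1 - decay) * v
        smoothed.append(avg)
    return smoothed


# Combined loss that bounces around but trends down without nearing zero.
losses = [5.0, 4.0, 5.5, 3.5, 4.5, 3.0, 4.0, 2.8]
trend = ema(losses, decay=0.5)
print(trend[0] > trend[-1])  # True: smoothed loss decreases overall
```

Plotting the smoothed curve alongside the raw one (and alongside periodic sample outputs) gives a much better picture of progress than watching individual loss values.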

jacobbieker avatar Nov 11 '22 08:11 jacobbieker

Thank you very much, I think I have a clue now. Thank you for taking time out of your busy schedule to reply; it's very helpful to me.

clearlyzero avatar Nov 11 '22 08:11 clearlyzero