examples
Updated the VAE example with some enhancements
Adds some enhancements to the VAE example: comments, a new way of calculating the loss (using mean instead of sum, and MSE instead of BCE), and several routines for creating visualizations of the VAE, similar to what other frameworks such as Keras provide. Here is an example output:
Train Epoch: 50 [0/60000 (0%)] Loss: 142.077072
Train Epoch: 50 [1280/60000 (2%)] Loss: 151.412460
Train Epoch: 50 [2560/60000 (4%)] Loss: 143.169495
Train Epoch: 50 [3840/60000 (6%)] Loss: 139.854614
Train Epoch: 50 [5120/60000 (9%)] Loss: 140.479279
Train Epoch: 50 [6400/60000 (11%)] Loss: 141.113678
Train Epoch: 50 [7680/60000 (13%)] Loss: 145.222717
Train Epoch: 50 [8960/60000 (15%)] Loss: 143.298386
Train Epoch: 50 [10240/60000 (17%)] Loss: 142.074677
Train Epoch: 50 [11520/60000 (19%)] Loss: 142.046265
Train Epoch: 50 [12800/60000 (21%)] Loss: 140.014236
Train Epoch: 50 [14080/60000 (23%)] Loss: 144.403824
Train Epoch: 50 [15360/60000 (26%)] Loss: 141.879059
Train Epoch: 50 [16640/60000 (28%)] Loss: 139.471130
Train Epoch: 50 [17920/60000 (30%)] Loss: 143.647278
Train Epoch: 50 [19200/60000 (32%)] Loss: 153.391830
Train Epoch: 50 [20480/60000 (34%)] Loss: 142.550720
Train Epoch: 50 [21760/60000 (36%)] Loss: 147.450180
Train Epoch: 50 [23040/60000 (38%)] Loss: 149.538467
Train Epoch: 50 [24320/60000 (41%)] Loss: 143.521545
Train Epoch: 50 [25600/60000 (43%)] Loss: 142.479950
Train Epoch: 50 [26880/60000 (45%)] Loss: 150.655380
Train Epoch: 50 [28160/60000 (47%)] Loss: 142.734924
Train Epoch: 50 [29440/60000 (49%)] Loss: 145.209045
Train Epoch: 50 [30720/60000 (51%)] Loss: 148.113403
Train Epoch: 50 [32000/60000 (53%)] Loss: 150.475647
Train Epoch: 50 [33280/60000 (55%)] Loss: 145.669861
Train Epoch: 50 [34560/60000 (58%)] Loss: 147.789429
Train Epoch: 50 [35840/60000 (60%)] Loss: 149.124664
Train Epoch: 50 [37120/60000 (62%)] Loss: 141.129578
Train Epoch: 50 [38400/60000 (64%)] Loss: 145.054382
Train Epoch: 50 [39680/60000 (66%)] Loss: 145.409058
Train Epoch: 50 [40960/60000 (68%)] Loss: 142.284454
Train Epoch: 50 [42240/60000 (70%)] Loss: 148.351013
Train Epoch: 50 [43520/60000 (72%)] Loss: 143.500214
Train Epoch: 50 [44800/60000 (75%)] Loss: 151.315079
Train Epoch: 50 [46080/60000 (77%)] Loss: 145.327087
Train Epoch: 50 [47360/60000 (79%)] Loss: 142.971786
Train Epoch: 50 [48640/60000 (81%)] Loss: 140.635880
Train Epoch: 50 [49920/60000 (83%)] Loss: 145.872925
Train Epoch: 50 [51200/60000 (85%)] Loss: 134.699051
Train Epoch: 50 [52480/60000 (87%)] Loss: 146.803940
Train Epoch: 50 [53760/60000 (90%)] Loss: 146.638092
Train Epoch: 50 [55040/60000 (92%)] Loss: 132.822876
Train Epoch: 50 [56320/60000 (94%)] Loss: 139.588501
Train Epoch: 50 [57600/60000 (96%)] Loss: 139.583817
Train Epoch: 50 [58880/60000 (98%)] Loss: 149.037277
====> Epoch: 50 Average loss: 1.1320
====> Test set loss: 148.8819
latent space: [image]

vae animation: [animation]

2d digits manifold: [image]

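The "2d digits manifold" figure above can be produced in the usual way popularized by the Keras VAE example: decode a regular grid of points from the latent plane and tile the outputs into one image. A minimal sketch, assuming a 2-D latent space and a `decoder` that maps an `(N, 2)` latent tensor to `(N, 784)` images in `[0, 1]` (both are assumptions; the exact function and latent size in this PR may differ):

```python
import numpy as np
import torch

def digits_manifold(decoder, n=15, digit_size=28, span=2.0):
    # Linearly spaced grid over the latent plane, spanning [-span, span]
    # on both axes; grid_y is reversed so the image is not upside down.
    grid_x = np.linspace(-span, span, n)
    grid_y = np.linspace(-span, span, n)[::-1]
    figure = np.zeros((digit_size * n, digit_size * n))
    with torch.no_grad():
        for i, yi in enumerate(grid_y):
            for j, xi in enumerate(grid_x):
                # Decode one latent point and paste it into the tile grid.
                z = torch.tensor([[xi, yi]], dtype=torch.float32)
                digit = decoder(z).view(digit_size, digit_size).numpy()
                figure[i * digit_size:(i + 1) * digit_size,
                       j * digit_size:(j + 1) * digit_size] = digit
    return figure
```

The result can then be shown with matplotlib, e.g. `plt.imshow(figure, cmap='gray')`.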
Acknowledgement: calculating the loss using the mean (and MSE) was inspired by the Keras version of the VAE. The visualization-related parameters and some snippets were inspired by (and taken from) this blog.
Note: tested successfully on both Windows 10 and Linux (via Google Colab) using PyTorch 1.2.0.
Hi, yes, BCE usually seems to perform better. I included the MSE option because it is also in the official Keras/TensorFlow repository; I added it for the sake of completeness, so that users can experiment freely with both options at their disposal.
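For reference, a minimal sketch of the two loss variants being compared, with the signature modeled on the `loss_function` in the official VAE example and the standard closed-form KL term (exact names in this PR may differ):

```python
import torch
import torch.nn.functional as F

def loss_bce_sum(recon_x, x, mu, logvar):
    # Current variant: summed binary cross-entropy plus summed KL divergence.
    bce = F.binary_cross_entropy(recon_x, x.view(-1, 784), reduction='sum')
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

def loss_mse_mean(recon_x, x, mu, logvar):
    # Proposed variant: mean-reduced MSE, with the KL term also averaged
    # so both terms are on a comparable per-element scale.
    mse = F.mse_loss(recon_x, x.view(-1, 784), reduction='mean')
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return mse + kld
```

The switch from `sum` to `mean` also explains why per-batch losses can differ from an averaged epoch figure by orders of magnitude.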
@Johnson-yue: What is the status of this PR? Does it add value to the current sample we have? I would appreciate it if you could decide. Thanks in advance.