Hello, I am testing with the default data_src and data_dst.
When the model is trained, the phenomenon shown in the picture below appears. How can I solve it?
At first, the loss decreases. However, after some time, the loss climbs above 10.

GPU : GTX 1080 ti
CPU : Intel(R) Core(TM) i5-6600 CPU @ 3.30GHz
RAM : 16G
driver version : 531.61 (I tried the latest version first, but the same error occurred, so I downgraded)
==================== Model Summary ====================
== ==
== Model name: new_SAEHD ==
== ==
== Current iteration: 12035 ==
== ==
==------------------ Model Options ------------------==
== ==
== resolution: 128 ==
== face_type: f ==
== models_opt_on_gpu: True ==
== archi: liae-ud ==
== ae_dims: 256 ==
== e_dims: 64 ==
== d_dims: 64 ==
== d_mask_dims: 22 ==
== masked_training: True ==
== eyes_mouth_prio: False ==
== uniform_yaw: False ==
== blur_out_mask: False ==
== adabelief: True ==
== lr_dropout: n ==
== random_warp: True ==
== random_hsv_power: 0.0 ==
== true_face_power: 0.0 ==
== face_style_power: 0.0 ==
== bg_style_power: 0.0 ==
== ct_mode: none ==
== clipgrad: False ==
== pretrain: False ==
== autobackup_hour: 0 ==
== write_preview_history: False ==
== target_iter: 0 ==
== random_src_flip: False ==
== random_dst_flip: True ==
== batch_size: 8 ==
== gan_power: 0.0 ==
== gan_patch_size: 16 ==
== gan_dims: 16 ==
== ==
==------------------- Running On --------------------==
== ==
== Device index: 0 ==
== Name: NVIDIA GeForce GTX 1080 Ti ==
== VRAM: 9.41GB ==
== ==
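For context on the clipgrad option shown above (currently False): loss spikes like this are sometimes caused by exploding gradients, and gradient clipping caps the gradient norm before each optimizer step. The snippet below is only a minimal NumPy sketch of that idea, not DFL's actual implementation, which I have not checked; the function name and max_norm value are my own placeholders.

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=1.0):
    """Scale a list of gradient arrays so their combined L2 norm
    does not exceed max_norm (the general idea behind clipgrad)."""
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if global_norm > max_norm:
        scale = max_norm / (global_norm + 1e-12)
        grads = [g * scale for g in grads]
    return grads

# Example: a huge gradient from one bad batch gets scaled down
# before the weight update, so it cannot blow up the model.
grads = [np.array([300.0, -400.0])]        # norm = 500
clipped = clip_by_global_norm(grads, 1.0)  # norm = 1
print(clipped[0])                          # ~[0.6, -0.8]
```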