Tianming

2 issues by Tianming

[Here](https://github.com/10-zin/Synthesizer/blob/master/synth/synthesizer/layers.py#L31) the Synthesizer decoder uses standard dot-product attention for the encoder-decoder attention. Is that correct?
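For context, the distinction the question turns on can be sketched as follows. This is a minimal NumPy sketch, not the repo's actual code; the function names, shapes, and the single-head/unbatched layout are illustrative. Standard dot-product attention computes its weights from a query-key interaction, while the dense Synthesizer variant synthesizes the attention matrix from each token alone via an MLP:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(q, k, v):
    # Standard scaled dot-product attention: weights depend on q @ k.T,
    # so they are a function of token-token interactions.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # (L_q, L_k)
    return softmax(scores) @ v

def dense_synthesizer_attention(x, w1, b1, w2, b2, v):
    # Dense Synthesizer: the (L, L) attention matrix is produced by a
    # two-layer MLP applied to x alone -- no query-key dot product.
    scores = np.maximum(x @ w1 + b1, 0) @ w2 + b2   # (L, L)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
L, d = 4, 8
x = rng.standard_normal((L, d))
v = rng.standard_normal((L, d))

out_dot = dot_product_attention(x, x, v)

w1 = rng.standard_normal((d, d)); b1 = np.zeros(d)
w2 = rng.standard_normal((d, L)); b2 = np.zeros(L)
out_syn = dense_synthesizer_attention(x, w1, b1, w2, b2, v)

print(out_dot.shape, out_syn.shape)  # both (4, 8)
```

If the linked decoder layer builds its encoder-decoder weights from a `q @ k.T` product as in the first function, then it is indeed plain dot-product attention rather than a synthesized variant.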

Thank you for your nice work! I failed to run the code: there is no implementation of the 'forward_train' function in mgan.py. However, in the original 'mmdetection' toolbox, they all define...
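To illustrate why a missing `forward_train` breaks training: mmdetection-style detectors dispatch `forward()` to `forward_train()` during training, so a subclass that never overrides it fails at the first training step. The sketch below is a hypothetical stand-in for that dispatch pattern, not mmdetection's or the repo's actual code; the class and method bodies are illustrative only:

```python
class BaseDetector:
    """Hypothetical stand-in for an mmdetection-style base detector."""

    def forward(self, img, img_metas, return_loss=True, **kwargs):
        # Training calls route to forward_train; inference to forward_test.
        if return_loss:
            return self.forward_train(img, img_metas, **kwargs)
        return self.forward_test(img, img_metas, **kwargs)

    def forward_train(self, img, img_metas, **kwargs):
        # Subclasses are expected to override this with their loss computation.
        raise NotImplementedError("subclasses must implement forward_train")

    def forward_test(self, img, img_metas, **kwargs):
        return "test output"


class MGAN(BaseDetector):
    # Mirrors the reported issue: no forward_train override, so any
    # training forward pass raises NotImplementedError.
    pass


det = MGAN()
try:
    det.forward(img=None, img_metas=None, return_loss=True)
except NotImplementedError as e:
    print("training fails:", e)
```

This is why inference-only code can appear to work while training crashes immediately.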