Issues of Eitan Richardson (2 results)
Is it intentional that the D module (MappingToLatent) consists of three F.Linear layers without any activations (e.g. no ReLU / Leaky ReLU)? https://github.com/podgorskiy/ALAE/blob/5d8362f3ce468ece4d59982ff531d1b8a19e792d/net.py#L894
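For context on why the question matters: stacked linear layers with no nonlinearity in between compose into a single affine map, so the extra layers add parameters but no representational power. A minimal sketch demonstrating this with plain `nn.Linear` stand-ins (the actual ALAE code uses its own equalized-learning-rate linear layers):

```python
import torch
import torch.nn as nn

# Illustration only (not ALAE's actual code): three Linear layers with no
# activation in between are equivalent to one linear layer whose weight is
# the product of the three weight matrices.
torch.manual_seed(0)
f = nn.Sequential(
    nn.Linear(4, 4, bias=False),
    nn.Linear(4, 4, bias=False),
    nn.Linear(4, 1, bias=False),
)

# Compose the three weight matrices into one equivalent layer.
W = f[2].weight @ f[1].weight @ f[0].weight
g = nn.Linear(4, 1, bias=False)
with torch.no_grad():
    g.weight.copy_(W)

x = torch.randn(2, 4)
print(torch.allclose(f(x), g(x), atol=1e-5))  # True
```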
I've noticed `self.m` is not used. The following changes fixed my problem (setting `self.m` from `H.shape[0]` and using it in `np.eye`):

```python
self.m = H.shape[0]
self.R = np.eye(self.m) if R is None else R
```

Thanks.
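A guess at the surrounding constructor, to show where the two fixed lines sit (the class name and other attributes are hypothetical; only the two fixed lines come from the report):

```python
import numpy as np

class Model:  # hypothetical name; only the two "fix" lines are from the report
    def __init__(self, H, R=None):
        self.H = H
        self.m = H.shape[0]  # fix: derive m from the number of rows of H
        # fix: use self.m so the default R is an identity of matching size
        self.R = np.eye(self.m) if R is None else R

m = Model(np.random.randn(3, 5))
print(m.R.shape)  # (3, 3)
```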