6 comments by key_lw
I guess one reason could be that D was trained better than G: since D performs well while G is under-trained, the G loss climbs.
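A common mitigation for the imbalance described above is to give the generator extra update steps per discriminator step. The sketch below is a generic, minimal GAN loop in PyTorch (the original thread does not specify a framework; all names here are illustrative, not the repo's actual code), with a `G_STEPS` knob controlling how many G updates run per D update:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
real = lambda n: torch.randn(n, 1) * 0.5 + 2.0   # toy "real" data: N(2, 0.5)

# Tiny illustrative models (not from the repo in question)
G = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))
D = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

G_STEPS = 2   # extra generator updates per discriminator update

for step in range(200):
    # One discriminator update: real -> 1, fake (detached) -> 0
    z = torch.randn(32, 1)
    d_loss = (bce(D(real(32)), torch.ones(32, 1))
              + bce(D(G(z).detach()), torch.zeros(32, 1)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Several generator updates so G does not fall behind D
    for _ in range(G_STEPS):
        z = torch.randn(32, 1)
        g_loss = bce(D(G(z)), torch.ones(32, 1))
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()
```

Raising `G_STEPS` (or lowering D's learning rate) is one hedge against a discriminator that overpowers the generator.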
I compiled the modified OpenCV, but it still doesn't work; it reports the same error.
@JEF1056 I tried using LBFGS to optimize the model in your repo, but it keeps outputting something like this and doesn't improve the model. Any idea? 0.0000D+00 0.0000D+00 0.0000D+00...
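If the repo is using PyTorch (an assumption; the thread doesn't say), one frequent cause of LBFGS making no progress is a closure that never calls `backward()`, leaving the optimizer with zero gradients. A minimal sketch of the correct closure pattern on a toy quadratic:

```python
import torch

# Toy problem: pull x toward a fixed target (illustrative, not the repo's model)
x = torch.zeros(3, requires_grad=True)
target = torch.tensor([1.0, 2.0, 3.0])
opt = torch.optim.LBFGS([x], max_iter=20)

def closure():
    # LBFGS re-evaluates the objective several times per step,
    # so the closure must zero grads, recompute the loss, and backprop.
    opt.zero_grad()
    loss = ((x - target) ** 2).sum()
    loss.backward()   # without this, gradients stay zero and x never moves
    return loss

for _ in range(5):
    opt.step(closure)

# x should now be close to target
```

Unlike SGD or Adam, `LBFGS.step` requires this closure argument because it performs multiple internal function evaluations per step.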
@peachis I'm getting the same issue. Have you solved it?
I used exactly the dataset from Baidu Cloud.
Duplicate of #2. It's covered there.