pangafu

Results: 5 comments by pangafu

@Kautenja Did you fix it? I have the same issue.

I think separating the GPU worker count from the batch size might work better for stm komi. Can you implement stm komi in the newest code?

The official branch searches too wide when the batch size is large, and stm komi is not well trained; many positions searched with a low pn produce bad values, so maybe limit the worker...
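
To make the "search too wide" point concrete, here is a small standalone sketch. It is not the project's actual search code; the selection rule, the `VIRTUAL_LOSS` penalty, and the constants are all made up for illustration. It only shows how a larger in-flight count (workers × batchsize) combined with virtual loss spreads root playouts over many moves instead of concentrating them on the best one, which leaves less depth per line.

```python
# Toy sketch (not the project's code): a batched one-level "search" where
# workers * batch_size leaves are selected before any evaluation returns.
import random
from collections import Counter

VIRTUAL_LOSS = 0.1   # penalty per in-flight pick of the same move (made up)
C_EXPLORE = 1.4      # simplified exploration constant (made up)

def root_visit_distribution(total_playouts, workers, batch_size, branching=20):
    """Distribute playouts over the children of a single root node."""
    random.seed(0)
    values = [random.random() for _ in range(branching)]  # toy child values
    visits = Counter()
    done = 0
    while done < total_playouts:
        in_flight = min(workers * batch_size, total_playouts - done)
        picked = Counter()  # leaves selected before any result comes back
        for _ in range(in_flight):
            def score(c):
                n = visits[c] + picked[c]
                return values[c] - VIRTUAL_LOSS * picked[c] + C_EXPLORE / (1 + n)
            best = max(range(branching), key=score)
            picked[best] += 1
        visits.update(picked)
        done += in_flight
    return visits, values

if __name__ == "__main__":
    total = 1600
    for workers, bs in [(1, 1), (2, 8), (8, 32)]:
        visits, values = root_visit_distribution(total, workers, bs)
        top = max(range(len(values)), key=lambda c: values[c])
        share = visits[top] / total
        print(f"workers={workers} batchsize={bs:>2}: "
              f"best-move visits={visits[top]}/{total} ({share:.0%})")
```

With one worker and batchsize 1, nearly all playouts land on the best move; with many in-flight evaluations, the virtual loss forces each batch to fan out across weaker moves, so the best line gets only a small share of the playouts.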

And in my test on patch-39 with the official weights, if I increase the worker number above 2 (such as 3), the GPU usage increases and pos also increases, but it can't...

Also, in my stm komi test, when I use 4 or 8 GPUs with batchsize > 8 in the official branch stm komi code, the handicap capability is lower than with 1 GPU, batchsize...