Karlson Pfannschmidt
Hi @michaelosthege, thank you for the reference. One of my near-future todos is to support parallel evaluation of configurations, which is not trivially implemented using a constant liar strategy....
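For readers unfamiliar with the constant liar strategy mentioned above, here is a minimal, library-agnostic sketch of the idea: to propose a batch of points for parallel evaluation, each proposal is temporarily "told" a fixed lie value so the next proposal avoids it. The `RandomOptimizer` stub and the ask/tell method names are my own illustrative assumptions, not the project's actual API.

```python
import random

class RandomOptimizer:
    """Minimal stand-in exposing an ask/tell interface (illustrative only)."""
    def __init__(self, bounds):
        self.bounds = bounds
        self.observed = []  # (point, value) pairs the optimizer has seen

    def ask(self):
        lo, hi = self.bounds
        return random.uniform(lo, hi)

    def tell(self, x, y):
        self.observed.append((x, y))

def constant_liar_batch(opt, q, lie):
    """Propose q points for parallel evaluation using the constant liar trick."""
    batch = []
    for _ in range(q):
        x = opt.ask()
        opt.tell(x, lie)  # pretend the result is already known ("the lie");
        batch.append(x)   # replaced by the real value once it arrives
    return batch

opt = RandomOptimizer((0.0, 1.0))
batch = constant_liar_batch(opt, q=3, lie=0.0)
```

With a real model-based optimizer, the lie (often the best or worst value observed so far) pushes the acquisition function away from already-proposed points, which is what makes the batch diverse.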
In your case, I would look at quadrature Fourier features to implement the Thompson sampling part: https://papers.nips.cc/paper/8115-efficient-high-dimensional-bayesian-optimization-with-additivity-and-quadrature-fourier-features I found an implementation here: https://github.com/Mojusko/QFF Use it in conjunction with the predictive variance...
I will consider this request together with the frequent requests for multi-GPU parallelization. There should be a way to handle both use cases.
> What would be the most sensible #rounds for, say 60 cores? Presumably some multiple of 60? Great tool, btw

Since each round consists of 2 games (with the same...
Since cutechess-cli is run with random openings, it should not need to wait for a complete batch to finish. Let me know if this is not the case.
I will investigate how difficult this would be to implement. As a first step, it would be useful to make the position-tester installable via pip.
I agree that behavior looks peculiar. A few observations:

- The overall (residual) noise level is quite small.
- The signal variance decreases slightly, but is still quite high.
- ...
Thank you for the experiments; I think we can definitely try to optimize the default parameters. Currently, they appear to favor underfitting models a bit too much.

> But I...
> Thanks, do you suggest some different values or some more parameters than the 0.05-0.3 length-scale prior bounds I tried?

> Do you think inverse gamma distribution is probably better...
Yes, the `d` values are the bounds and the `a` values are the steepness values.
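As an illustration of the inverse-gamma alternative raised in the question above, one common recipe is to pick the distribution's parameters so that a chosen fraction of its mass falls between the length-scale bounds (here the 0.05-0.3 range mentioned earlier). The 1% tail masses and the starting values below are my own assumptions, not values used by the project.

```python
from scipy.optimize import least_squares
from scipy.stats import invgamma

def tail_residuals(params, lo=0.05, hi=0.3, tail=0.01):
    """Residuals that vanish when 1% of prior mass lies below `lo`
    and 1% lies above `hi`, i.e. 98% sits inside the bounds."""
    a, scale = params
    return [invgamma.cdf(lo, a, scale=scale) - tail,
            invgamma.sf(hi, a, scale=scale) - tail]

# Solve the two quantile-matching equations for (shape a, scale),
# with positivity enforced via bounds.
res = least_squares(tail_residuals, x0=[8.0, 1.0],
                    bounds=([1e-3, 1e-3], [100.0, 100.0]))
a, scale = res.x
```

The resulting `invgamma(a, scale=scale)` can then serve as a length-scale prior: unlike hard bounds, it discourages but does not forbid values outside the range, and its right tail shrinks fast enough to penalise the very long length-scales that produce underfitting.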