Stefan Endres
Hello @arsenovic @hugohadfield. We have high-dimensional applications in mind where we only make use of a subspace of grades. For example, the Cl(n) Clifford algebra grows in dimensionality as `O(2^n)`...
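As a quick, library-agnostic illustration of that growth (plain Python, not tied to any particular Clifford algebra package): the full Cl(n) algebra has `2^n` basis blades, while any single grade k contributes only `C(n, k)` of them, which is why restricting to a subspace of grades helps.

```python
from math import comb

# The 2**n basis blades of Cl(n) split across grades k = 0..n,
# with C(n, k) blades of grade k.
for n in (4, 8, 12, 16):
    per_grade = [comb(n, k) for k in range(n + 1)]
    assert sum(per_grade) == 2 ** n
    print(f"n={n:2d}: total blades = {2 ** n:6d}, "
          f"largest single grade = {max(per_grade):5d}")
```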
Thank you for the reply. I made a new issue here: #40.
All sampling strategies should be deterministic; this is a regression that was introduced after switching to QMC. It is partially fixed in https://github.com/scipy/scipy/pull/16313/files#diff-d87298173e00845c0e28dfa30ed635c9d83f174bc2ad35a88ba6af9efcc62922L637, which changes `seed=np.random.RandomState()` to `seed=0`.
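For reference, a minimal sketch of the reproducibility point, using SciPy's public `scipy.stats.qmc.Sobol` engine rather than the exact code path touched in the PR: a fixed integer seed yields identical samples on every run, whereas a freshly constructed `RandomState()` does not.

```python
import numpy as np
from scipy.stats import qmc

# Two engines seeded with the same fixed integer produce identical points;
# seeding with a fresh np.random.RandomState() on each run would not.
sampler_a = qmc.Sobol(d=2, scramble=True, seed=0)
sampler_b = qmc.Sobol(d=2, scramble=True, seed=0)
assert np.allclose(sampler_a.random(8), sampler_b.random(8))
```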
> Then it would be necessary to add `kwargs` to all global optimizers?

At a minimum, the stopping criteria should be added (in the benchmark case at least `f_min`, which is...
@andyfaff I agree with you entirely that hyperparameter tuning should be avoided; however, I don't consider stopping criteria to be hyperparameters, because they do not change the performance of the...
> An algorithm that can go into an infinite loop for particular values of input arguments is a design that's ill-suited for non-expert users.

This should not be the case for...
The module is available under an MIT license, so it might be worth adapting into shgo: https://github.com/erikvanzijst/interruptingcow
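As a rough sketch of how it might be wrapped around an shgo run (the `timeout` context manager follows the interruptingcow README; the 5-second limit and the Rosenbrock objective are placeholders chosen for the example):

```python
from interruptingcow import timeout
from scipy.optimize import shgo, rosen

bounds = [(-5.0, 5.0)] * 4

# Abort the optimisation if it runs longer than 5 seconds. interruptingcow
# is signal-based, so this only works in the main thread on Unix-like systems.
try:
    with timeout(5, exception=RuntimeError):
        result = shgo(rosen, bounds, sampling_method='sobol')
        print(result.x, result.fun)
except RuntimeError:
    print("shgo did not finish within the time limit")
```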
The equality constraints are not used in the global sampling step of shgo; they are passed to the local solver routine (therefore they will only work if the chosen local...
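For illustration, a minimal sketch (toy objective and constraint invented for the example) of forwarding an equality constraint to a local solver that supports it, such as SLSQP:

```python
from scipy.optimize import shgo

def objective(x):
    return x[0] ** 2 + x[1] ** 2

# Equality constraint x0 + x1 = 1. It is ignored during the global sampling
# step and only enforced by the local solver, so a local method that supports
# equality constraints (e.g. SLSQP) must be selected.
cons = ({'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1.0},)

res = shgo(objective,
           bounds=[(-2.0, 2.0), (-2.0, 2.0)],
           constraints=cons,
           minimizer_kwargs={'method': 'SLSQP'})
print(res.x)  # expected to be close to [0.5, 0.5]
```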
At first glance it does appear that shgo is detecting the sub-domains of all (~121?) local minima, but that the [local minimisation](https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html) terminates at the wrong point. For example...
Hi @microprediction @fcela. In the most recent update (7e83bb8) I've added the `workers` argument for `shgo` to allow for basic parallelization. I would greatly appreciate any feedback and/or error reports...
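For anyone trying it out, a minimal sketch, assuming the new `workers` argument follows the usual SciPy map-like convention (an integer number of worker processes); the objective, bounds and sample count here are only placeholders:

```python
from scipy.optimize import rosen
from shgo import shgo

bounds = [(-5.0, 5.0)] * 6

if __name__ == '__main__':
    # Evaluate the sampling-stage objective values over 2 worker processes.
    # The objective must be picklable, e.g. a module-level function like rosen.
    result = shgo(rosen, bounds, n=256, sampling_method='sobol', workers=2)
    print(result.x, result.fun)
```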