Gradient-Free-Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
In the upcoming v1.1 I will release the Spiral Optimization algorithm. An explanation and visualization can be found on [Wikipedia](https://en.wikipedia.org/wiki/Spiral_optimization_algorithm). The current source code of this new algorithm for Gradient-Free-Optimizers...
I looked into a way to add more acquisition functions for the sequence-model-based optimization algorithms. In the current version 1.0, the only acquisition function is the expected improvement. Since...
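For context, the expected-improvement acquisition can be written as a short function of the surrogate's predicted mean and standard deviation. This is only a generic sketch of the formula, not code from the package; the function name, the `xi` exploration parameter, and its default are illustrative:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_score, xi=0.01):
    """Expected improvement for maximization, given the surrogate's
    predicted mean `mu` and standard deviation `sigma` at candidate points."""
    sigma = np.maximum(sigma, 1e-12)  # guard against division by zero
    imp = mu - best_score - xi        # predicted improvement over the best score so far
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)
```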
Multiple users have requested the DIRECT optimizer in this package. This issue will track the progress of its implementation and improvement.
In the upcoming v1.1 I will release the Lipschitz Optimizer. An explanation and visualization can be found on [this blog](http://blog.dlib.net/2017/12/a-global-optimization-algorithm-worth.html). The current source code of this new algorithm for...
**Is your feature request related to a problem? Please describe.** Hi, I really like the look of this library and am interested in using it in the context of optimization...
The number of initial positions should be increased automatically when n_population increases. This would reduce the steps a user needs to take when changing this parameter.
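For illustration, this is roughly the manual step the proposal would remove: today the initial positions have to be raised by hand to follow the population size. The sketch assumes the ParticleSwarmOptimizer with its `population` and `initialize` init parameters; the values and objective are made up:

```python
import numpy as np
from gradient_free_optimizers import ParticleSwarmOptimizer

def parabola(para):
    return -(para["x"] ** 2)

search_space = {"x": np.arange(-10, 10, 0.1)}

n_population = 20  # illustrative value

# Currently the number of initial positions must be adjusted by hand
# to match the population size; the proposal is to couple them automatically.
opt = ParticleSwarmOptimizer(
    search_space,
    population=n_population,
    initialize={"random": n_population},
)
opt.search(parabola, n_iter=100)
```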
There appears to be a bug in the initial sampler of the sequence-model-based optimization algorithms that only occurs in the "test_large_search_space.py" tests. This is the error:

```python
gradient_free_optimizers/search.py:96: in search
    self.search_step(nth_trial)
...
```
As a reference there is [Ant Colony Optimization by Marco Dorigo and Thomas Stützle](https://web2.qatar.cmu.edu/~gdicaro/15382/additional/aco-book.pdf).
In this issue I will show the progress of adding support for continuous parameter ranges in the search-space. For most optimization algorithms it should be easy to add support for...
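Until such support exists, a continuous parameter can only be approximated with a fine-grained discrete range in the dictionary-of-numpy-arrays search space. A minimal sketch of that workaround (the optimizer choice, bounds and step size are illustrative, not from the issue):

```python
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer

def objective(para):
    return -(para["x"] ** 2 + para["y"] ** 2)

# Current workaround: emulate a continuous range with a very small step size.
search_space = {
    "x": np.arange(-5, 5, 0.001),
    "y": np.arange(-5, 5, 0.001),
}

opt = RandomSearchOptimizer(search_space)
opt.search(objective, n_iter=1000)
```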