Lyubov Yamshchikova
Tuner now uses PipelineObjectiveEvaluate to evaluate the metric during hyperparameter optimization. - The tuner uses the same metric function as the composer - ```Pipeline.fine_tune_all_nodes``` is removed. To tune a pipeline, use ```PipelineTuner.tune```....
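A minimal sketch of the new call path, assuming a constructor roughly along these lines; the import path and constructor arguments are assumptions and may differ between versions. Only the replacement of ```fine_tune_all_nodes``` with ```PipelineTuner.tune``` comes from this entry.

```python
# Illustrative only: the import path and constructor arguments below are
# assumptions; check the PipelineTuner docstring for the exact signature.
from fedot.core.pipelines.tuning.unified import PipelineTuner

# Previously: tuned_pipeline = pipeline.fine_tune_all_nodes(...)  # removed
tuner = PipelineTuner(task=task, iterations=50)  # hypothetical arguments
tuned_pipeline = tuner.tune(pipeline)            # `tune` is the entry point per this change
```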
This version allows using both networkx 1 and networkx 2. Most changes are related to formatting. The main change needed for networkx 2 support is: - ```(G.subgraph(c).copy() for c in nx.connected_components(G))```...
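For context, a small self-contained example of the networkx 2 idiom mentioned above; in networkx 1.x the same result came from ```nx.connected_component_subgraphs(G)```, which was removed in later releases.

```python
import networkx as nx

G = nx.Graph([(0, 1), (2, 3)])  # a graph with two connected components

# networkx 1.x: components = nx.connected_component_subgraphs(G)
# networkx 2.x replacement used in this version:
components = (G.subgraph(c).copy() for c in nx.connected_components(G))
for subgraph in components:
    print(sorted(subgraph.nodes()))  # [0, 1] and [2, 3]
```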
Changes: - `SequentialTuner` searches near the initial parameters, just like `SimultaneousTuner` - `SequentialTuner` returns the best graph among those obtained after each iteration of node tuning
- Mutations should be saved as strings to avoid problems with serialising custom mutations - Bandits should store the arm-action mapping accordingly: the key should be a string
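A rough sketch of the idea, not the actual library code: the names (`MUTATIONS`, `arm_to_action`) are hypothetical; the point is that only string keys are serialised, and callables are looked up again afterwards.

```python
import json

def add_node_mutation(graph):  # stand-in for a custom mutation
    return graph

# Hypothetical registry: string key -> mutation callable
MUTATIONS = {'add_node_mutation': add_node_mutation}

# The bandit stores only string keys in its arm-action mapping,
# so its state serialises without pickling custom callables.
arm_to_action = {0: 'add_node_mutation'}
state = json.dumps({'arms': arm_to_action})

# After deserialisation the callable is restored via the registry
# (note: JSON turns the integer arm index into a string key).
restored = json.loads(state)
mutation = MUTATIONS[restored['arms']['0']]
```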
Adds a crossover suitable for graphs with cycles, following the approach from http://alglobus.net/NASAwork/papers/JavaGenes2/JavaGenesPaper.html
Adds a tuner for parameter optimisation based on [Optuna](https://optuna.org/). - Investigate Optuna's capabilities for parallelization and multi-objective optimization. - Compare Optuna with the already implemented Hyperopt and IOpt tuners
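As a reference point for that comparison, plain Optuna usage (not the new tuner itself) already supports both parallel trials and multi-objective studies:

```python
import optuna

def objective(trial):
    # toy objective with one continuous parameter
    x = trial.suggest_float('x', -10.0, 10.0)
    return (x - 2.0) ** 2

study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=50, n_jobs=2)  # thread-parallel trials
print(study.best_params)

# Multi-objective variant: pass several directions and return a tuple of values
# from the objective, e.g. optuna.create_study(directions=['minimize', 'minimize'])
```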
As far as I understand, if the task has no discrete parameters, an error occurs at the stage of outputting the result.