optuna

A hyperparameter optimization framework

Results: 336 optuna issues (sorted by recently updated)

### What is an issue? Hi all, I have checked all the documentation and I cannot find any information about the inner mechanism of multi-objective optimization. I think it would...

document
stale
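While the inner mechanism isn't documented in the place the reporter looked, the core idea of multi-objective optimization is ranking candidate trials by Pareto dominance. A minimal sketch of that dominance rule follows (all objectives minimized); this is an illustration of the concept, not Optuna's internal implementation:

```python
def dominates(a, b):
    """True if point `a` Pareto-dominates point `b`:
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

front = pareto_front([(1, 2), (2, 1), (2, 2)])  # (2, 2) is dominated by (1, 2)
```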

## Motivation As part of #3815. ## Description of the changes Suppressed user warnings for tests involving the `_check_plot_args` function in the `tests/visualization_tests/test_slice.py` file.

test
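The kind of suppression this PR applies can be sketched with the standard library's `warnings` module; `noisy_plot_check` here is a hypothetical stand-in for the code under test, not an Optuna function:

```python
import warnings

def noisy_plot_check():
    # Hypothetical stand-in for a call that emits a UserWarning during tests.
    warnings.warn("axes arguments are deprecated", UserWarning)
    return True

# Silence the expected UserWarning so it doesn't pollute the test output.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", UserWarning)
    ok = noisy_plot_check()
```

In a pytest suite the same effect is often achieved declaratively with a `filterwarnings` mark or ini setting rather than an explicit context manager.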

## Motivation To avoid bugs related to version constraints. ## Description of the changes

stale

## Motivation https://github.com/optuna/optuna/issues/3021#issuecomment-1129545028: remove Python 3.6 tests from the CI of the integrations. Note that we keep using Python 3.6 to test Optuna's core functionality. ## Description of the changes...

CI

## Motivation Relax the version constraint introduced by https://github.com/optuna/optuna/pull/3950 for GPyTorch. BoTorch 0.7.0 was released last week; it starts to support GPyTorch>=1.9.0. However, it also drops Python **3.7** support because GPyTorch...

CI
installation
stale

## Motivation As part of #3815. ## Description of the changes Suppressed `FutureWarning`s for tests involving the `_get_pareto_front_info` function.

test
stale

## Motivation As part of https://github.com/optuna/optuna/issues/3815, this PR removes all expected warning messages and resolves unnecessary, avoidable warnings from the distributed integrations: chainermn and pytorch_distributed. ## Description of the changes...

test

## Motivation Fix #3978. ## Description of the changes In recent versions of pytorch-lightning, `AcceleratorConnector` does not have the attribute `distributed_backend`. Change `distributed_backend` to `_strategy_flag`.

optuna.integration
stale

### Motivation Black-box optimization is inherently heuristic, and in many problems the algorithms don't work well out of the box but need some hyper-hyperparameter tuning. However, Optuna currently doesn't provide...

feature
needs-discussion

### Motivation I have a use case where I call a study from within the optimization function of another study. In the call within the function, I would like to...

feature
needs-discussion
stale
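The nested pattern this feature request describes — launching one optimization from inside another's objective — can be illustrated without Optuna specifics. `minimize` below is a tiny random-search stand-in for creating and running a study (a hypothetical helper, not Optuna's API), used only to show the call structure:

```python
import random

def minimize(objective, n_trials, low=-10.0, high=10.0):
    """Tiny random-search stand-in for a study; returns (best_x, best_value)."""
    best_x, best_val = None, float("inf")
    for _ in range(n_trials):
        x = random.uniform(low, high)
        val = objective(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

def outer_objective(x):
    # An inner search launched from inside the outer objective,
    # mirroring the "study within a study" use case from the issue.
    _, inner_best = minimize(lambda y: (y - x) ** 2, n_trials=20)
    return x ** 2 + inner_best

best_x, best_val = minimize(outer_objective, n_trials=20)
```

With real studies, the feature question is how the inner study's trials, storage, and pruning should interact with the outer one — which is what makes this a discussion item rather than a straightforward addition.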