Algorithms for Optimization - Python
Unofficial Python port of the book "Algorithms for Optimization" (2019, MIT Press) by Mykel J. Kochenderfer and Tim A. Wheeler.
Contents
02. Derivatives in Multiple Dimensions
- Symbolic differentiation
- Numerical differentiation
    - Finite difference methods
    - Complex step method
- Automatic differentiation
    - Forward accumulation
    - Reverse accumulation
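As a taste of the chapter 02 material, the sketch below contrasts a central finite difference with the complex step method on a scalar function. It is a minimal illustration, not the repository's code; the function names and the test function are chosen here purely for demonstration.

```python
import numpy as np

def central_difference(f, x, h=1e-6):
    """Estimate f'(x) with a central finite difference (O(h^2) truncation error)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def complex_step(f, x, h=1e-20):
    """Estimate f'(x) with the complex step method.

    No subtraction of nearly equal values occurs, so h can be made
    extremely small without suffering cancellation error.
    """
    return np.imag(f(x + 1j * h)) / h

f = lambda x: np.sin(x**2) / x          # any analytic scalar function
x0 = np.pi / 2
print(central_difference(f, x0))        # finite difference estimate
print(complex_step(f, x0))              # accurate to near machine precision
```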
03. Bracketing
- Unimodality assumption
- Fibonacci search
- Golden section search
- Quadratic fit search
- Shubert-Piyavskii method
- Bisection method
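For the bracketing chapter, here is a minimal golden section search in the spirit of the book's pseudocode. It assumes f is unimodal on the initial interval; the function name and the example problem are illustrative rather than the repository's own.

```python
import math

def golden_section_search(f, a, b, n=50):
    """Shrink the bracket [a, b] around a local minimum of a unimodal f
    using n evaluations of f."""
    phi = (1 + math.sqrt(5)) / 2
    rho = phi - 1                      # inverse golden ratio, about 0.618
    d = rho * b + (1 - rho) * a
    yd = f(d)
    for _ in range(n - 1):
        c = rho * a + (1 - rho) * b
        yc = f(c)
        if yc < yd:
            b, d, yd = d, c, yc        # keep the left sub-interval
        else:
            a, b = b, c                # keep the right sub-interval
    return (a, b) if a < b else (b, a)

# Example: bracket the minimum of a simple unimodal function on [0, 4]
a, b = golden_section_search(lambda x: (x - 2.0)**2 + 1.0, 0.0, 4.0)
print(a, b)                            # a narrow interval around x = 2
```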
04. Local Descent
- Line search
- Approximate line search
    - First Wolfe condition
    - Second Wolfe condition
    - Strong Wolfe condition
- Trust region method
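Local descent revolves around choosing a step size along a descent direction. Below is a small backtracking line search sketch that enforces the first Wolfe (sufficient decrease) condition; the parameter names alpha, p, and beta follow common convention, and the example problem is made up for illustration.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, d, alpha=1.0, p=0.5, beta=1e-4):
    """Shrink alpha until the first Wolfe (sufficient decrease) condition
    f(x + alpha*d) <= f(x) + beta*alpha*grad_f(x)@d holds."""
    y, g = f(x), grad_f(x)
    while f(x + alpha * d) > y + beta * alpha * (g @ d):
        alpha *= p
    return alpha

# Example: a quadratic bowl, stepping along the negative gradient
f = lambda x: x @ x
grad_f = lambda x: 2 * x
x = np.array([1.0, 2.0])
d = -grad_f(x)                              # descent direction
alpha = backtracking_line_search(f, grad_f, x, d)
print(alpha, x + alpha * d)                 # accepted step and resulting point
```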
05. First-order Methods
- Gradient descent
- Conjugate gradient descent
- Momentum
- Nesterov momentum
- Adagrad
- RMSProp
- Adadelta
- Adam
- Hypergradient descent
- Hypergradient Nesterov momentum
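To give a flavor of the first-order methods, here is a minimal sketch of the Adam update rule applied to a small unconstrained problem. The hyperparameter values and the Rosenbrock test function are illustrative choices, not the repository's defaults.

```python
import numpy as np

def adam(grad_f, x0, alpha=0.01, beta1=0.9, beta2=0.999, eps=1e-8, n_steps=1000):
    """Minimize a differentiable function with Adam: exponentially decaying
    first/second moment estimates of the gradient, bias-corrected and combined
    into an adaptive per-coordinate step."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)                 # first moment (mean of gradients)
    s = np.zeros_like(x)                 # second moment (mean of squared gradients)
    for k in range(1, n_steps + 1):
        g = grad_f(x)
        v = beta1 * v + (1 - beta1) * g
        s = beta2 * s + (1 - beta2) * g**2
        v_hat = v / (1 - beta1**k)       # bias correction
        s_hat = s / (1 - beta2**k)
        x = x - alpha * v_hat / (np.sqrt(s_hat) + eps)
    return x

# Example: the Rosenbrock function with a=1, b=5, starting from (-1, 1)
def rosenbrock_grad(x, a=1.0, b=5.0):
    return np.array([
        -2 * (a - x[0]) - 4 * b * x[0] * (x[1] - x[0]**2),
        2 * b * (x[1] - x[0]**2),
    ])

print(adam(rosenbrock_grad, [-1.0, 1.0], alpha=0.1, n_steps=5000))  # should end near (1, 1)
```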
Author
- Implemented by: Seok-Ju Hahn (vaseline555)
- Email: [email protected]