NumericalOptimization_BasicAlgorithm
Optimization Basic Algorithm
Introduction:
My demo code for convex optimization and numerical optimization algorithms, together with my optimization notes.
Note: the demo code is based on the cvxpy package.
Project structure:
Line Search Methods:
Steepest Descent Method
Newton Method
Quasi-Newton Method
Damped-Newton Method
Conjugate Gradient Method
Matrix Utilities
Large-Scale Unconstrained Optimization:
Inexact Newton method
Calculating Derivatives:
Finite-Difference Derivative Approximations
Automatic Differentiation
Algorithm list
Line Search Methods:
StepLength:
{ Backtracking Line Search } Algorithm: BacktrackingLineSearch.py
{ Interpolation: Quadratic; Cubic} Algorithm: Interpolation.py
{ Zoom} Algorithm: Zoom.py
{ Wolfe Line Search (low-dimensional) } Algorithm: WolfeLineSearch.py
{ Wolfe Line Search (high-dimensional) } Algorithm: WolfeCondition.py
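The step-length routines above all enforce some sufficient-decrease test. As a minimal sketch (the function and parameter names are illustrative, not the repo's actual API), backtracking line search shrinks a trial step until the Armijo condition holds:

```python
def backtracking_line_search(f, grad, x, p, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds:
    f(x + alpha * p) <= f(x) + c * alpha * grad(x)^T p."""
    fx = f(x)
    slope = sum(g * pi for g, pi in zip(grad(x), p))  # directional derivative, < 0 for descent p
    while f([xi + alpha * pi for xi, pi in zip(x, p)]) > fx + c * alpha * slope:
        alpha *= rho
    return alpha
```

The interpolation and zoom routines refine this idea: instead of blindly halving, they fit a quadratic or cubic to known function values to pick the next trial step.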
Steepest Descent:
{ Gradient Descent Method } Algorithm: GradientDescentMethod.py
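The core iteration of gradient descent can be sketched as follows (a fixed step size is assumed here for brevity; the repo's version may use a line search instead):

```python
def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Steepest descent: step along the negative gradient until its norm is below tol."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x
```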
Newton:
{ Newton Method } Algorithm: NewtonMethod.py
{ Cholesky with Added Multiple of the Identity } Algorithm: AddedMultipleOfTheIdentity.py
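A minimal one-dimensional sketch of the Newton iteration (names are illustrative; presumably AddedMultipleOfTheIdentity.py handles the multivariate case where the Hessian is not positive definite by adding tau * I before the Cholesky factorization):

```python
def newton_1d(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method in 1-D: x <- x - grad(x) / hess(x).
    Converges in one step on a quadratic, quadratically near a minimizer."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        x -= g / hess(x)
    return x
```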
Quasi-Newton:
{ DFP Method } Algorithm: DFP.py
{ BFGS Method } Algorithm: BFGS.py
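Both DFP and BFGS avoid computing the true Hessian by updating an approximation from gradient differences. A compact sketch of BFGS with an Armijo backtracking step (an assumed simplification, not the repo's code):

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """BFGS: maintain an inverse-Hessian approximation H via rank-two updates."""
    x = np.asarray(x0, dtype=float)
    I = np.eye(len(x))
    H = I.copy()
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                          # quasi-Newton search direction
        a = 1.0                             # backtracking for the Armijo condition
        while f(x + a * p) > f(x) + 1e-4 * a * (g @ p):
            a *= 0.5
        x_new = x + a * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                   # curvature condition keeps H positive definite
            rho = 1.0 / (s @ y)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

DFP uses the analogous update with the roles of s and y swapped; BFGS is usually preferred because it is more robust to inexact line searches.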
Damped-Newton:
{ Damped Newton Method } Algorithm: DampedNewtonMethod.py
Conjugate Gradient:
{ Conjugate Gradient Preliminary } Algorithm: CG_Preliminary.py
{ Conjugate Gradient } Algorithm: CG.py
{ Preconditioned Conjugate Gradient } Algorithm: Preconditioned_CG.py
{ Fletcher-Reeves Method } Algorithm: FR.py
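The linear CG iteration at the heart of these files can be sketched as follows (illustrative names, not the repo's API); solving A x = b for symmetric positive-definite A is equivalent to minimizing the quadratic (1/2) x^T A x - b^T x:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Linear CG for symmetric positive-definite A; exact in at most n steps."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                      # residual = negative gradient of the quadratic
    p = r.copy()                       # first search direction
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact minimizing step along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if rs_new ** 0.5 < tol:
            break
        p = r + (rs_new / rs) * p      # make the next direction A-conjugate
        rs = rs_new
    return x
```

The preconditioned variant applies the same recursion to a transformed system to improve the eigenvalue distribution; Fletcher-Reeves extends the beta formula (rs_new / rs) to nonlinear objectives.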
MatrixUtil:
{ Cholesky Factorization: LDL^T} Algorithm: Cholesky_LDL.py
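The LDL^T variant of Cholesky factors a symmetric matrix as A = L D L^T with unit lower-triangular L and diagonal D, avoiding square roots. A minimal sketch without pivoting (the real routine may handle indefinite cases differently):

```python
def ldl_factor(A):
    """LDL^T factorization of a symmetric positive-definite matrix (no pivoting)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    d = [0.0] * n
    for j in range(n):
        d[j] = A[j][j] - sum(L[j][k] ** 2 * d[k] for k in range(j))
        L[j][j] = 1.0
        for i in range(j + 1, n):
            L[i][j] = (A[i][j] - sum(L[i][k] * L[j][k] * d[k] for k in range(j))) / d[j]
    return L, d
```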
Large-Scale Unconstrained Optimization:
Inexact Newton:
{ Line Search Newton-CG } Algorithm: LineSearchNewton_CG.py
{ Limited-memory BFGS (L-BFGS) } Algorithm: L_BFGS.py
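L-BFGS never stores the dense inverse-Hessian approximation; it keeps only the m most recent (s, y) pairs and applies the approximation to the gradient with the two-loop recursion. A sketch of that recursion (illustrative names, assuming numpy arrays):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: compute H_k @ g from stored (s, y) pairs
    without ever forming the matrix H_k."""
    q = np.asarray(g, dtype=float).copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q = q - a * y
    # initial scaling H0 = gamma * I (a common choice of starting approximation)
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1]) if s_list else 1.0
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest pair first
        b = (y @ r) / (y @ s)
        r = r + (a - b) * s
    return r   # approx. H_k @ g; the search direction is -r
```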
Calculating Derivatives:
Finite-Difference:
{ Numerical Differentiation } Algorithm: NumericalDifferentiation.py
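A minimal sketch of the central-difference gradient approximation (the name is illustrative); its truncation error is O(h^2), versus O(h) for the one-sided forward difference:

```python
def numerical_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g
```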
Cvx demo:
Demos using the CvxOpt or Cvxpy packages:
{ CvxOpt Solve LP } Demo: CvxOptSolveLPDemo.py
{ Cvxpy Solve LP } Demo: CvxpySolveLPDemo.py
{ Cvxpy Solve NLP } Demo: CvxpySolveNLPDemo.py
References
Jorge Nocedal and Stephen J. Wright:
Numerical Optimization, Second Edition
Stephen Boyd and Lieven Vandenberghe:
Convex Optimization