
Optimization

 
Optimization is a mathematical procedure for finding the extrema of a function defined on a variable/parameter space.
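As a minimal illustration (a sketch in Python/SciPy; the objective f(x) = (x - 2)^2 + 1 is an assumption chosen for its single, known extremum at x = 2):

    from scipy.optimize import minimize_scalar

    # Illustrative objective with a single minimum at x = 2, f(2) = 1.
    def f(x):
        return (x - 2.0) ** 2 + 1.0

    result = minimize_scalar(f)
    print(result.x, result.fun)  # approximately 2.0 and 1.0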
 
When the optimum must satisfy constraints, generally expressed as inequalities, the optimization is said to be constrained.
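For example, a minimal constrained sketch in Python/SciPy (the objective, the bound x >= 3, and the choice of the SLSQP solver are illustrative assumptions; SciPy expects inequality constraints in the form g(x) >= 0):

    from scipy.optimize import minimize

    # Minimize (x - 2)^2 subject to x >= 3: the unconstrained minimum at
    # x = 2 is infeasible, so the constrained optimum sits on the boundary.
    res = minimize(
        lambda x: (x[0] - 2.0) ** 2,
        x0=[0.0],
        method="SLSQP",
        constraints=[{"type": "ineq", "fun": lambda x: x[0] - 3.0}],
    )
    print(res.x)  # approximately [3.0]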
 
When we only look for an optimum in the vicinity of a starting point, by descending or ascending along an identified improvement direction of a nonlinear function (found using heuristics, the gradient, and/or the Hessian), the optimum found may only be local: another region of the parameter space may contain a better, global optimum. Methods that perform such a search are called local optimization methods. They converge rapidly and usually give accurate solutions, but the overall computational effort becomes expensive when many optimizations must be run from different starting points in order to find the global optimum.
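A hand-rolled steepest-descent sketch makes the dependence on the starting point concrete (the function, step size, and tolerance are illustrative assumptions; f(x) = x^4 - 4x^2 + x has a local minimum near x = 1.36 and the global minimum near x = -1.48):

    def f(x):
        return x**4 - 4.0 * x**2 + x

    def grad(x):
        return 4.0 * x**3 - 8.0 * x + 1.0

    def steepest_descent(x0, lr=0.01, tol=1e-8, max_iter=10000):
        # Repeatedly step against the gradient until the step stalls.
        x = x0
        for _ in range(max_iter):
            step = lr * grad(x)
            x -= step
            if abs(step) < tol:
                break
        return x

    print(steepest_descent(2.0))   # lands in the local minimum near  1.36
    print(steepest_descent(-2.0))  # lands in the global minimum near -1.48

Each run converges quickly, but only to the minimum of the basin it starts in, which is why several starting points are needed to reach a global answer.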
 
On the other hand, methods that search for a global optimum are called global optimization methods.
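As a sketch of the simplest such method, pure random search (the bounds, the sample budget, and the reuse of the two-minima function above are illustrative assumptions):

    import numpy as np

    def f(x):
        return x**4 - 4.0 * x**2 + x  # same two-minima function as above

    # Sample the whole parameter space uniformly and keep the best point
    # seen, so no single basin of attraction can trap the search.
    rng = np.random.default_rng(1)
    best_x, best_f = None, np.inf
    for _ in range(10000):
        x = rng.uniform(-3.0, 3.0)
        if f(x) < best_f:
            best_x, best_f = x, f(x)

    print(best_x)  # close to the global minimum near x = -1.48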
 
When the optimization concurrently targets two or more conflicting functions, there is usually not a single solution but a set of optimal solutions representing the best trade-offs. Methods that handle this case are called multi-objective optimization methods (see also: Pareto front).
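A common building block of such methods is non-dominated filtering, sketched below (the two conflicting objectives and the sampling range are illustrative assumptions; both objectives are minimized):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-0.5, 1.5, size=200)

    # Two conflicting objectives: moving x toward 0 improves f1 but
    # worsens f2, and vice versa.
    points = np.column_stack([x**2, (x - 1.0) ** 2])

    def pareto_mask(pts):
        # A point is kept if no other point is at least as good in both
        # objectives and strictly better in at least one.
        keep = np.ones(len(pts), dtype=bool)
        for i, p in enumerate(pts):
            dominates_p = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
            keep[i] = not dominates_p.any()
        return keep

    front = points[pareto_mask(points)]
    print(len(front), "points on the Pareto front")

Samples with x in [0, 1] survive the filter and trace the trade-off curve between the two objectives; samples outside that interval are dominated and discarded.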
 
Popular methods include:
  • Local Optimization methods:
    • Sequential Quadratic Programming (see the MATLAB Optimization Toolbox User's Guide, p. 3-29)
    • Newton’s Method
    • Quasi-Newton Methods
    • Steepest Descent Method
    • Adjoint Method
    • Generalized Reduced Gradient Method
    • Nelder-Mead Downhill Simplex Method (a sketch follows after this list)
    • Coordinate Optimization
    • Rosenbrock Method
  • Global Optimization methods:
    • Genetic Algorithms
    • Simulated Annealing
    • Particle Swarm Optimization
    • Ant Colony Optimization
    • Random Search
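As one concrete instance from the list above, the Nelder-Mead downhill simplex method is available through SciPy; the sketch below assumes an illustrative two-variable quadratic objective and starting point:

    from scipy.optimize import minimize

    # Nelder-Mead needs only function values, no gradients or Hessians.
    res = minimize(
        lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
        x0=[0.0, 0.0],
        method="Nelder-Mead",
    )
    print(res.x)  # approximately [1.0, -2.0]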