Optimization

From DDL Wiki


The term optimization (also written as optimisation in British English) refers to the process of

  • 1. formally defining a design or decision-making problem with a set of decision-variables and mathematical functions that define the problem objective and constraints, and
  • 2. using analytical or numerical methods to solve for values for the decision-variables that maximize (or minimize) the objective function subject to the constraints.


Most optimization texts focus on methods for solving the mathematical problem, since the formulation process is case-specific and since good formulation depends on strong knowledge of mathematical properties and solution methods. In practice, translating a practical decision-making problem into mathematical form almost always requires simplifying assumptions, and the numerical methods used to solve the resulting problem often guarantee optimality only under specific conditions. The processes of modeling and solving optimization problems are therefore interlinked, and care must be taken when applying optimization methods and interpreting their results.


Formulation

The major elements of an optimization problem are:

  • Objective: The goal to be minimized or maximized
  • Constraints: Conditions that must be satisfied
  • Variables: The quantities to be decided during optimization


The general optimization problem is written as: "minimize the objective, with respect to the decision variables, subject to the constraints." In mathematical form, using the wiki's notation conventions:


minimize f(\mathbf{x})
with respect to \mathbf{x}
subject to \mathbf{g(x) \leq 0}
\mathbf{h(x)=0}
\mathbf{x}\in\mathcal{X}


where \mathbf{x} is a vector of decision-variables, f(\mathbf{x}) is the objective function, \mathbf{g(x)} is the vector of inequality constraints, \mathbf{h(x)} is the vector of equality constraints, and \mathcal{X} is the set constraint defining the domain of \mathbf{x}.
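As a concrete instance of this form, the following sketch assumes SciPy is available; the objective and constraints are made up for illustration and are not from any particular application:

```python
# Minimize f(x) = (x1 - 1)^2 + (x2 - 2)^2
# subject to g(x) = x1 + x2 - 2 <= 0   (inequality constraint)
# and        h(x) = x1 - x2     =  0   (equality constraint)
from scipy.optimize import minimize

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

constraints = [
    # SciPy uses the convention fun(x) >= 0, so g(x) <= 0 is negated.
    {"type": "ineq", "fun": lambda x: -(x[0] + x[1] - 2.0)},
    {"type": "eq", "fun": lambda x: x[0] - x[1]},
]

result = minimize(f, x0=[0.0, 0.0], constraints=constraints)
print(result.x)  # optimal decision variables
```

Note the sign flip on the inequality: the wiki's convention is \mathbf{g(x) \leq 0}, while SciPy expects constraint functions to be nonnegative at feasible points.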

Application Domains

Optimization is applied in many domains, since the goal of optimization is applicable to nearly any field where decisions are made, and since mathematical modeling has become a popular approach in many fields. The application areas most relevant to DDWiki are:

  • Design optimization: Optimization applied to the design of a product or process
  • Operations research: Optimization applied to operations decisions in an organization, such as scheduling, supply chain management, etc.


Problem Classification

Theory and numerical methods used for solving mathematical optimization problems vary widely based on the properties of the optimization problem. If the functions in a particular optimization problem have special properties, then special theories and numerical methods exist to find solutions. Entire books are devoted to theory and methods for each of these cases:


The general problem

\underset{\mathbf{x}}{\mathrm{minimize}}\ f(\mathbf{x})
\mathrm{s.t.}\ \mathbf{g(x) \leq 0}
\mathbf{h(x)=0}
\mathbf{x}\in\mathcal{X}

is classified by the variable domain (columns) and the functional form (rows):

Functions \ Variables | Continuous: \mathcal{X}=\mathbb{R}^n | Integer (IP): \mathcal{X}=\mathbb{Z}^m | Mixed (MIP): \mathcal{X}=\mathbb{R}^{n}\times\mathbb{Z}^{m}
Linear (f(\mathbf{x}), \mathbf{g(x)}, and \mathbf{h(x)} are linear functions) | LP: linear programming | ILP: integer linear programming | MILP: mixed-integer linear programming
Convex Nonlinear (f(\mathbf{x}) and \mathbf{g(x)} are convex functions and \mathbf{h(x)} are linear functions) | Convex NLP: convex nonlinear programming | Convex INLP: convex integer nonlinear programming | Convex MINLP: convex mixed-integer nonlinear programming
Nonlinear, General (f(\mathbf{x}), \mathbf{g(x)}, and \mathbf{h(x)} are general functions) | NLP: nonlinear programming | INLP: integer nonlinear programming | MINLP: mixed-integer nonlinear programming

Variable Domain

Continuous variables can take values on the real number line. Examples include the dimensions of a part, temperature, voltage, etc. These formulations are typically easier to solve because optimality conditions are satisfied only at certain special points in the space, and function gradients can be used to search for such points. In contrast, integer variables are restricted to a discrete set of values. Examples include binary {0,1} variables that represent on/off decisions (e.g., whether or not to include a particular feature in the design) or dimensions that must be selected from a standard set of sizes (e.g., standard bar stock diameters). These problems are typically more difficult to solve because the integer restrictions create a combinatorial domain (many combinations to search, rather than a smooth multidimensional function). Mixed problems include both types of variables.
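The combinatorial character of integer variables can be illustrated with a toy feature-selection (knapsack-style) sketch; all numbers below are invented for the example:

```python
# Choosing which of n features to include means searching 2**n on/off
# combinations. Brute-force enumeration works only for tiny n.
from itertools import product

values = [6, 5, 4, 3]   # benefit of each candidate feature (made up)
costs = [5, 4, 3, 2]    # cost of each candidate feature (made up)
budget = 8

best_value, best_choice = 0, None
for choice in product([0, 1], repeat=len(values)):  # 2**4 = 16 combinations
    cost = sum(c * x for c, x in zip(costs, choice))
    value = sum(v * x for v, x in zip(values, choice))
    if cost <= budget and value > best_value:
        best_value, best_choice = value, choice
print(best_choice, best_value)  # (1, 0, 1, 0) 10
```

With 4 binary variables there are 16 combinations; with 40 there would be over a trillion, which is why branch-and-bound style methods, rather than enumeration, are used in practice.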

Functional Form

Functional forms make a big difference in what is known theoretically about the problem and how efficiently it can be solved. Linear formulations are common in operations research applications, and such problems commonly have thousands of variables and constraints. If a problem is linear, there are many special properties that can be exploited.
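As a small illustration of the linear case, here is a sketch of an LP solved with SciPy's linprog; the coefficients are invented for the example:

```python
# Maximize 3x + 2y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
# linprog minimizes by convention, so the objective coefficients are negated.
from scipy.optimize import linprog

res = linprog(
    c=[-3, -2],                     # negated objective (maximize 3x + 2y)
    A_ub=[[1, 1], [1, 3]],          # inequality constraint coefficients
    b_ub=[4, 6],                    # inequality right-hand sides
    bounds=[(0, None), (0, None)],  # x, y >= 0
)
print(res.x, -res.fun)  # optimal point and maximized objective value
```

The optimum of an LP always lies at a vertex of the feasible polytope, which is the special structure that simplex-type methods exploit.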

Convex nonlinear formulations have the special property that they contain only one local minimum (i.e., a point that is better than all of its immediate neighbors), which is also the global minimum (i.e., the point that is better than all other points in the entire space). Because of this, algorithms that use function gradients to move downhill toward a local solution will eventually arrive at the global solution.
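This can be sketched with plain gradient descent on a made-up one-dimensional convex function, where any starting point leads to the same (global) minimum:

```python
# f(x) = (x - 3)^2 is convex; its single local minimum at x = 3 is also
# the global minimum, so downhill steps from anywhere converge to it.
def grad(x):
    return 2.0 * (x - 3.0)  # derivative of (x - 3)^2

x = -10.0     # any starting point works for a convex problem
step = 0.1    # fixed step size (illustrative, not tuned)
for _ in range(200):
    x -= step * grad(x)   # move downhill along the negative gradient
print(x)  # ~ 3.0
```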

If the problem is nonconvex, however, this property may not hold, and a simple downhill-moving algorithm will reach a local minimum but not necessarily the global minimum. In this case, the particular local minimum that a gradient-based algorithm finds typically depends on the starting point (the initial guess \mathbf{x}_0). A simple remedy is multistart, in which the algorithm is run from multiple starting points to find multiple local minima in search of the global minimum.
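A minimal multistart sketch, using SciPy's local minimize from random starting points on an invented nonconvex function with two local minima:

```python
# f has a shallow local minimum near x = 1.36 and the global minimum
# near x = -1.47; which one a local solver finds depends on x0.
import random
from scipy.optimize import minimize

def f(x):
    return x[0] ** 4 - 4 * x[0] ** 2 + x[0]

random.seed(0)
best = None
for _ in range(20):                      # 20 random restarts
    x0 = random.uniform(-3.0, 3.0)       # random starting point
    res = minimize(f, [x0])              # local gradient-based search
    if best is None or res.fun < best.fun:
        best = res                       # keep the best local minimum found
print(best.x, best.fun)  # global minimum near x = -1.47
```

Multistart gives no guarantee of finding the global minimum, but with enough well-spread starting points the chance of missing a basin shrinks.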

Alternatively, methods such as branch and bound, or its variants, may be used to search for the global minimum by generating and tightening upper and lower bounds on the solution. Stochastic optimization methods, such as genetic algorithms or simulated annealing, can also be used. These methods use random number generators or probabilistic operators in a way that tends to push the solution toward global optima. Typically, stochastic algorithms provide no proof of optimality, but they can offer practical engineering approaches to finding good solutions to difficult problems.
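As one concrete stochastic option, SciPy's dual_annealing (a simulated-annealing variant) can be sketched on a nonconvex function with two basins; the function is illustrative, not from the article:

```python
# dual_annealing combines a simulated-annealing-style global search with
# local refinement, escaping the shallow basin near x = 1.36 to reach the
# global minimum near x = -1.47.
from scipy.optimize import dual_annealing

def f(x):
    return x[0] ** 4 - 4 * x[0] ** 2 + x[0]

res = dual_annealing(f, bounds=[(-3.0, 3.0)], seed=1)
print(res.x, res.fun)  # global minimum near x = -1.47
```

Consistent with the text, the method offers no optimality certificate; the seed is fixed only to make the illustration repeatable.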

Optimization Software

The table below lists optimization solvers by package and identifies the capabilities of each. More detail can be found on the pages for the corresponding problem class (e.g., MINLP). Additionally, the NEOS Server website provides information about a range of solvers classified by problem type.

Legend: C = convex, M = mixed, I = integer, N = non(linear), L = linear, P = programming; g = constraint handling.

Solver | Environment | Algorithm | Capabilities (X marks among: LP, ILP, MILP, Convex NLP, Convex INLP, Convex MINLP, NLP, INLP, MINLP, g) | Author | Notes
BARON | GAMS, AMPL | Branch and reduce; calls other convex solvers | X X X X | Nikolaos Sahinidis | Requires algebraic functions. Poor performance on equality constraints. Scaling nonconvex terms is critical.
CONOPT | AIMMS, AMPL, GAMS, Matlab | GRG | X X X | ARKI | GRG algorithm with new extensions
CPLEX | AIMMS, AMPL, GAMS | Simplex & interior point method | X X X X | ILOG |
DICOPT | GAMS | Branch and bound; calls other NLP/MILP solvers | X X | Ignacio Grossmann |
DONLP2 | AMPL, Fortran, C | SQP | X X | Peter Spellucci |
Excel Solver (basic) | Excel | GRG, branch and bound | X X X X X X X | Frontline Systems |
fmincon | Matlab | SQP | X X X | MathWorks |
fminunc | Matlab | SQP | X X | MathWorks |
fminsearch | Matlab | Downhill simplex | X | MathWorks | Derivative-free Nelder-Mead method for unconstrained problems
linprog | Matlab | Simplex and interior point method | X X | MathWorks | Interior point method is used for large-scale problems
MINOS | AMPL, Fortran, GAMS | GRG | X X X | B. Murtagh, M. Saunders | Large-scale solver
SBB | GAMS | GRG & branch and bound | X X | ARKI |
SNOPT | AMPL, Fortran, GAMS, Matlab | SQP | X X X X | P. Gill, W. Murray, M. Saunders | Large-scale solver
LGO | C/C++/C#, Fortran, GAMS, Matlab | LGO | X X X X X X X | Prof. Janos Pinter | Global optimization solver