1.1 Preliminary Definitions
Maximization problems

max_{x ∈ S} f(x)

can be reduced to minimization problems: their optimal solutions coincide with the optimal solutions of the minimization problem

min_{x ∈ S} -f(x)

and we have: max_{x ∈ S} f(x) = -min_{x ∈ S} (-f(x)).
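As a small numerical illustration, this identity can be checked on a finite sample of feasible points; the objective f and the sample of S used below are arbitrary illustrative choices (Python):

f = lambda x: 1.0 - (x - 1.0) ** 2        # objective to be maximized
S = [k / 100.0 for k in range(0, 201)]    # finite sample of S = [0, 2]

max_f = max(f(x) for x in S)              # value of the maximization problem
neg_min_neg_f = -min(-f(x) for x in S)    # minus the minimum value of -f

print(max_f, neg_min_neg_f)               # both equal 1.0, attained at x = 1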
The feasible set S is a subset of R^n, hence x = (x1, x2, ..., xn)^T is the vector of variables of dimension n and f is a real-valued function of the n variables, f(x1, x2, ..., xn).
Definition 1.3 (Unbounded Problem) The minimization problem is unbounded below if for any value M > 0 there exists a point x ∈ S such that f(x) < -M.
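For instance, the problem of minimizing f(x) = x over S = R is unbounded below: for any M > 0 the point x = -(M + 1) ∈ S satisfies f(x) = -(M + 1) < -M.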
Definition 1.5 (Local minimizer) A point x ∈ S is a local minimizer if there exists a neighborhood N(x, ε) of x such that f(x) ≤ f(y) for every y ∈ N(x, ε) ∩ S.
Of course any global minimizer is also a local one, but not vice versa.
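For instance, for f(x) = 2x^3 - 3x^2 on S = R the point x = 1 is a local minimizer (f(1) = -1 ≤ f(y) for all y close to 1) but not a global one, since f is unbounded below.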
It can also happen that the objective function is bounded below on S, namely that inf_{x ∈ S} f(x) > -∞, and yet no minimizer exists because the infimum is not attained by any feasible point (for instance f(x) = e^x on S = R is bounded below by 0, but no point attains that value).
Given an optimization problem, one must therefore:
- verify if an optimal solution exists or prove that the problem does not admit solutions;
Optimization problems can be classified according to the nature of the variables.
Continuous Optimization.
The variables x can take values in R^n (continuous values); we can further distinguish between:
unconstrained problems, if S = R^n;
constrained problems, if S ≠ R^n.
Discrete Optimization.
The variables x can take values only in a discrete set; we can further distinguish between:
integer programming, if S ⊆ Z^n;
boolean optimization, if S ⊆ {0, 1}^n.
Mixed problems.
Some of the variables are continuous and some are discrete.
The feasible set is usually expressed by a finite number of equality or inequality relations. Formally, given functions gi : R^n → R, i = 1, ..., m, a feasible set defined by inequality constraints is

S = {x ∈ R^n | g1(x) ≤ 0, g2(x) ≤ 0, ..., gm(x) ≤ 0}.
Each inequality gi(x) ≤ 0 is called a constraint and the feasible set is made up of the points solving the system of nonlinear inequalities

g1(x) ≤ 0
g2(x) ≤ 0
g3(x) ≤ 0
...
gm(x) ≤ 0
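Checking whether a given point belongs to such a feasible set amounts to evaluating all the constraint functions at that point; the following sketch does this in Python for two arbitrary illustrative constraints (a disc and a half-plane):

def g1(x):
    return x[0] ** 2 + x[1] ** 2 - 1.0   # x1^2 + x2^2 <= 1 (unit disc)

def g2(x):
    return -x[0]                         # x1 >= 0 written as -x1 <= 0

def is_feasible(x, constraints, tol=1e-9):
    # x is feasible when every constraint value g_i(x) is <= 0 (up to a tolerance)
    return all(g(x) <= tol for g in constraints)

print(is_feasible([0.5, 0.5], [g1, g2]))   # True: inside the disc and x1 >= 0
print(is_feasible([1.0, 1.0], [g1, g2]))   # False: outside the unit disc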
Please note that any constraint of the form g(x) ≥ 0 can be brought into the form above by simply multiplying by minus one, namely -g(x) ≤ 0. Furthermore, an equality constraint h(x) = 0 can
also be transformed into the two inequality constraints h(x) ≤ 0 and -h(x) ≤ 0. However, in some cases equality constraints can be treated explicitly, so that we consider a generic problem of the form
min f(x)
gi(x) ≤ 0,   i = 1, ..., m     (1.2)
hj(x) = 0,   j = 1, ..., p.
or, in more compact form, as

min f(x)
g(x) ≤ 0,     (1.3)
h(x) = 0
where g : R^n → R^m and h : R^n → R^p.
The optimization problem is called a linear program or Linear Programming problem (LP) if the objective and constraint functions f, g1, ..., gm, h1, ..., hp are linear, i.e., satisfy the condition

f(αx + βy) = α f(x) + β f(y)

for all x, y ∈ R^n and all α, β ∈ R.
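For instance, any function of the form f(x) = c1 x1 + ... + cn xn = c^T x, with c ∈ R^n a fixed vector of coefficients, is linear in this sense, since

f(αx + βy) = c^T (αx + βy) = α c^T x + β c^T y = α f(x) + β f(y).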
A linear programming problem can thus be written in the general form

min c1 x1 + c2 x2 + ... + cn xn
ai1 x1 + ... + ain xn (≤ / ≥ / =) bi,   i = 1, ..., m.     (1.4)
If the optimization problem is not linear (namely, at least one of the functions is not linear), it is called a nonlinear program or Nonlinear Programming problem (NLP).
Some examples of mathematical programming problems follow.
Example 1.7 Consider the minimization of the function in two variables f(x1, x2) = 2x1 + x2 under the constraints x1 + x2 ≤ 1, x1 ≥ 0, x2 ≥ 0. The optimization problem is

min 2x1 + x2
x1 + x2 ≤ 1
x1 ≥ 0
x2 ≥ 0

which is in the form (1.2) where g1(x1, x2) = x1 + x2 - 1, g2(x1, x2) = -x1, g3(x1, x2) = -x2. This is a Linear Programming problem.
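As a side illustration, this small LP can be checked numerically; the sketch below assumes the constraint reading above and uses SciPy's linprog as one possible solver (Python):

from scipy.optimize import linprog

c = [2.0, 1.0]                        # coefficients of the objective 2*x1 + x2
A_ub = [[1.0, 1.0]]                   # x1 + x2 <= 1
b_ub = [1.0]
bounds = [(0.0, None), (0.0, None)]   # x1 >= 0, x2 >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, res.fun)                 # optimal point [0, 0] with value 0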
Example 1.8 Let us consider the minimization of the function f(x1, x2) = (x1 - 1/2)^2 + (x2 - 1/2)^2 subject to the constraints x1 + x2 ≥ 1, x1 ≤ 1, x2 ≤ 1. We get the nonlinear programming problem

min (x1 - 1/2)^2 + (x2 - 1/2)^2
x1 + x2 ≥ 1
x1 ≤ 1
x2 ≤ 1
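A numerical check of this NLP can be sketched in the same spirit; the code below assumes the constraint reading above and uses SciPy's general-purpose minimize routine with the SLSQP method (Python):

from scipy.optimize import minimize

f = lambda x: (x[0] - 0.5) ** 2 + (x[1] - 0.5) ** 2

# SciPy expects inequality constraints in the form fun(x) >= 0.
cons = [
    {"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0},   # x1 + x2 >= 1
    {"type": "ineq", "fun": lambda x: 1.0 - x[0]},          # x1 <= 1
    {"type": "ineq", "fun": lambda x: 1.0 - x[1]},          # x2 <= 1
]

res = minimize(f, x0=[1.0, 1.0], constraints=cons, method="SLSQP")
print(res.x, res.fun)   # approximately [0.5, 0.5] and 0: under this reading the
                        # unconstrained minimizer (1/2, 1/2) is already feasible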