OPTIMIZATION TECHNIQUES
By: Dhruv Bansal
21U02049
Definition:
Optimization is the act of achieving the best possible result under given circumstances.
The primary objective may not be to optimize absolutely but to compromise effectively, and thereby produce the best formulation under a given set of restrictions.
Why is optimization necessary?
• Reduce the cost
• Save time
• Safety & reduce the error
• Reproducibility
• Innovation & efficiency
Historical development
Isaac Newton (1642-1727): The development of differential calculus methods of optimization.
Joseph-Louis Lagrange (1736-1813): Calculus of variations, minimization of functionals, method of optimization for constrained problems.
Augustin-Louis Cauchy (1789-1857): Solution by direct substitution, steepest descent method for unconstrained optimization.
George Bernard Dantzig (1914-2005): Linear programming and the simplex method (1947).
Albert William Tucker (1905-1995): Necessary and sufficient conditions for the optimal solution of programming problems, nonlinear programming.
OPTIMIZATION PARAMETERS
Objective function
An objective function expresses the main aim of the model, which is either to be minimized or maximized.
For example, in a manufacturing process the aim may be to maximize the profit or minimize the cost.
The two exceptions are:
• No objective function
• Multiple objective functions.
Variables
A set of unknowns or variables controls the value of the objective function.
The general problem is to find X = {x1, x2, . . . , xn}T which minimizes f(X)
subject to the constraints
gi(X) = 0, i = 1, 2, . . . , m
Here m ≤ n; otherwise (if m > n) the problem becomes overdefined and, in general, there will be no solution.
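A concrete instance of this standard form can be written directly in code (the particular f and g1 below are illustrative choices, not from the slides): n = 2 design variables and m = 1 equality constraint, so m ≤ n holds.

```python
# A tiny instance of the standard form: minimize f(X) subject to gi(X) = 0.

def f(X):
    """Objective function to be minimized: f(X) = x1^2 + x2^2."""
    x1, x2 = X
    return x1**2 + x2**2

def g1(X):
    """Equality constraint g1(X) = 0, here x1 + x2 - 1 = 0."""
    x1, x2 = X
    return x1 + x2 - 1

X = [0.5, 0.5]            # a feasible point: g1(X) == 0
print(f(X), g1(X))        # 0.5 0.0
```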
There are several methods available for the solution of this problem:
1. Direct substitution
2. Constrained variation
3. Lagrange multipliers
Solution by Direct Substitution
For a problem with n variables and m equality constraints, it is theoretically possible to solve the m equality constraints simultaneously and express any set of m variables in terms of the remaining n − m variables.
Substituting these expressions into f, a new unconstrained objective function in the n − m remaining variables is obtained.
Drawbacks
The constraint equations will be nonlinear for most practical problems.
It often becomes impossible to solve them and express any m variables in terms of the remaining n − m variables.
By the Method of Constrained Variation
The basic idea used in the method of constrained
variation is to find a closed-form expression for
the first-order differential of f (df) at all points at
which the constraints gj (X) = 0, j = 1, 2, . . . , m,
are satisfied.
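For the simplest case (n = 2 variables, one constraint g(x1, x2) = 0), the idea can be written out as a short derivation:

```latex
% Require df = 0 for every variation (dx_1, dx_2) that keeps g = 0:
df = \frac{\partial f}{\partial x_1}\,dx_1 + \frac{\partial f}{\partial x_2}\,dx_2 = 0,
\qquad
dg = \frac{\partial g}{\partial x_1}\,dx_1 + \frac{\partial g}{\partial x_2}\,dx_2 = 0
% Eliminating dx_2 between the two gives the necessary condition
\frac{\partial f}{\partial x_1}\frac{\partial g}{\partial x_2}
- \frac{\partial f}{\partial x_2}\frac{\partial g}{\partial x_1} = 0
```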
Drawback
The method becomes prohibitively cumbersome for problems with more than three constraints.
By the Method of Lagrange Multipliers
For instance, consider the optimization problem
maximize f(x1, x2)
subject to g(x1, x2) = c.
We introduce a new variable λ, called a Lagrange multiplier, and define the Lagrange function
L(x1, x2, λ) = f(x1, x2) + λ(g(x1, x2) − c)
Treating L as a function of the three variables x1, x2, and λ, the necessary conditions for its extremum are
∂L/∂x1 (x1, x2, λ) = ∂f/∂x1 (x1, x2) + λ ∂g/∂x1 (x1, x2) = 0
∂L/∂x2 (x1, x2, λ) = ∂f/∂x2 (x1, x2) + λ ∂g/∂x2 (x1, x2) = 0
∂L/∂λ (x1, x2, λ) = g(x1, x2) − c = 0
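These conditions can be solved directly for small problems. The sketch below uses an illustrative instance not taken from the slides: maximize f(x1, x2) = x1·x2 subject to x1 + x2 = 10. The resulting stationarity conditions happen to be linear, so a small Gaussian elimination suffices.

```python
# Lagrange conditions for: maximize x1*x2 subject to x1 + x2 = 10.
# With L = x1*x2 + lam*(x1 + x2 - 10), the necessary conditions are
#   dL/dx1  = x2 + lam        = 0
#   dL/dx2  = x1 + lam        = 0
#   dL/dlam = x1 + x2 - 10    = 0
# which is a linear system A @ [x1, x2, lam] = b.

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    A = [row[:] for row in A]
    b = b[:]
    n = 3
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x

A = [[0.0, 1.0, 1.0],   # x2 + lam = 0
     [1.0, 0.0, 1.0],   # x1 + lam = 0
     [1.0, 1.0, 0.0]]   # x1 + x2 = 10
b = [0.0, 0.0, 10.0]
x1, x2, lam = solve3(A, b)
print(x1, x2, lam)       # 5.0 5.0 -5.0
```

The optimum x1 = x2 = 5 agrees with the symmetric answer one would expect for maximizing a product under a fixed sum.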
MULTIVARIABLE OPTIMIZATION WITH INEQUALITY
CONSTRAINTS
The inequality constraints gj(X) ≤ 0 can be transformed into equality constraints by adding nonnegative slack variables yj^2, as
gj(X) + yj^2 = 0, j = 1, 2, . . . , m
where the values of the slack variables are yet unknown. The constraints become
Gj(X, Y) = gj(X) + yj^2 = 0, j = 1, 2, . . . , m
where Y = {y1, y2, . . . , ym}T is the vector of slack variables.
This problem can be solved conveniently by the method
of Lagrange multipliers.
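A minimal worked sketch of this transformation, on an illustrative problem not from the slides: minimize f(x) = x² subject to x ≥ 1, i.e. g(x) = 1 − x ≤ 0. The slack variable turns the inequality into 1 − x + y² = 0, and the Lagrange conditions split into two cases (constraint active or inactive).

```python
# L = x^2 + lam*(1 - x + y^2) gives the necessary conditions
#   dL/dx   = 2x - lam    = 0
#   dL/dy   = 2*lam*y     = 0   (so lam = 0 or y = 0)
#   dL/dlam = 1 - x + y^2 = 0

def candidates():
    sols = []
    # Case 1: y = 0 (constraint active). Then x = 1 and lam = 2x = 2.
    x, y = 1.0, 0.0
    lam = 2 * x
    if abs(1 - x + y**2) < 1e-12:    # feasibility check
        sols.append((x, y, lam))
    # Case 2: lam = 0 (constraint inactive). Then x = 0, but
    # 1 - x + y^2 = 1 + y^2 > 0 for real y, so this case is infeasible.
    return sols

(x, y, lam), = candidates()
print(x, lam)   # 1.0 2.0
```

Only the active-constraint case survives, so the minimizer is x* = 1, as expected for x² restricted to x ≥ 1.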
CONVEX PROGRAMMING PROBLEM
The optimization problem with inequality constraints is called a convex programming problem if the objective function f(X) and the constraint functions gj(X) are convex.
A function of a single variable is convex if its slope is nondecreasing, i.e. ∂2f/∂x2 ≥ 0. It is strictly convex if its slope is strictly increasing, i.e. ∂2f/∂x2 > 0 throughout the function.
Concave function
A differentiable function f is concave on an interval if its
derivative function f ′ is decreasing on that interval: a
concave function has a decreasing slope.
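Convexity can also be spot-checked numerically from its definition rather than from derivatives: f is convex iff f(t·a + (1−t)·b) ≤ t·f(a) + (1−t)·f(b) for all t in [0, 1]. The sample points and test functions below are illustrative choices, not from the slides.

```python
# Numerical spot-check of the convexity inequality on sample point pairs.

def looks_convex(f, points, steps=50):
    """Return False as soon as the convexity inequality is violated."""
    for a in points:
        for b in points:
            for k in range(steps + 1):
                t = k / steps
                if f(t * a + (1 - t) * b) > t * f(a) + (1 - t) * f(b) + 1e-12:
                    return False
    return True

pts = [-2.0, -0.5, 0.0, 1.0, 3.0]
print(looks_convex(lambda x: x * x, pts))    # True  (f'' = 2 >= 0)
print(looks_convex(lambda x: -x * x, pts))   # False (concave, not convex)
```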
Thank You