
ECEG-6311
Power System Optimization and AI
Lecture 3: Classical Optimization Techniques (Contd.)
Yoseph Mekonnen (Ph.D.)

Page 1
Outline
Multivariable Optimization with Equality Constraints
Direct substitution
Constrained variation
Lagrange multipliers

Page 2
…Contd..
Necessary Condition
If f(X) has an extreme point (maximum or minimum) at X = X∗ and if the first partial derivatives of f(X) exist at X∗, then

∂f/∂x1 = ∂f/∂x2 = · · · = ∂f/∂xn = 0 at X = X∗.
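
As a quick illustration, a minimal SymPy sketch that finds stationary points by solving these first-order conditions; the function used here is an assumed example, not one taken from the slides:

import sympy as sp

# Assumed example function (not from the slides): f(x1, x2) = x1**2 + x2**2 - 4*x1 - 6*x2
x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + x2**2 - 4*x1 - 6*x2

# Necessary condition: every first partial derivative of f vanishes at X*
gradient = [sp.diff(f, v) for v in (x1, x2)]
print(sp.solve(gradient, (x1, x2), dict=True))   # [{x1: 2, x2: 3}]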

Sufficient Condition
A sufficient condition for a stationary point X∗ to be an
extreme point is that the matrix of second partial
derivatives (Hessian matrix) of f(X) evaluated at X∗ is
(i) positive definite when X∗ is a relative minimum point
(ii) negative definite when X∗ is a relative maximum point.

Page 3
..Contd..
A matrix A will be positive definite if all its eigenvalues are positive; that is, all the values of λ that satisfy the determinantal equation det(A − λI) = 0 should be positive. Similarly, the matrix A will be negative definite if all its eigenvalues are negative.
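
A minimal NumPy sketch of this eigenvalue test, run on an arbitrary symmetric matrix chosen only for illustration (an assumption, not a matrix from the slides):

import numpy as np

# Arbitrary symmetric test matrix (for illustration only)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Eigenvalue test: positive definite iff every eigenvalue > 0,
# negative definite iff every eigenvalue < 0
eigenvalues = np.linalg.eigvalsh(A)              # eigvalsh handles symmetric matrices
print(eigenvalues)                               # approximately [2.38, 4.62]
print("positive definite:", bool(np.all(eigenvalues > 0)))
print("negative definite:", bool(np.all(eigenvalues < 0)))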

Another test that can be used to find the positive definiteness of a matrix A of order n involves evaluation of the determinants A1, A2, . . . , An, where Aj denotes the determinant of the submatrix formed by the first j rows and columns of A (the j-th leading principal minor).

Page 4
..Contd..
The matrix A will be positive definite if and only if all the
values A1, A2, A3, . . . , An are positive.
The matrix A will be negative definite if and only if the
sign of Aj is (−1)^j for j = 1, 2, . . . , n.
If some of the Aj are positive and the remaining Aj are
zero, the matrix A will be positive semidefinite.
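
A corresponding NumPy sketch of this determinant test, again on an arbitrary symmetric matrix assumed only for illustration:

import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
n = A.shape[0]

# A_j = determinant of the submatrix formed by the first j rows and columns of A
minors = [np.linalg.det(A[:j, :j]) for j in range(1, n + 1)]
print(minors)                                    # approximately [4.0, 11.0]

# Positive definite: every A_j > 0.  Negative definite: sign of A_j equals (-1)**j.
print("positive definite:", all(m > 0 for m in minors))
print("negative definite:", all(m * (-1) ** j > 0 for j, m in enumerate(minors, start=1)))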

Page 5
MULTIVARIABLE OPTIMIZATION WITH
EQUALITY CONSTRAINTS
Definition
The problem is stated as: Minimize f(X), where X = (x1, x2, . . . , xn), subject to the equality constraints gj(X) = 0, j = 1, 2, . . . , m.

Here m is less than or equal to n; otherwise (if m > n), the problem becomes overdefined and, in general, there will be no solution.

Page 6
Solution by Direct Substitution
For a problem with n variables and m equality constraints,
it is theoretically possible to solve simultaneously the m
equality constraints and express any set of m variables in
terms of the remaining n − m variables.

When these expressions are substituted into the original objective function, the result is a new objective function involving only n − m variables.
The new objective function is not subjected to any constraint, and hence its optimum can be found by using the unconstrained optimization techniques already discussed.
In short, this process converts the constrained optimization problem into an unconstrained optimization problem.
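
A minimal SymPy sketch of direct substitution on an assumed problem (minimize f = x1² + x2² subject to x1 + x2 − 1 = 0; this example is an assumption for illustration, not taken from the slides):

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)

# Assumed example: n = 2 variables, m = 1 equality constraint
f = x1**2 + x2**2
constraint = sp.Eq(x1 + x2 - 1, 0)

# Step 1: solve the constraint for one variable (x2 in terms of x1)
x2_expr = sp.solve(constraint, x2)[0]            # x2 = 1 - x1

# Step 2: substitute into the objective -> unconstrained function of n - m = 1 variable
f_reduced = sp.expand(f.subs(x2, x2_expr))       # 2*x1**2 - 2*x1 + 1

# Step 3: apply the unconstrained necessary and sufficient conditions
x1_star = sp.solve(sp.diff(f_reduced, x1), x1)[0]
print(x1_star, x2_expr.subs(x1, x1_star))        # x1* = 1/2, x2* = 1/2
print(sp.diff(f_reduced, x1, 2))                 # 4 > 0, so this point is a minimum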

Page 7
..Contd..
Example
Maximize

Subject to

This problem has three design variables and one equality constraint, so the equality constraint can be used to eliminate any one of the design variables from the objective function. Here we choose to eliminate x3.

Page 8
..Contd..
Thus the objective function becomes

Now it can be maximized as an unconstrained function in two variables.
Necessary Condition
The necessary conditions ∂f/∂x1 = 0 and ∂f/∂x2 = 0 are then simplified.

Page 9
..Contd..
From which it follows that x∗1 = x∗2 = 1/√3 and hence x∗3 =
1/√3. This solution gives the maximum volume of the box as:

To find whether this solution corresponds to a maximum or a minimum, we apply the sufficiency conditions to the reduced objective function f(x1, x2).

That is, we have to evaluate the Hessian matrix of f(x1, x2) at (x∗1, x∗2).

Page 10
..Contd..
Hessian Matrix

The Hessian matrix of f is negative definite at (x∗1, x∗2). Hence the point (x∗1, x∗2) corresponds to the maximum of f.
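
The equations of this example appear only as images on the original slides. The values quoted (x∗1 = x∗2 = x∗3 = 1/√3 and a "maximum volume of the box") are consistent with the classic box-in-a-unit-sphere problem, so the following SymPy sketch assumes that form (maximize f = x1·x2·x3 subject to x1² + x2² + x3² = 1); this is an assumption for illustration rather than the slide's exact statement:

import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)

# Assumed problem: maximize f = x1*x2*x3 subject to x1**2 + x2**2 + x3**2 = 1,
# with x3 eliminated by direct substitution.
x3 = sp.sqrt(1 - x1**2 - x2**2)
f = x1 * x2 * x3

# Necessary conditions for the reduced two-variable function;
# multiplying by x3 (> 0) clears the square roots before solving.
g1 = sp.expand(sp.diff(f, x1) * x3)
g2 = sp.expand(sp.diff(f, x2) * x3)
print(sp.solve([g1, g2], (x1, x2), dict=True))   # [{x1: sqrt(3)/3, x2: sqrt(3)/3}]

# Sufficient condition: Hessian of the reduced f at (1/sqrt(3), 1/sqrt(3))
point = {x1: 1/sp.sqrt(3), x2: 1/sp.sqrt(3)}
H = sp.simplify(sp.hessian(f, (x1, x2)).subs(point))
print(H)
print(H[0, 0] < 0, sp.simplify(H.det()) > 0)     # A1 < 0 and A2 > 0 -> negative definite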

Page 11
Solution by the Method of Lagrange Multipliers
Problem with Two Variables and One Constraint
Consider the problem: Minimize f(x1, x2) subject to g(x1, x2) = 0.

A necessary condition for f to have a minimum at some point (x∗1, x∗2) is that the total derivative of f(x1, x2) with respect to x1 must be zero at (x∗1, x∗2). By setting the total differential of f(x1, x2) equal to zero, we obtain

df = (∂f/∂x1) dx1 + (∂f/∂x2) dx2 = 0
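
The remaining equations of this slide are not shown in the text; a standard reconstruction of the steps that follow (in the usual textbook notation, e.g. Rao's Engineering Optimization, which may differ slightly from the slide) is:

Since the constraint $g(x_1, x_2) = 0$ must also hold at the candidate point, $dg = (\partial g/\partial x_1)\,dx_1 + (\partial g/\partial x_2)\,dx_2 = 0$, so, assuming $\partial g/\partial x_2 \neq 0$,

$$dx_2 = -\frac{\partial g/\partial x_1}{\partial g/\partial x_2}\,dx_1 .$$

Substituting into $df = 0$ gives

$$\frac{df}{dx_1} = \left(\frac{\partial f}{\partial x_1} - \frac{\partial g/\partial x_1}{\partial g/\partial x_2}\,\frac{\partial f}{\partial x_2}\right)\Bigg|_{(x_1^*,\,x_2^*)} = 0 .$$

Defining $\lambda = -\left.\dfrac{\partial f/\partial x_2}{\partial g/\partial x_2}\right|_{(x_1^*,\,x_2^*)}$, this condition and the definition of $\lambda$ can be written as

$$\frac{\partial f}{\partial x_1} + \lambda\,\frac{\partial g}{\partial x_1} = 0, \qquad \frac{\partial f}{\partial x_2} + \lambda\,\frac{\partial g}{\partial x_2} = 0,$$

which, together with $g(x_1^*, x_2^*) = 0$, are exactly the conditions produced by the Lagrange function introduced on the next slide.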

Page 12
..Contd..
The preceding expressions can be generalized by introducing a function of three variables, called the Lagrange function:

L(x1, x2, λ) = f(x1, x2) + λ g(x1, x2)

By treating L as a function of the three variables x1, x2, and λ, the necessary conditions for its extremum are given by

∂L/∂x1 = ∂f/∂x1 + λ ∂g/∂x1 = 0
∂L/∂x2 = ∂f/∂x2 + λ ∂g/∂x2 = 0
∂L/∂λ = g(x1, x2) = 0
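
A minimal SymPy sketch of these conditions on an assumed problem (minimize f = x1² + x2² subject to g = x1 + x2 − 1 = 0; the example is an assumption for illustration, not taken from the slides):

import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lambda', real=True)

# Assumed example: minimize f subject to g = 0
f = x1**2 + x2**2
g = x1 + x2 - 1

# Lagrange function L(x1, x2, lambda) = f + lambda*g
L = f + lam * g

# Necessary conditions: dL/dx1 = dL/dx2 = dL/dlambda = 0
conditions = [sp.diff(L, v) for v in (x1, x2, lam)]
print(sp.solve(conditions, (x1, x2, lam), dict=True))
# [{lambda: -1, x1: 1/2, x2: 1/2}]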

Page 13
..Contd..
Sufficient Condition (General Case)
A sufficient condition for X∗ to be a relative minimum (relative maximum) is that every root z of the following determinantal equation be positive (negative):
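
The determinantal equation itself is not shown in the text; in the usual textbook statement (e.g. Rao's Engineering Optimization, quoted here as an assumption about what the slide displays), for n variables and m equality constraints it is the polynomial equation in z

$$
\begin{vmatrix}
L_{11}-z & L_{12} & \cdots & L_{1n} & g_{11} & \cdots & g_{m1}\\
L_{21} & L_{22}-z & \cdots & L_{2n} & g_{12} & \cdots & g_{m2}\\
\vdots & \vdots & & \vdots & \vdots & & \vdots\\
L_{n1} & L_{n2} & \cdots & L_{nn}-z & g_{1n} & \cdots & g_{mn}\\
g_{11} & g_{12} & \cdots & g_{1n} & 0 & \cdots & 0\\
\vdots & \vdots & & \vdots & \vdots & & \vdots\\
g_{m1} & g_{m2} & \cdots & g_{mn} & 0 & \cdots & 0
\end{vmatrix} = 0,
$$

where $L_{ij} = \dfrac{\partial^2 L}{\partial x_i\,\partial x_j}(X^*, \lambda^*)$ and $g_{ij} = \dfrac{\partial g_i}{\partial x_j}(X^*)$. If every root z is positive, X∗ is a relative minimum; if every root is negative, X∗ is a relative maximum.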

Page 14
..Contd..
Example

The Lagrange function is:

Necessary Condition

Page 15
..Contd..
Gives

The Maximum Becomes

If A0 = 24π, the optimum solution becomes

Page 16
..Contd..
Sufficient Condition

Since the value of z is negative, the point (x∗1, x∗2) corresponds to the maximum of f.
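
The equations of this example appear only as images on the original slides. The quoted data (A0 = 24π and a maximum confirmed by z < 0) match the classic cylindrical-tin problem (maximize the volume f = π·x1²·x2 subject to the surface-area constraint 2πx1² + 2πx1x2 = A0), so the following SymPy sketch assumes that formulation rather than the slide's exact statement:

import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)   # assumed: x1 = radius, x2 = height
lam = sp.symbols('lambda', real=True)
A0 = 24 * sp.pi

# Assumed problem: maximize the volume of a closed cylindrical tin
# subject to a fixed total surface area A0.
f = sp.pi * x1**2 * x2
g = 2 * sp.pi * x1**2 + 2 * sp.pi * x1 * x2 - A0

L = f + lam * g
conditions = [sp.diff(L, v) for v in (x1, x2, lam)]
sol = sp.solve(conditions, (x1, x2, lam), dict=True)
print(sol)               # [{x1: 2, x2: 4, lambda: -1}]
print(f.subs(sol[0]))    # maximum volume 16*pi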

Page 17
Reading Assignment
Constrained variation

Page 18
Thank You!

Page 19
