Unit 4 (Optimization)

The document discusses optimization methods for one-dimensional and multi-dimensional problems, including the golden section search, Newton's method, random search, and gradient method. It provides examples of applying the golden section search and Newton's method to maximize the function f(x) = 2sin(x) - x^2/10. The examples show initializing the domain bounds and iteratively applying the methods to converge on the maximum value.


Numerical Methods (CIE-301)

Unit-4
(Optimization)
Dr. Muhammad Majid Gulzar (CIE-KFUPM)
Contents (Unit-4):
1) One-dimensional Unconstrained Optimization
 Golden-Section Search (Sec 13.1)
 Newton's Method (Sec 13.3)
2) Multi-dimensional Unconstrained Optimization
 Random Search (Sec 14.1)
 Gradient Method (Sec 14.2)


One-dimensional Unconstrained Optimization

1-D Unconstrained Optimization:
 A function f(x) is unimodal if it is monotonically increasing for x ≤ m and monotonically decreasing for x ≥ m.
 If so, the maximum value of f(x) is f(m).
 A unimodal function has only one peak/dip and no other local maxima/minima.

[Figure: a unimodal curve f(x) with its single maximum f(m) at x = m]


Golden-Section Search (Sec 13.1)

Golden-Section Search:
 This method is an example of a bracketing method that depends on initial guesses bracketing a single optimum.

 Rather than using only 2 function values (which are sufficient to detect a sign change, and hence a root), we need 3 function values to detect whether a maximum/minimum has occurred.

 The initial step of the golden-section search algorithm involves choosing two interior points according to the golden ratio.


Golden-Section Search:

These conditions must hold:

l0 = l1 + l2

l1/l0 = l2/l1

Substituting the first condition into the second:

l1/(l1 + l2) = l2/l1

(l1 + l2)/l1 = l1/l2

1 + l2/l1 = l1/l2


Golden-Section Search:

1 + l2/l1 = l1/l2  →  1 + R = 1/R,  where R = l2/l1

R² + R − 1 = 0

R = (−b ± √(b² − 4ac)) / (2a)

Taking the positive root:

R = (−1 + √(1 − 4(−1))) / 2 = (√5 − 1)/2 = 0.61803

This is the golden ratio.


Golden-Section Search:

d = ((√5 − 1)/2) (xu − xl)

x1 = xl + d
x2 = xu − d

f(x1) < f(x2)  →  x1 becomes the new xu
f(x1) > f(x2)  →  x2 becomes the new xl
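The update rules above can be sketched in code. A minimal Python sketch for maximization (the function name, tolerance, and iteration cap are illustrative choices, not from the slides):

```python
import math

def golden_section_max(f, xl, xu, tol=1e-6, max_iter=100):
    """Golden-section search for the maximum of a unimodal f on [xl, xu]."""
    R = (math.sqrt(5) - 1) / 2          # golden ratio, ~0.61803
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d             # two interior points, x2 < x1
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if f1 < f2:                     # maximum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2     # old x2 becomes the new x1
            d = R * (xu - xl)
            x2 = xu - d
            f2 = f(x2)                  # only one new evaluation needed
        else:                           # maximum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1     # old x1 becomes the new x2
            d = R * (xu - xl)
            x1 = xl + d
            f1 = f(x1)
        if abs(xu - xl) < tol:
            break
    return (x1 + x2) / 2

xmax = golden_section_max(lambda x: 2*math.sin(x) - x**2/10, 0, 4)
```

Because the interior points follow the golden ratio, each iteration reuses one of the two previous function values, so only one new evaluation of f is needed per step.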


Golden-Section Search (Example):

f(x) = 2 sin(x) − x²/10,  xl = 0, xu = 4

Interval: 0 --- 1.528 --- 2.472 --- 4

d = ((√5 − 1)/2)(xu − xl) = ((√5 − 1)/2)(4 − 0) = 2.472

x1 = xl + d = 0 + 2.472 = 2.472
x2 = xu − d = 4 − 2.472 = 1.528

f(x1) = f(2.472) = 2 sin(2.472) − 2.472²/10 = 0.63
f(x2) = f(1.528) = 2 sin(1.528) − 1.528²/10 = 1.765


Golden-Section Search (Example):
𝑓𝑓 𝑥𝑥1 < 𝑓𝑓 𝑥𝑥2 −−→ 𝑥𝑥1 = 𝑥𝑥𝑢𝑢 0 −− −1.528 −− −2.472 −− −4

𝑥𝑥𝑙𝑙 = 0, 𝑥𝑥2 = 𝑥𝑥1 = 1.582, 𝑥𝑥1 = 𝑥𝑥𝑢𝑢 = 2.472


0 −− −0.944 −− −1.528 −− −2.472
5−1 5−1
𝑑𝑑 = 𝑥𝑥𝑢𝑢 − 𝑥𝑥𝑙𝑙 = 2.472 − 0 = 1.528
2 2

𝑥𝑥1 = 𝑥𝑥𝑙𝑙 + 𝑑𝑑 = 0 + 1.528 = 1.528


𝑥𝑥2 = 𝑥𝑥𝑢𝑢 − 𝑑𝑑 = 2.472 − 1.528 = 0.944

1.5282
𝑓𝑓 𝑥𝑥1 = 𝑓𝑓 1.528 = 2 sin(1.528) − = 𝟏𝟏. 𝟕𝟕𝟕𝟕𝟕𝟕
10
0.9442
𝑓𝑓 𝑥𝑥2 = 𝑓𝑓 0.944 = 2 sin(0.944) − = 𝟏𝟏. 𝟓𝟓𝟓𝟓𝟓𝟓
10
Dr. Muhammad Majid Gulzar (CIE-KFUPM)
Golden-Section Search (Example):

f(x1) > f(x2)  →  x2 becomes the new xl

Old x2 becomes the new xl = 0.944,  old x1 becomes the new x2 = 1.528,  xu = 2.472

Interval: 0.944 --- 1.528 --- 1.888 --- 2.472

d = ((√5 − 1)/2)(xu − xl) = ((√5 − 1)/2)(2.472 − 0.944) = 0.944

x1 = xl + d = 0.944 + 0.944 = 1.888
x2 = xu − d = 2.472 − 0.944 = 1.528

f(x1) = f(1.888) = 2 sin(1.888) − 1.888²/10 = 1.543
f(x2) = f(1.528) = 2 sin(1.528) − 1.528²/10 = 1.765


Newton's Method (Sec 13.3)

Newton's Method:
 Newton's method is an open method, similar to Newton-Raphson, because it does not require initial guesses that bracket the optimum.

x_{i+1} = x_i − f′(x_i) / f″(x_i)


Newton's Method (Example):

f(x) = 2 sin(x) − x²/10,  x0 = 2.5

f′(x) = 2 cos(x) − x/5,  f″(x) = −2 sin(x) − 1/5

x_{i+1} = x_i − f′(x_i)/f″(x_i) = x_i − (2 cos(x_i) − x_i/5) / (−2 sin(x_i) − 1/5)
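The iteration formula for this example can be run directly. A minimal Python sketch (the helper name `newton_opt` is illustrative):

```python
import math

def newton_opt(x, iters):
    """Newton's method for an optimum: x_{i+1} = x_i - f'(x_i)/f''(x_i)."""
    for _ in range(iters):
        f1 = 2*math.cos(x) - x/5        # f'(x)
        f2 = -2*math.sin(x) - 1/5       # f''(x)
        x = x - f1/f2
    return x

x1 = newton_opt(2.5, 1)   # first iterate, ~0.995
x4 = newton_opt(2.5, 4)   # after a few iterations, near the maximum ~1.4276
```

Note that the iteration converges to where f′(x) = 0; whether that point is a maximum or minimum must be checked from the sign of f″.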


Newton's Method (Example):

For i = 0:  x1 = x0 − f′(x0)/f″(x0) = 2.5 − (2 cos(2.5) − 2.5/5) / (−2 sin(2.5) − 1/5) = 0.995

f(x1) = f(0.995) = 2 sin(0.995) − 0.995²/10 = 1.578

For i = 1:  x2 = x1 − f′(x1)/f″(x1) = 0.995 − (2 cos(0.995) − 0.995/5) / (−2 sin(0.995) − 1/5) = 1.469

f(x2) = f(1.469) = 2 sin(1.469) − 1.469²/10 = 1.774


Class Activity (Examples):

Perform 3 iterations to maximize
f(x) = 4x − 1.8x² + 1.2x³ − 0.3x⁴

a) Using Golden-Section Search (xl = −2, xu = 4)
b) Using Newton's Method (x0 = 3)

Golden-Section Search:
d = ((√5 − 1)/2)(xu − xl) = 3.7082, 2.2918, 1.4164
x1 = xl + d,  x2 = xu − d
f(x1) < f(x2) → x1 becomes the new xu;  f(x1) > f(x2) → x2 becomes the new xl
f(x1) = 5.008, 5.647, 2.936
f(x2) = 1.042, 5.008, 5.647

Newton's Method:
x_{i+1} = x_i − f′(x_i)/f″(x_i)
f(x1) = 5.743
f(x2) = 5.883
f(x3) = 5.885
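The activity's numbers can be checked quickly by running both update formulas; a sketch assuming the iteration counts stated above:

```python
def f(x):
    return 4*x - 1.8*x**2 + 1.2*x**3 - 0.3*x**4

# Golden-section search: 3 iterations starting from [-2, 4]
R = (5**0.5 - 1) / 2
xl, xu = -2.0, 4.0
history = []
for _ in range(3):
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d
    history.append((f(x1), f(x2)))
    if f(x1) < f(x2):
        xu = x1                          # maximum lies in [xl, x1]
    else:
        xl = x2                          # maximum lies in [x2, xu]

# Newton's method: 3 iterations starting from x0 = 3
x = 3.0
newton_vals = []
for _ in range(3):
    df  = 4 - 3.6*x + 3.6*x**2 - 1.2*x**3   # f'(x)
    d2f = -3.6 + 7.2*x - 3.6*x**2           # f''(x)
    x -= df/d2f
    newton_vals.append(f(x))
```

This re-evaluates f at the interior points rather than reusing values, which is fine for a hand check even though it wastes evaluations.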


Multi-dimensional Unconstrained Optimization

Multi-D Unconstrained Optimization:
 The function may have multiple global and local optima.
 Most real-life optimization problems are multimodal.

[Figure: a multimodal curve f(x) with several peaks and dips]


Optimality (Minimization/Maximization):
 Local optimum (minimization): the smallest function value in its neighborhood; there can be multiple local optima.
 Global optimum (minimization): the smallest function value in the whole feasible region.
 If the function is convex, only the global optimum exists (there are no local optima).
 For multimodal functions, most algorithms fail to determine the global optimum.

[Figure: a multimodal curve f(x) showing local and global minima]


Random Search (Sec 14.1)

Random Search:
 This method repeatedly evaluates the function at randomly selected values of the independent variables.

 Given enough samples, it finds the global optimum rather than getting trapped at a local optimum.

 Its major shortcoming is that as the number of independent variables grows, the effort required can become onerous.


Random Search (Example):

f(x, y) = y − x − 2x² − 2xy − y²,  x ∈ [−2, 2],  y ∈ [1, 3]

 For a random number r ∈ [0, 1]:

x = xl + (xu − xl) r        y = yl + (yu − yl) r
x = −2 + (2 + 2) r          y = 1 + (3 − 1) r
x = −2 + 4r                 y = 1 + 2r
Random Search (Example):

maxf = -1E9
For j = 1 To n
    x = -2 + 4 * Rnd
    y = 1 + 2 * Rnd
    fn = y - x - 2 * x ^ 2 - 2 * x * y - y ^ 2
    If fn > maxf Then
        maxf = fn
        maxx = x
        maxy = y
    End If
Next j
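The VB loop above can be rendered in Python; a sketch (the seed and sample count are arbitrary choices, not from the slides):

```python
import random

def random_search(n, seed=1):
    """Randomly sample f(x, y) = y - x - 2x^2 - 2xy - y^2 over x in [-2, 2], y in [1, 3]."""
    rng = random.Random(seed)
    maxf, maxx, maxy = -1e9, None, None
    for _ in range(n):
        x = -2 + 4 * rng.random()       # scale r in [0, 1) to [-2, 2)
        y = 1 + 2 * rng.random()        # scale r in [0, 1) to [1, 3)
        fn = y - x - 2*x**2 - 2*x*y - y**2
        if fn > maxf:                   # keep the best sample seen so far
            maxf, maxx, maxy = fn, x, y
    return maxf, maxx, maxy

maxf, maxx, maxy = random_search(100000)
```

Setting the partial derivatives to zero shows the true maximum is f(−1, 1.5) = 1.25, so with many samples maxf should approach 1.25 from below.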




Gradient Method (Sec 14.2)

Gradient Method:
 Gradient methods explicitly use derivative information to generate efficient algorithms to locate optima.

 The first derivative tells us when we have reached an optimum, since the derivative is zero at that point.

 Further, the sign of the second derivative tells us whether we have reached a minimum (positive second derivative) or a maximum (negative second derivative).


Gradient Method (Example):

f(x, y) = x y²,  evaluated at (x, y) = (2, 2)

f(2, 2) = 2(2²) = 8

∂f/∂x = y² = 2² = 4

∂f/∂y = 2xy = 2(2)(2) = 8

∇f = (∂f/∂x) i + (∂f/∂y) j = 4i + 8j


Gradient Method (Example):

∇f = 4i + 8j

θ = tan⁻¹(8/4) = 1.107 radians (= 63.4°)

‖∇f‖ = √(4² + 8²) = 8.944

(OR) via the directional derivative:

g′(0) = (∂f/∂x) cos θ + (∂f/∂y) sin θ

g′(0) = 4 cos(1.107) + 8 sin(1.107) = 8.944
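The same numbers can be reproduced with the standard library; a short sketch (`grad_f` is an illustrative name):

```python
import math

def grad_f(x, y):
    """Gradient of f(x, y) = x*y^2."""
    return (y**2, 2*x*y)                # (df/dx, df/dy)

fx, fy = grad_f(2, 2)                   # (4, 8)
theta = math.atan2(fy, fx)              # steepest-ascent direction, ~1.107 rad
magnitude = math.hypot(fx, fy)          # gradient magnitude, ~8.944
slope = fx*math.cos(theta) + fy*math.sin(theta)   # directional derivative g'(0)
```

The directional derivative along the steepest-ascent direction equals the gradient magnitude, which is exactly the "(OR)" check on the slide.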


Maxima/Minima/Saddle Point:
 A maximum or minimum is located at a stationary point.
1) Stationary points are determined by equating the gradient (Jacobian) of the function to zero.
2) The second derivative (Hessian) at the stationary point determines its nature.

For a two-variable function:

|H| = (∂²f/∂x²)(∂²f/∂y²) − (∂²f/∂x∂y)²

Hessian (2nd Derivative)                          | Nature of Extremum
Single Variable  | Multi Variable                 |
Positive         | |H| > 0 and ∂²f/∂x² > 0        | Minimum
Negative         | |H| > 0 and ∂²f/∂x² < 0        | Maximum
0                | |H| < 0                        | Saddle Point (not possible to decide)
Single Variable Function:

1st Derivative → Get Stationary Points (D1 = 0) → 2nd Derivative:
  D2 = Negative → Maximum
  D2 = Positive → Minimum
  D2 = Zero → Saddle Point

Multi Variable Function:

Equate Jacobian to zero (J = 0) → Get Stationary Points → Evaluate Hessian:
  H > 0 and ∂²f/∂x² < 0 → Maximum
  H > 0 and ∂²f/∂x² > 0 → Minimum
  H < 0 → Saddle Point


Single Variable Function (Example):

f(x) = x³ + 3x² − 6x

Maxima or minima occur at stationary points.

Jacobian = J = df/dx = 3x² + 6x − 6

3x² + 6x − 6 = 0

Since this is a quadratic, the number of stationary points will be 2.

x = (−b ± √(b² − 4ac)) / (2a)

x = 0.732, −2.732


Single Variable Function (Example):

f(x) = x³ + 3x² − 6x

Jacobian = J = df/dx = 3x² + 6x − 6

Hessian = H = f″ = d²f/dx² = 6x + 6

x       | f″       | f      | Extremum
0.732   | 10.392   | −2.39  | Minimum
−2.732  | −10.392  | 18.39  | Maximum
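The table above can be reproduced in a few lines; a sketch using the quadratic formula for the stationary points:

```python
def f(x):
    return x**3 + 3*x**2 - 6*x

def d2f(x):
    return 6*x + 6                      # Hessian (second derivative)

# Stationary points: J = 3x^2 + 6x - 6 = 0, via the quadratic formula
a, b, c = 3, 6, -6
disc = (b**2 - 4*a*c) ** 0.5
roots = [(-b + disc)/(2*a), (-b - disc)/(2*a)]   # 0.732 and -2.732

# Classify each stationary point by the sign of the second derivative
labels = {x: ("Minimum" if d2f(x) > 0 else "Maximum") for x in roots}
```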


Multi Variable Functions (Example):

f(x, y) = 2xy + 2x − x² − 2y²

Jacobian = J:

∂f/∂x = 2y + 2 − 2x
∂f/∂y = 2x − 4y

J = 0  →  2y + 2 − 2x = 0
          2x − 4y = 0

2y + 2 = 2x
2x = 4y

y = 1,  x = 2


Multi Variable Functions (Example):

∂f/∂x = 2y + 2 − 2x
∂f/∂y = 2x − 4y

∂²f/∂x² = −2        ∂²f/∂x∂y = 2
∂²f/∂y² = −4        ∂²f/∂y∂x = 2

|H| = (∂²f/∂x²)(∂²f/∂y²) − (∂²f/∂x∂y)²

|H| = (−2)(−4) − (2)² = 4

|H| > 0 and ∂²f/∂x² < 0

f(2, 1) = Maximum
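The |H| test for this example, as a small sketch (variable names are illustrative):

```python
# f(x, y) = 2xy + 2x - x^2 - 2y^2; its second partials are constant:
fxx, fyy, fxy = -2.0, -4.0, 2.0

H = fxx * fyy - fxy**2                  # |H| = (-2)(-4) - 2^2 = 4
if H > 0 and fxx < 0:
    nature = "Maximum"                  # this branch fires for f at (2, 1)
elif H > 0 and fxx > 0:
    nature = "Minimum"
else:
    nature = "Saddle point" if H < 0 else "Inconclusive"
```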
Maxima/Minima/Saddle Point:
 A maximum or minimum is located at a stationary point.
1) Stationary points are determined by equating the gradient (Jacobian) of the function to zero.
2) The second derivative (Hessian) at the stationary point determines its nature.

Hessian (2nd Derivative)               | Nature of Extremum
Single Variable  | Multi Variable      |
Positive         | Positive Definite   | Minimum
Negative         | Negative Definite   | Maximum
0                | Indefinite          | Saddle Point (not possible to decide)
Maxima/Minima/Saddle Point:
 A maximum or minimum is located at a stationary point.
1) Stationary points are determined by equating the gradient (Jacobian) of the function to zero.
2) The second derivative (Hessian) at the stationary point determines its nature.

Positive Definite: a matrix is positive definite if the determinants of all upper-left sub-matrices are positive.

Negative Definite: a matrix is negative definite if the determinants satisfy D_i < 0 for odd i and D_i > 0 for even i.

Indefinite: the determinant of the matrix is not zero, and the matrix is neither positive definite nor negative definite.


Single Variable Function:

1st Derivative → Get Stationary Points (D1 = 0) → 2nd Derivative:
  D2 = Negative → Maximum
  D2 = Positive → Minimum
  D2 = Zero → Saddle Point

Multi Variable Function:

Equate Jacobian to zero (J = 0) → Get Stationary Points → Evaluate Hessian:
  H_i = Negative Definite → Maximum
  H_i = Positive Definite → Minimum
  H_i = Indefinite → Saddle Point


Multi Variable Functions (Example):

f(x) = 3x1² + 2x2² + x3² − 2x1x2 + 2x2x3 − 2x1x3 − 6x1 − 4x2 − 2x3

Jacobian = J:

∂f/∂x1 = 6x1 − 2x2 − 2x3 − 6
∂f/∂x2 = −2x1 + 4x2 + 2x3 − 4
∂f/∂x3 = −2x1 + 2x2 + 2x3 − 2

Setting J = 0 gives a system of 3 linear equations in 3 unknowns, so there is a single stationary point:

J = 0  →  [ 6  −2  −2 ] [x1]   [6]
          [−2   4   2 ] [x2] = [4]
          [−2   2   2 ] [x3]   [2]

Ax = b
Multi Variable Functions (Example):

Ax = b  →  x = A⁻¹ b

A = [ 6  −2  −2 ]
    [−2   4   2 ]
    [−2   2   2 ]

Augment A with the identity and apply Gauss-Jordan elimination:

[ 6  −2  −2  ⋮  1  0  0 ]
[−2   4   2  ⋮  0  1  0 ]
[−2   2   2  ⋮  0  0  1 ]

R1/6:
[ 1  −1/3  −1/3  ⋮  1/6  0  0 ]
[−2   4     2    ⋮  0    1  0 ]
[−2   2     2    ⋮  0    0  1 ]

R2 + 2R1, R3 + 2R1:
[ 1  −1/3  −1/3  ⋮  1/6  0  0 ]
[ 0  10/3   4/3  ⋮  1/3  1  0 ]
[ 0   4/3   4/3  ⋮  1/3  0  1 ]
Multi Variable Functions (Example):

3R2/10:
[ 1  −1/3  −1/3  ⋮  1/6   0     0 ]
[ 0   1     2/5  ⋮  1/10  3/10  0 ]
[ 0   4/3   4/3  ⋮  1/3   0     1 ]

R1 + R2/3, R3 − 4R2/3:
[ 1   0  −1/5  ⋮  1/5   1/10  0 ]
[ 0   1   2/5  ⋮  1/10  3/10  0 ]
[ 0   0   4/5  ⋮  1/5  −2/5   1 ]

5R3/4:
[ 1   0  −1/5  ⋮  1/5   1/10  0   ]
[ 0   1   2/5  ⋮  1/10  3/10  0   ]
[ 0   0   1    ⋮  1/4  −1/2   5/4 ]

R1 + R3/5, R2 − 2R3/5:
[ 1   0   0  ⋮  1/4   0     1/4  ]
[ 0   1   0  ⋮  0     1/2  −1/2  ]
[ 0   0   1  ⋮  1/4  −1/2   5/4  ]
Multi Variable Functions (Example):

A⁻¹ = [ 1/4   0    1/4 ]
      [ 0     1/2 −1/2 ]
      [ 1/4  −1/2  5/4 ]

Ax = b  →  x = A⁻¹ b

Stationary Point = x = [ 1/4   0    1/4 ] [6]   [2]
                       [ 0     1/2 −1/2 ] [4] = [1]
                       [ 1/4  −1/2  5/4 ] [2]   [2]
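Rather than inverting by hand, the stationary point drops out of a linear solver; a sketch using NumPy (assumed available):

```python
import numpy as np

A = np.array([[ 6, -2, -2],
              [-2,  4,  2],
              [-2,  2,  2]], dtype=float)
b = np.array([6, 4, 2], dtype=float)

A_inv = np.linalg.inv(A)                # matches the Gauss-Jordan result above
x = np.linalg.solve(A, b)               # stationary point, [2, 1, 2]
```

In practice `np.linalg.solve` is preferred over forming the inverse explicitly; the inverse is computed here only to compare with the hand elimination.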


Multi Variable Functions (Example):

f(x) = 3x1² + 2x2² + x3² − 2x1x2 + 2x2x3 − 2x1x3 − 6x1 − 4x2 − 2x3

∂f/∂x1 = 6x1 − 2x2 − 2x3 − 6
∂f/∂x2 = −2x1 + 4x2 + 2x3 − 4
∂f/∂x3 = −2x1 + 2x2 + 2x3 − 2

Hessian = H = [ ∂²f/∂x1²     ∂²f/∂x1∂x2   ∂²f/∂x1∂x3 ]   [ 6  −2  −2 ]
              [ ∂²f/∂x2∂x1   ∂²f/∂x2²     ∂²f/∂x2∂x3 ] = [−2   4   2 ]
              [ ∂²f/∂x3∂x1   ∂²f/∂x3∂x2   ∂²f/∂x3²   ]   [−2   2   2 ]
Multi Variable Functions (Example):

H = [ 6  −2  −2 ]
    [−2   4   2 ]
    [−2   2   2 ]

H1 = 6 > 0

H2 = | 6  −2 | = 20 > 0
     |−2   4 |

H3 = | 6  −2  −2 |
     |−2   4   2 | = 16 > 0
     |−2   2   2 |

 As all 3 leading principal determinants are positive, H is a positive definite matrix and the stationary point x = (2, 1, 2) corresponds to a minimum.


Multi Variable Functions (Example):

f(x) = x1⁴ + x1² − 2x1²x2 + 2x2² − 2x1x2 + 4.5x1 − 4x2 + 4

Jacobian = J:

∂f/∂x1 = 4x1³ + 2x1 − 4x1x2 − 2x2 + 4.5
∂f/∂x2 = −2x1² + 4x2 − 2x1 − 4

The number of stationary points will be 3 (eliminating x2 from the system reduces it to a cubic in x1).

J = 0  →  4x1³ + 2x1 − 4x1x2 − 2x2 + 4.5 = 0
          −2x1² + 4x2 − 2x1 − 4 = 0


Multi Variable Functions (Example):

4x1³ + 2x1 − 4x1x2 − 2x2 + 4.5 = 0
−2x1² + 4x2 − 2x1 − 4 = 0

From the second equation:

4x2 = 2x1² + 2x1 + 4
x2 = 0.5x1² + 0.5x1 + 1

Substituting into the first equation:

4x1³ + 2x1 − 4x1(0.5x1² + 0.5x1 + 1) − 2(0.5x1² + 0.5x1 + 1) + 4.5 = 0
4x1³ + 2x1 − 2x1³ − 2x1² − 4x1 − x1² − x1 − 2 + 4.5 = 0
2x1³ − 3x1² − 3x1 + 2.5 = 0

x1 = −1.05, 0.61, 1.94  →  x2 = 1.03, 1.49, 3.85

Stationary Points = (x1, x2):  S1 = (−1.05, 1.03),  S2 = (0.61, 1.49),  S3 = (1.94, 3.85)
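The cubic's roots can be checked numerically; a sketch using NumPy's polynomial root finder (assumed available):

```python
import numpy as np

# Reduced equation for x1: 2*x1^3 - 3*x1^2 - 3*x1 + 2.5 = 0
x1_roots = np.sort(np.roots([2, -3, -3, 2.5]).real)   # all three roots are real

# Back-substitute into x2 = 0.5*x1^2 + 0.5*x1 + 1
x2_vals = 0.5*x1_roots**2 + 0.5*x1_roots + 1
```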


Multi Variable Functions (Example):

H = [ 12x1² + 2 − 4x2   −4x1 − 2 ]
    [ −4x1 − 2           4       ]

At S1 = (−1.05, 1.03):
H = [ 11.19  2.21 ]
    [  2.21  4    ]
H1 = 11.19 > 0,  H2 = |H| = 39.9 > 0  →  H = Positive definite  →  S1 = Minimum

At S2 = (0.61, 1.49):
H = [  0.52  −4.45 ]
    [ −4.45   4    ]
H1 = 0.52 > 0,  H2 = |H| = −17.7 < 0  →  H = Indefinite  →  S2 = Saddle Point

At S3 = (1.94, 3.85):
H = [ 31.79  −9.76 ]
    [ −9.76   4    ]
H1 = 31.79 > 0,  H2 = |H| = 31.9 > 0  →  H = Positive definite  →  S3 = Minimum
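The classification of all three points can be automated with the leading-principal-minor test; a NumPy sketch (function names are illustrative):

```python
import numpy as np

def hessian(x1, x2):
    """Hessian of f = x1^4 + x1^2 - 2*x1^2*x2 + 2*x2^2 - 2*x1*x2 + 4.5*x1 - 4*x2 + 4."""
    return np.array([[12*x1**2 + 2 - 4*x2, -4*x1 - 2],
                     [-4*x1 - 2,            4.0     ]])

def classify(x1, x2):
    H = hessian(x1, x2)
    H1, H2 = H[0, 0], np.linalg.det(H)   # leading principal minors
    if H1 > 0 and H2 > 0:
        return "Minimum"                 # positive definite
    if H1 < 0 and H2 > 0:
        return "Maximum"                 # negative definite
    return "Saddle point"                # indefinite

results = [classify(*s) for s in [(-1.05, 1.03), (0.61, 1.49), (1.94, 3.85)]]
```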
