Analytical Multidimensional (Multivariable) Unconstrained Optimization

Learning Objectives

After studying this chapter, you should be able to:
1. Describe the method of solving problems of multivariable unconstrained optimization.
2. Classify the multivariable optimization problems.
3. Explain the necessary and sufficient conditions for solving multivariable unconstrained optimization problems.
4. Solve unconstrained multivariable functions.

In Chapter 5, we formulated the single variable optimization problems without constraints. Now let us extend those concepts to solve multivariable optimization problems without constraints. The optimization of such problems proceeds in more than one direction. For instance, if there are two variables in the objective function, we call such problems two-dimensional, as we have to search in two directions. Similarly, a three-variable problem can be called three-dimensional, and so on. However, the method of solving is similar in all such cases, and therefore all these problems can be named multidimensional or multivariable unconstrained optimization problems.

Recollect that we made use of complete (ordinary) differentiation to solve single variable unconstrained optimization problems. Here we use the concept of partial differentiation in solving multivariable unconstrained optimization problems by analytical methods.

6.1 Classification of Multivariable Optimization Problems

The multivariable optimization problems can be classified broadly into two categories: multivariable unconstrained optimization, i.e. without constraints, and multivariable constrained optimization, i.e. with constraints. Further, since we categorize the constraints into two types, equality and inequality, we can further classify the multivariable constrained optimization problems into two types. Summarily, the classification of multivariable optimization problems is shown in Figure 6.1.
Figure 6.1 Classification of multivariable optimization problems: multivariable unconstrained optimization (i.e. without constraints) and multivariable constrained optimization (i.e. with constraints), the latter subdivided into equality constraints and inequality constraints.

In this chapter, we discuss the analytical solution methods for multivariable unconstrained optimization.

6.2 Optimization Techniques to Solve Unconstrained Multivariable Functions

Let us now discuss the necessary and sufficient conditions for the optimization of an unconstrained (without constraints) multivariable function, for which we make use of Taylor's series expansion of a multivariable function. Taylor's theorem/series expansion is given in Exhibit 6.1.

Exhibit 6.1 Taylor's Theorem/Series Expansion

In Taylor's formula with remainder, if the remainder Rₙ(x) → 0 as n → ∞, then we obtain

f(x) = f(a) + (x − a) f′(a) + ((x − a)²/2!) f″(a) + ⋯ + ((x − a)ⁿ/n!) f⁽ⁿ⁾(a) + ⋯

which is called the Taylor's series. When a = 0, we obtain the Maclaurin's series. It is assumed that f(x) has continuous derivatives up to the (n + 1)th order and that f⁽ⁿ⁺¹⁾(x) exists in the interval (a, x). Hence, to establish that lim_{n→∞} |Rₙ(x)| = 0, it is sufficient to show that

lim_{n→∞} (x − a)ⁿ/n! = 0

for any fixed numbers x and a.

6.3 Necessary Condition (Theorem 6.1)

If f(x) has an extreme point (maximum or minimum) at x = x* and if the first partial derivatives of f(x) exist at x*, then

∂f(x*)/∂x₁ = ∂f(x*)/∂x₂ = ⋯ = ∂f(x*)/∂xₙ = 0

6.4 Sufficient Condition (Theorem 6.2)

A sufficient condition for a stationary point x* to be an extreme point is that the matrix of second partial derivatives (Hessian matrix) of f(x) evaluated at x* is:

(i) Positive definite when x* is a relative minimum point.
(ii) Negative definite when x* is a relative maximum point.

6.5 Working Rule (Algorithm) for Unconstrained Multivariable Optimization

From Theorems 6.1 and 6.2, we can obtain the following procedure to solve the problems of multivariable functions without constraints.

Step 1: Check the function to ensure that it belongs to the category of multivariable unconstrained optimization problems, i.e. it should contain more than one variable, say x₁, x₂, …, xₙ, and no constraints.

Step 2: Find the first partial derivatives of the function w.r.t. x₁, x₂, …, xₙ.

Step 3: Equate all the first partial derivatives (obtained from Step 2) to zero and solve for the stationary values x₁*, x₂*, …, xₙ*.

Step 4: Find all the second partial derivatives of the function.

Step 5: Prepare the Hessian matrix. For example, if we have two variables x₁, x₂ in f(x), then the Hessian matrix is

J = | ∂²f/∂x₁²     ∂²f/∂x₁∂x₂ |
    | ∂²f/∂x₂∂x₁   ∂²f/∂x₂²   |

Step 6: Find the values of the determinants of the leading square sub-matrices J₁, J₂, …. In our example,

J₁ = | ∂²f/∂x₁² |   and   J₂ = | ∂²f/∂x₁²     ∂²f/∂x₁∂x₂ |
                               | ∂²f/∂x₂∂x₁   ∂²f/∂x₂²   |

Step 7: Evaluate whether x₁*, x₂* are relative maxima or minima depending on the positive or negative nature of J₁, J₂, and so on. The sign of J₁ decides whether J is positive or negative, while the sign of J₂ decides the definiteness or indefiniteness of J (Table 6.1). More clearly:

(i) If J₁ > 0, J₂ > 0, then J is positive definite and x* is a relative minimum.
(ii) If J₁ > 0, J₂ < 0, then J is indefinite and x* is a saddle point.
(iii) If J₁ < 0, J₂ > 0, then J is negative definite and x* is a relative maximum.
(iv) If J₁ < 0, J₂ < 0, then J is indefinite and x* is a saddle point.

Table 6.1 presents the sign notation of J.
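Steps 6 and 7 of the working rule for the two-variable case can be sketched in code. This is a minimal illustration of the J₁/J₂ test, not from the text; the function name and example Hessians are my own choices.

```python
# A sketch of Steps 6-7: given the second partial derivatives of a
# two-variable function at a stationary point, compute the leading
# principal minors J1, J2 and classify the point.

def classify_stationary_point(fxx, fxy, fyy):
    """Classify a stationary point from its Hessian [[fxx, fxy], [fxy, fyy]]."""
    j1 = fxx                      # Step 6: J1 = d2f/dx1^2
    j2 = fxx * fyy - fxy * fxy    # Step 6: J2 = det of the Hessian
    if j1 > 0 and j2 > 0:
        return "relative minimum"   # J positive definite
    if j1 < 0 and j2 > 0:
        return "relative maximum"   # J negative definite
    if j2 < 0:
        return "saddle point"       # J indefinite
    return "inconclusive"           # J2 = 0: this test gives no answer

# f = x1^2 + x2^2 has Hessian [[2, 0], [0, 2]] everywhere, so (0, 0) is a minimum;
# Hessian [[2, 0], [0, -2]] is indefinite, giving a saddle point.
print(classify_stationary_point(2, 0, 2))   # relative minimum
print(classify_stationary_point(2, 0, -2))  # saddle point
```

For n > 2 variables the same idea extends to the signs of all n leading principal minors of the Hessian.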
Table 6.1 Sign Notation of J

                            J₁ positive (J₁ > 0)          J₁ negative (J₁ < 0)
J₂ positive or J₂ > 0       J is positive definite and    J is negative definite and
(J is definite)             x* is a relative minimum      x* is a relative maximum
J₂ negative or J₂ < 0       J is indefinite and x* is     J is indefinite and x* is
(J is indefinite)           a saddle point                a saddle point

Note: If in a function of two variables f(x₁, x₂) the Hessian matrix is neither positive definite nor negative definite, then the point (x₁*, x₂*) at which ∂f/∂x₁ = ∂f/∂x₂ = 0 is called the saddle point, which means it may be a maximum or a minimum with respect to one variable when the other is fixed.

Illustration 6.2 Determine the extreme points of the following function and evaluate the function at these points:

f(x) = x₁³ + x₂³ + 2x₁² + 4x₂² + 6

Solution The necessary conditions for the existence of an extreme point are ∂f/∂x₁ = 0 and ∂f/∂x₂ = 0:

∂f/∂x₁ = 3x₁² + 4x₁ = x₁(3x₁ + 4) = 0     (i)
∂f/∂x₂ = 3x₂² + 8x₂ = x₂(3x₂ + 8) = 0     (ii)

From Eq. (i), we get x₁ = 0 or −4/3, and from Eq. (ii), we get x₂ = 0 or −8/3. Hence, the above equations are satisfied by the following points:

(0, 0), (0, −8/3), (−4/3, 0) and (−4/3, −8/3)

Now, let us find the nature of these extreme points using the sufficiency conditions, i.e. the second order partial derivatives of f:

∂²f/∂x₁² = 6x₁ + 4
∂²f/∂x₂² = 6x₂ + 8
∂²f/∂x₁∂x₂ = 0

The Hessian matrix of f is given by

J = | 6x₁ + 4    0       |
    | 0          6x₂ + 8 |

Hence, J₁ = |6x₁ + 4| and J₂ = (6x₁ + 4)(6x₂ + 8).

The values of J₁ and J₂ and the nature of the extreme points are given in Table 6.2.

Table 6.2 Sign Notation of J in Illustration 6.2

Point x*        J₁    J₂     Nature of J          Nature of x*        f(x*)
(0, 0)          +4    +32    Positive definite    Relative minimum    6
(0, −8/3)       +4    −32    Indefinite           Saddle point        418/27
(−4/3, 0)       −4    −32    Indefinite           Saddle point        194/27
(−4/3, −8/3)    −4    +32    Negative definite    Relative maximum    50/3

For a function whose second partial derivatives at x₁ = 0, x₂ = 0 are ∂²f/∂x₁² = 2, ∂²f/∂x₂² = −2 and ∂²f/∂x₁∂x₂ = 0, the Hessian matrix at x₁ = 0, x₂ = 0 is

J = | 2    0 |
    | 0   −2 |

We have J₁ = |2| = 2 and J₂ = (2)(−2) = −4. Since J₁ is positive (+2) and J₂ is negative (−4), the nature of J is indefinite and the point x₁* = 0, x₂* = 0 is a saddle point; the value of the function at this saddle point is f(0, 0) = 0.

Note: At the saddle point, if one of the values of x₁ or x₂ is fixed, the other may show an extremum.

Illustration 6.5 Two frictionless rigid bodies R₁ and R₂ are connected by three linear elastic springs of constants k₁, k₂ and k₃ as shown in Figure 6.3. The springs are at their natural positions when the applied force P is zero. Find the displacements x₁ and x₂ under a force of 26 units by using the principle of minimum potential energy, taking the spring constants k₁, k₂ and k₃ as 2, 3 and 4 respectively. Examine whether the displacements correspond to the minimum potential energy.
J₂ = 42 + 16 = 58 > 0

Since the values of J₁ and J₂ are positive, the matrix J is positive definite, and hence the values x₁* = 4 and x₂* = 7 correspond to the minimum potential energy.

Illustration 6.6 The profit on a contract of road laying is estimated as 26x₁ + 20x₂ + 4x₁x₂ − 3x₁² − 4x₂², where x₁ and x₂ denote the labour cost and the material cost respectively. Find the values of labour cost and material cost that maximize the profit on the contract.

Solution Given, profit Z(x₁, x₂) = 26x₁ + 20x₂ + 4x₁x₂ − 3x₁² − 4x₂².

Applying the necessary conditions, i.e. ∂Z/∂x₁ = 0 and ∂Z/∂x₂ = 0:

∂Z/∂x₁ = 26 + 4x₂ − 6x₁ = 0     (i)
∂Z/∂x₂ = 20 + 4x₁ − 8x₂ = 0     (ii)

On solving Eqs. (i) and (ii), we get x₁* = 9, x₂* = 7.

Now, let us apply the sufficiency condition for Z to be a maximum at (x₁*, x₂*). This is verified by testing the negative definiteness of the Hessian matrix. The second partial derivatives of Z are

∂²Z/∂x₁² = −6, ∂²Z/∂x₂² = −8 and ∂²Z/∂x₁∂x₂ = 4

Hence the Hessian matrix is

J = | −6    4 |
    |  4   −8 |

The determinants of the leading square sub-matrices of J are

J₁ = |−6| = −6 < 0
J₂ = (−6)(−8) − (4)(4) = 48 − 16 = 32 > 0

Since J₁ is negative (−6) and J₂ is positive (+32), the matrix J is negative definite. Hence the values x₁* = 9 and x₂* = 7 give a relative maximum of Z.

Hence, maximum profit:

Z_max = 26(9) + 20(7) + 4(9)(7) − 3(9)² − 4(7)²
      = 234 + 140 + 252 − 243 − 196 = 187 units

Summary

Multivariable optimization problems without constraints are explained in this chapter. The solution methods for such problems make use of Taylor's series expansion of multivariable functions and of partial differentiation. The basic idea behind these methods is to reduce the problem as nearly as possible to single variable problems: partial differentiation is the differentiation with respect to one variable at a time, keeping the other variables fixed.
However, the mathematical concepts of matrices, calculus and other simple algebraic principles are prerequisites for these problems. We will take up the multivariable constrained problems in the next chapter.

Key Concepts

Multidimensional optimization: Optimization problems routed in more than one direction, i.e. having more than one variable.

Saddle point: If there is a function of two variables f(x₁, x₂) whose Hessian matrix is neither positive definite nor negative definite, then the point (x₁*, x₂*) at which ∂f/∂x₁ = ∂f/∂x₂ = 0 is called the saddle point.

Hessian matrix: J = [∂²f/∂xᵢ∂xⱼ] is the matrix of the second partial derivatives.

Positive definite: If J₁ > 0, J₂ > 0, then J is positive definite and x* is a relative minimum.

Indefinite (J₁ > 0, J₂ < 0): J is indefinite and x* is a saddle point.

Negative definite: If J₁ < 0, J₂ > 0, then J is negative definite and x* is a relative maximum.

Indefinite (J₁ < 0, J₂ < 0): J is indefinite and x* is a saddle point.
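As a closing check, the stationary-point analysis of Illustrations 6.2 and 6.6 can be reproduced in code. This is a sketch of my own, using exact rational arithmetic from the standard library `fractions` module; the equations and numbers are taken from the illustrations above.

```python
# Recompute Table 6.2 (Illustration 6.2) and the optimum of Illustration 6.6.
from fractions import Fraction as F

# Illustration 6.2: f = x1^3 + x2^3 + 2*x1^2 + 4*x2^2 + 6
def f(x1, x2):
    return x1**3 + x2**3 + 2*x1**2 + 4*x2**2 + 6

points = [(F(0), F(0)), (F(0), F(-8, 3)), (F(-4, 3), F(0)), (F(-4, 3), F(-8, 3))]
for x1, x2 in points:
    # stationarity: Eqs. (i) and (ii) must hold
    assert x1 * (3*x1 + 4) == 0 and x2 * (3*x2 + 8) == 0
    j1 = 6*x1 + 4                     # J1
    j2 = (6*x1 + 4) * (6*x2 + 8)      # J2 (the cross derivative is 0)
    print((x1, x2), j1, j2, f(x1, x2))

# Illustration 6.6: Z = 26*x1 + 20*x2 + 4*x1*x2 - 3*x1^2 - 4*x2^2
def Z(x1, x2):
    return 26*x1 + 20*x2 + 4*x1*x2 - 3*x1**2 - 4*x2**2

# Stationarity gives the linear system 6*x1 - 4*x2 = 26, -4*x1 + 8*x2 = 20,
# solved here by Cramer's rule.
det = 6*8 - (-4)*(-4)                 # 32
x1 = F(26*8 - (-4)*20, det)           # 9
x2 = F(6*20 - 26*(-4), det)           # 7
print(x1, x2, Z(x1, x2))              # 9 7 187
```

The J₁, J₂ values printed for Illustration 6.2 match Table 6.2, and the recomputed maximum profit agrees with Z_max = 187 units.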