Mathematics For Economics (ECON 104)
Theorem: Suppose that f and g are continuously differentiable functions, defined on an open convex subset S ⊆ R^2. Suppose that there exists a number λ∗ such that (x∗, y∗) is a stationary point of

L(x, y) = f(x, y) − λ∗(g(x, y) − c).

Lastly, suppose that g(x∗, y∗) = c. Then, if L(x, y) is concave on S, (x∗, y∗) solves the problem of maximizing f(x, y) subject to g(x, y) = c; if L(x, y) is convex on S, it solves the corresponding minimization problem.
Example: Consider the consumer’s utility maximization problem:

max_{x,y} f(x, y) = x^α y^β subject to p_x x + p_y y = I.

Assume that x ≥ 0, y ≥ 0, α > 0, β > 0 and α + β ≤ 1. Then, the stationary point (from the FOCs of the Lagrangian),

(x∗, y∗) = ( αI / ((α + β) p_x), βI / ((α + β) p_y) ),

is a global max point. To see that, consider the Hessian of the Lagrangian, HL(x, y).
Example (continued): Find |HL(x, y)|:

|HL(x, y)| = αβ [(α − 1)(β − 1) − αβ] x^(2α−2) y^(2β−2)
           = αβ [1 − α − β] x^(2α−2) y^(2β−2) ≥ 0.
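As a quick numerical check of this closed-form solution, here is a small Python sketch (not part of the original notes); the parameter values and the use of scipy.optimize are assumptions chosen only for illustration.

# Numerical sanity check of the Cobb-Douglas solution (illustrative values only).
from scipy.optimize import minimize

alpha, beta = 0.3, 0.5        # assumed parameters with alpha + beta <= 1
px, py, I = 2.0, 4.0, 100.0   # assumed prices and income

# Closed-form stationary point from the first-order conditions above.
x_star = alpha / (alpha + beta) * I / px
y_star = beta / (alpha + beta) * I / py

# Maximize x^alpha * y^beta subject to px*x + py*y = I (minimize the negative).
res = minimize(
    lambda v: -(v[0] ** alpha) * (v[1] ** beta),
    x0=[1.0, 1.0],
    bounds=[(1e-9, None), (1e-9, None)],
    constraints=[{"type": "eq", "fun": lambda v: px * v[0] + py * v[1] - I}],
)

print((x_star, y_star))  # (18.75, 15.625)
print(res.x)             # numerically close to the closed-form point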
Summary

• Without constraint:

                Local Max or Min                  Global Max or Min
  FOC           fi'(x∗, y∗) = 0 for all i         fi'(x∗, y∗) = 0 for all i
  SOC           H is ND or PD at (x∗, y∗)         H is NSD or PSD for all (x, y)

• With constraint:

                Local Max or Min                  Global Max or Min
  FOC           Li'(x∗, y∗) = 0 for all i         Li'(x∗, y∗) = 0 for all i
  SOC           |H̄| > 0 or < 0 at (x∗, y∗)        HL is NSD or PSD for all (x, y)
The Lagrangian method can be extended to a problem with three variables and two equality constraints:

max_{x,y,z} f(x, y, z) subject to g^1(x, y, z) = c1 and g^2(x, y, z) = c2.
The Lagrangian for this problem is

L = f(x, y, z) − λ1 [g^1(x, y, z) − c1] − λ2 [g^2(x, y, z) − c2].
Theorem (Necessity for Lagrangian Method): Suppose that f, g^1 and g^2 are continuously differentiable, and that (x∗, y∗, z∗) is an interior point of the domain S that is also a local maximum (or minimum) point of f subject to g^1(x, y, z) = c1 and g^2(x, y, z) = c2. Assume also that the following constraint qualification is satisfied: for any feasible point (x, y, z) satisfying g^1(x, y, z) = c1 and g^2(x, y, z) = c2, the two vectors ∇g^1(x, y, z) and ∇g^2(x, y, z) are not parallel to each other.

Then, there exists a unique pair of numbers (λ1, λ2) such that

L1' = ∂f(x∗, y∗, z∗)/∂x − λ1 ∂g^1(x∗, y∗, z∗)/∂x − λ2 ∂g^2(x∗, y∗, z∗)/∂x = 0;
L2' = ∂f(x∗, y∗, z∗)/∂y − λ1 ∂g^1(x∗, y∗, z∗)/∂y − λ2 ∂g^2(x∗, y∗, z∗)/∂y = 0;
L3' = ∂f(x∗, y∗, z∗)/∂z − λ1 ∂g^1(x∗, y∗, z∗)/∂z − λ2 ∂g^2(x∗, y∗, z∗)/∂z = 0.
Example: Consider the problem:

max_{x,y,z} f(x, y, z) = x + y subject to g^1(x, y, z) = x^2 + 2y^2 + z^2 = 1 and g^2(x, y, z) = x + y + z = 1.

Define

D1 = {(x, y, z) ∈ R^3 | g^1(x, y, z) = 1} and D2 = {(x, y, z) ∈ R^3 | g^2(x, y, z) = 1}.

Since g^1 and g^2 are polynomials, they are continuous. Thus, D1 and D2 are closed.
Define D = D1 ∩ D2, which is also closed (the intersection of closed sets is closed). Why is D bounded? Let B = {(x, y, z) ∈ R^3 | x^2 + y^2 + z^2 ≤ 1} denote the closed unit ball. For any (x, y, z) ∈ D1,

x^2 + y^2 + z^2 = (x^2 + 2y^2 + z^2) − y^2 = 1 − y^2 ≤ 1 ⇒ (x, y, z) ∈ B.

So B contains D1, and hence D1 is bounded. Since D ⊆ D1, D is bounded as well; being closed and bounded, D is compact, so a maximum of the continuous function x + y over D exists by the extreme value theorem.
To find the solutions, we set up the Lagrangian:

L(x, y, z, λ1, λ2) = x + y − λ1 (x^2 + 2y^2 + z^2 − 1) − λ2 (x + y + z − 1).

The first-order conditions are

1 − 2λ1 x − λ2 = 0                  (1)
1 − 4λ1 y − λ2 = 0                  (2)
−2λ1 z − λ2 = 0,                    (3)

and the constraints are

x^2 + 2y^2 + z^2 = 1                (4)
x + y + z = 1.                      (5)
From (1) and (2), we obtain
2λ1(x − 2y) = 0.
Then, we divide our argument into two cases: (i) λ1 = 0 or (ii) x = 2y.

Case (i) λ1 = 0: then (1) implies λ2 = 1 while (3) implies λ2 = 0, a contradiction. So case (i) yields no candidates.
Case (ii) x = 2y: Plugging this relationship into the constraints (4) and (5), we get

6y^2 + z^2 = 1                      (6)
3y + z = 1.                         (7)

From equations (6) and (7), we get

6y^2 + (1 − 3y)^2 = 1
⇒ 15y^2 − 6y = 0
⇒ 3y(5y − 2) = 0.

Thus, y = 0 or y = 2/5.
We summarize the two candidates:

y∗ = 0, x∗ = 0, z∗ = 1, λ1∗ = −1/2, λ2∗ = 1;
y∗ = 2/5, x∗ = 4/5, z∗ = −1/5, λ1∗ = 1/2, λ2∗ = 1/5.

Hence, (x∗, y∗, z∗) = (4/5, 2/5, −1/5) is the candidate for the maximum point (note that the objective function is x + y).
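The same two candidates can be recovered symbolically. The following sympy sketch (an illustration, not part of the original notes) solves the first-order conditions (1)-(3) together with the constraints (4)-(5).

# Symbolic check of the two-constraint example (sketch).
import sympy as sp

x, y, z, l1, l2 = sp.symbols('x y z lambda1 lambda2', real=True)

L = (x + y) - l1 * (x**2 + 2*y**2 + z**2 - 1) - l2 * (x + y + z - 1)

# Stationarity in (x, y, z) plus the two constraints (4) and (5).
eqs = [sp.diff(L, v) for v in (x, y, z)] + [x**2 + 2*y**2 + z**2 - 1, x + y + z - 1]

for s in sp.solve(eqs, [x, y, z, l1, l2], dict=True):
    print(s, 'objective =', s[x] + s[y])
# (0, 0, 1) gives objective 0; (4/5, 2/5, -1/5) gives objective 6/5, the maximum.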
Claim: The constraint qualification is satisfied.

Compute

∇g^1(x, y, z) = (∂g^1/∂x, ∂g^1/∂y, ∂g^1/∂z) = (2x, 4y, 2z)   and   ∇g^2(x, y, z) = (∂g^2/∂x, ∂g^2/∂y, ∂g^2/∂z) = (1, 1, 1).
Suppose the two gradients were parallel at some point, i.e., (2x, 4y, 2z) = t (1, 1, 1) for some t ≠ 0, which requires x = z = 2y. Since (x, y, z) is a feasible point, it must satisfy g^2(x, y, z) = x + y + z = 1. This implies that (x, y, z) = (2/5, 1/5, 2/5). Then, we confirm

g^1(x, y, z) = x^2 + 2y^2 + z^2 = 4/25 + 2/25 + 4/25 = 2/5 ≠ 1.

So the gradients are never parallel at a feasible point, and the constraint qualification holds.
Example: Consider the problem:

min_{x,y,z} x^2 + y^2 + z^2 subject to x + 2y + z = 30 and 2x − y − 3z = 10.

We set up the Lagrangian:

L = x^2 + y^2 + z^2 − λ1 (x + 2y + z − 30) − λ2 (2x − y − 3z − 10).

The first-order conditions are

2x − λ1 − 2λ2 = 0                   (1)
2y − 2λ1 + λ2 = 0                   (2)
2z − λ1 + 3λ2 = 0,                  (3)

and the constraints are:

x + 2y + z = 30 and 2x − y − 3z = 10.

Equations (1), (2), and (3) imply

x = (λ1 + 2λ2)/2
y = (2λ1 − λ2)/2
z = (λ1 − 3λ2)/2.

Substituting these into the two equality constraints yields

x + 2y + z = 3λ1 − (3/2)λ2 = 30     (4)
2x − y − 3z = −(3/2)λ1 + 7λ2 = 10.  (5)
So, we have two unknowns (λ1, λ2) and two equations. Computing (4) + (5) × 2 yields (25/2) λ2 = 50, so λ2 = 4. Substituting back into (4) gives λ1 = 12, and hence (x∗, y∗, z∗) = (10, 10, 0).
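The arithmetic can be verified with a short sympy sketch (illustrative only) that solves (4) and (5) for (λ1, λ2) and recovers the candidate point.

# Solve equations (4) and (5) for (lambda1, lambda2), then back out (x, y, z).
import sympy as sp

l1, l2 = sp.symbols('lambda1 lambda2')

sol = sp.solve(
    [sp.Eq(3*l1 - sp.Rational(3, 2)*l2, 30),    # equation (4)
     sp.Eq(-sp.Rational(3, 2)*l1 + 7*l2, 10)],  # equation (5)
    [l1, l2],
)
print(sol)  # {lambda1: 12, lambda2: 4}

x = (sol[l1] + 2*sol[l2]) / 2
y = (2*sol[l1] - sol[l2]) / 2
z = (sol[l1] - 3*sol[l2]) / 2
print(x, y, z)  # 10 10 0, i.e. the candidate point is (10, 10, 0)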
Claim: (x∗, y∗, z∗) = (10, 10, 0) is the global min point. Indeed, the Hessian of L with respect to (x, y, z) is the diagonal matrix diag(2, 2, 2), which is positive definite for all (x, y, z); hence L is convex and the stationary point is a global minimum.
Many models in economics have inequality constraints. Consider the problem of maximizing f(x, y) subject to g(x, y) ≤ c. Denote the optimal choice by (x∗, y∗) and define f∗(c) = f(x∗, y∗). Then, we already established the following interpretation of λ:

df∗(c)/dc = λ.
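As an illustration of this envelope relationship, the following sympy sketch verifies df∗(c)/dc = λ for a simple made-up example (max xy subject to x + y = c); the example itself is not from the notes.

# Verify df*(c)/dc = lambda for max xy subject to x + y = c (illustrative example).
import sympy as sp

x, y, c, lam = sp.symbols('x y c lambda', positive=True)

L = x*y - lam*(x + y - c)
sol = sp.solve([sp.diff(L, x), sp.diff(L, y), x + y - c], [x, y, lam], dict=True)[0]

f_star = (x*y).subs({x: sol[x], y: sol[y]})  # value function f*(c) = c**2/4
print(sp.diff(f_star, c))                    # c/2
print(sol[lam])                              # c/2, matching df*(c)/dc = lambda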
Case 1: g(x∗, y∗) = c. The constraint binds at the optimum. Relaxing the constraint (increasing c) can only improve the optimal value, so λ = df∗(c)/dc ≥ 0.

Case 2: g(x∗, y∗) < c. The constraint is slack at the optimum. A small change in c does not affect the optimal choice, so λ = df∗(c)/dc = 0.
Theorem [Necessity of Kuhn-Tucker Conditions]: Suppose that (x∗, y∗) solves the problem:

max_{x,y} f(x, y) subject to g(x, y) ≤ c,

where f and g are continuously differentiable. Assume that the following constraint qualification is satisfied: for any feasible point (x, y) satisfying g(x, y) = c,

(g1'(x, y), g2'(x, y)) ≠ (0, 0).

Form

L = f(x, y) − λ(g(x, y) − c).

Then, there is a unique λ∗ ∈ R such that

L1'(x∗, y∗) = 0,
L2'(x∗, y∗) = 0,
λ∗ ≥ 0, g(x∗, y∗) ≤ c and λ∗ [g(x∗, y∗) − c] = 0.
Proof:

Step 2: λ ≥ 0 and g(x∗, y∗) ≤ c. The best choice with more options cannot be worse than the best choice with fewer options.
Step 3: λ [g(x∗, y∗) − c] = 0. If g(x∗, y∗) = c, the product is zero immediately. And if g(x∗, y∗) < c (as in the second case of Step 1), then λ = 0, so the product is again zero.
We reproduce the conditions:

L1'(x∗, y∗) = 0
L2'(x∗, y∗) = 0
λ ≥ 0, g(x∗, y∗) ≤ c and λ [g(x∗, y∗) − c] = 0.

These (necessary) conditions are known as the Kuhn-Tucker conditions, after Harold W. Kuhn and Albert W. Tucker.
Sufficiency of Kuhn-Tucker Conditions: Single Variable (Ch. 14.9)

• If L''xx(x, λ) ≤ 0 for all x, where λ is obtained from the Kuhn-Tucker conditions, then x∗ is the global max.

• If L''xx(x, λ) ≥ 0 for all x, where λ is obtained from the Kuhn-Tucker conditions, then x∗ is the global min.
Sufficiency of Kuhn-Tucker Conditions: One Constraint (Ch. 14.9)

Example 1: (one variable and one constraint) Consider the problem:

max_x −(x − 2)^2 subject to x ≥ 1.

Writing the constraint as 1 − x ≤ 0, the Lagrangian is L = −(x − 2)^2 − λ(1 − x), and the Kuhn-Tucker conditions are

L'x = −2(x − 2) + λ = 0
λ ≥ 0, x ≥ 1 and λ(1 − x) = 0.
From λ(1 − x) = 0, we have either λ = 0 or x = 1. Consider two cases:

Case 1: λ = 0. We have 2(x − 2) = 0 ⇒ x = 2, which satisfies x ≥ 1.

Case 2: x = 1. Then λ = 2(x − 2) = −2 < 0, which violates λ ≥ 0, so this case is ruled out.

Since L''xx(x, λ∗ = 0) = −2 < 0 for all x, x∗ = 2 is the solution to the constrained optimization problem.
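The case analysis can also be reproduced mechanically; the sympy sketch below (illustrative, not part of the notes) solves the stationarity condition in each case.

# Kuhn-Tucker cases for max -(x - 2)**2 subject to x >= 1 (sketch).
import sympy as sp

x, lam = sp.symbols('x lambda', real=True)
stationarity = sp.Eq(-2*(x - 2) + lam, 0)  # L'_x = 0

# Case 1: lambda = 0  ->  x = 2, which satisfies x >= 1.
print(sp.solve(stationarity.subs(lam, 0), x))   # [2]

# Case 2: x = 1 (constraint binds)  ->  lambda = -2 < 0, so K-T fails here.
print(sp.solve(stationarity.subs(x, 1), lam))   # [-2]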
Example 2: (one variable and one constraint) Consider the problem:

max_x −(x − 2)^2 subject to x ≥ 3.

The Lagrangian is L = −(x − 2)^2 − λ(3 − x), and the Kuhn-Tucker conditions are

L'x = −2(x − 2) + λ = 0
λ ≥ 0, x ≥ 3 and λ(3 − x) = 0.

From λ(3 − x) = 0, we have either λ = 0 or x = 3.
Case 1: λ = 0. Then x = 2, which violates x ≥ 3, so this case is infeasible.

Case 2: x = 3. Then λ = 2(x − 2) = 2 ≥ 0, so the Kuhn-Tucker conditions hold with λ∗ = 2.

Since L''xx(x, λ∗ = 2) = −2 < 0 for all x, x∗ = 3 is the solution to the constrained optimization problem.
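A quick numerical cross-check of Example 2 (a scipy sketch, illustrative only); the upper bound 100 below is an arbitrary large number used only so that the bounded solver applies.

# Numerical check: max -(x - 2)**2 subject to x >= 3, i.e. min (x - 2)**2 on [3, 100].
from scipy.optimize import minimize_scalar

res = minimize_scalar(lambda x: (x - 2) ** 2, bounds=(3, 100), method='bounded')
print(res.x)  # approximately 3, matching x* = 3 from the Kuhn-Tucker conditions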
Example 3: (two variables and one constraint) Consider the problem:

max_{x,y} xy s.t. x^2 + y^2 ≤ 1.

Set up the Lagrangian L = xy − λ(x^2 + y^2 − 1). The Kuhn-Tucker conditions are

L1' = y − 2λx = 0
L2' = x − 2λy = 0
λ ≥ 0, x^2 + y^2 ≤ 1 and λ(x^2 + y^2 − 1) = 0.
Case 1: λ = 0. Then the first-order conditions give y = 0 and x = 0, so (x, y) = (0, 0) with λ = 0 is a candidate.

Case 2: x^2 + y^2 = 1. Combining the two first-order conditions gives y = 4λ^2 y. If y = 0 then x = 0, contradicting x^2 + y^2 = 1, so λ = 1/2 (with y = x) or λ = −1/2 (with y = −x).
Now the candidates from Case 2 are:

x = 1/√2, y = 1/√2, λ = 1/2
x = −1/√2, y = −1/√2, λ = 1/2
x = 1/√2, y = −1/√2, λ = −1/2
x = −1/√2, y = 1/√2, λ = −1/2.

The last two violate λ ≥ 0 and are discarded.
Hence, we have three candidates that satisfy the K-T conditions:

x∗ = 0, y∗ = 0, λ∗ = 0
x∗ = 1/√2, y∗ = 1/√2, λ∗ = 1/2
x∗ = −1/√2, y∗ = −1/√2, λ∗ = 1/2.

Since the objective function is xy, the solutions are the last two.
Define

D = {(x, y) ∈ R^2 | x^2 + y^2 ≤ 1}.

D is bounded, as it is a closed ball itself. Let g(x, y) = x^2 + y^2. Since g(x, y) is a polynomial, it is continuous. Hence, D is closed. Thus, D is compact. f(x, y) = xy is also a polynomial, so f is continuous. By the extreme value theorem, a maximum exists, and it must be one of the candidates above.

We thus conclude that (x∗, y∗) = (1/√2, 1/√2) and (−1/√2, −1/√2) are the solutions and f achieves 1/2 as the maximum value.
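A numerical check of this conclusion (a scipy sketch, not part of the notes); depending on the starting point, the solver converges to one of the two symmetric maximizers.

# Numerical check: max xy subject to x**2 + y**2 <= 1.
from scipy.optimize import minimize

res = minimize(
    lambda v: -v[0] * v[1],  # minimize -xy
    x0=[0.5, 0.1],
    constraints=[{"type": "ineq", "fun": lambda v: 1 - v[0] ** 2 - v[1] ** 2}],
)
print(res.x, -res.fun)  # close to (1/sqrt(2), 1/sqrt(2)) with value 1/2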
m Inequality Constraints where m ≥ 2 (Ch. 14.9)
Consider the problem:
For any feasible point (x, y), the constraint qualification requires
us to check the following three cases:
Case 2: |J(x, y)| = 1, i.e., only one constraint is binding.
We set up the Lagrangian:
Minimization with Inequality Constraints
Sufficiency of Kuhn-Tucker Conditions: m Constraints (Ch. 14.9)