
Probability & Engineering Statistics (확률 및 통계)

Multiple Random Variables

5.1 Introduction

◈ Chapters 1–4: a single random variable defined on a given sample space

◈ Chapter 5: multivariate systems (two variables → bivariate random variables)
◈ A single random variable is a powerful tool for handling random problems, but typical engineering problems involve two or more dimensions, so the single-R.V. theory must be extended.
   e.g., an archer shooting arrows at a target
   - Fortunately, many situations of interest in engineering can be handled by the theory of two random variables.
   - Extended to the time domain, this is closely related to autocorrelation, cross-correlation, and covariance.
◈ Random Variable → Multiple Random Variables (random vector) → Random Process

■ 5.2 Joint CDFs of Bivariate Random Variables (or Vector Random Variables)

◈ Consider two random variables X and Y defined on the same sample space
- e.g., X: students' grades, Y: the same students' heights

◈ Joint CDF (joint cumulative distribution function) of bivariate random variables:

F_X,Y(x, y) = P[X ≤ x, Y ≤ y]

Marginal CDFs:
F_X(x) = P[X ≤ x]
F_Y(y) = P[Y ≤ y]

◈ If X and Y are independent:
F_X,Y(x, y) = F_X(x) F_Y(y)

■ Vector Random Variable

◈ Mapping from the sample space S to the joint sample space SJ

Joint sample space
~ range sample space
~ 2-dimensional product space

Remember that there are two mapping functions, one for X and one for Y!

◈ Define events A and B:

A = {X ≤ x}
B = {Y ≤ y}

Joint event A ∩ B = {X ≤ x and Y ≤ y}

◈ This extends to N dimensions (for now, focus on two!)



■ Joint Distribution Function

◈ Events A and B:

A = {X ≤ x},  F_X(x) = P{X ≤ x}
B = {Y ≤ y},  F_Y(y) = P{Y ≤ y}

◈ The probability of the joint event defines the joint probability distribution function:

Joint event A ∩ B = {X ≤ x and Y ≤ y}
F_X,Y(x, y) = P{X ≤ x and Y ≤ y} = P(A ∩ B)

◈ Example)
- SJ (joint sample space): (1,1), (2,1), and (3,3)
- Probabilities: P(1,1) = 0.2, P(2,1) = 0.3, and P(3,3) = 0.5
Find F_X,Y(x, y).

■ Joint Distribution
◈ Example) A fair coin is tossed twice. S = {(H,H), (H,T), (T,H), (T,T)}, each outcome with probability 1/4.
X = number of heads on the first toss, Y = number of heads on the second toss
- Mapping: Heads → 1, Tails → 0

- PMF: p_X,Y(x, y) = 1/4 for each (x, y) in {(0,0), (0,1), (1,0), (1,1)}

- CDF: computed by summing the PMF over x′ ≤ x, y′ ≤ y

X and Y are independent.
- Marginal PDFs:
f_X(x) = (2/4)δ(x) + (2/4)δ(x − 1)
f_Y(y) = (2/4)δ(y) + (2/4)δ(y − 1)
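The independence claim above can be verified by direct enumeration. A minimal sketch (variable names are illustrative), building the joint PMF of the two tosses and checking that it factors into its marginals:

```python
from fractions import Fraction
from itertools import product

# Two fair coin tosses; map Heads -> 1, Tails -> 0.
outcomes = list(product([0, 1], repeat=2))            # (first toss, second toss)

# Joint PMF: every outcome of the fair-coin experiment has probability 1/4.
joint = {(x, y): Fraction(1, 4) for (x, y) in outcomes}

# Marginal PMFs obtained by summing out the other variable.
p_X = {x: sum(v for (a, _), v in joint.items() if a == x) for x in (0, 1)}
p_Y = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (0, 1)}

# Independence holds iff the joint PMF equals the product of marginals everywhere.
independent = all(joint[(x, y)] == p_X[x] * p_Y[y] for (x, y) in outcomes)
```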

■ Properties of the Joint Distribution Function

◈ Compared with the one-dimensional case:

Single random variable:
(1) F_X(−∞) = 0
(2) F_X(∞) = 1
(3) 0 ≤ F_X(x) ≤ 1
(4) F_X(x1) ≤ F_X(x2) if x1 ≤ x2
(5) P{x1 < X ≤ x2} = F_X(x2) − F_X(x1)
(6) F_X(x⁺) = F_X(x)

Bivariate random variables:
(1) 0 ≤ F_X,Y(x, y) ≤ 1
(2) F_X,Y(x, y) is a nondecreasing function of both x and y
(3) F_X,Y(∞, ∞) = 1
(4,5) F_X,Y(−∞, ∞) = 0, F_X,Y(−∞, y) = 0, F_X,Y(x, −∞) = 0
      (the intersection with the empty set has probability zero)
(6) lim_{x→a⁺} F_X,Y(x, y) = F_X,Y(a, y)
(7) lim_{y→b⁺} F_X,Y(x, y) = F_X,Y(x, b)
(8) P{x1 < X ≤ x2, Y ≤ y} = F_X,Y(x2, y) − F_X,Y(x1, y)
(9) P{X ≤ x, y1 < Y ≤ y2} = F_X,Y(x, y2) − F_X,Y(x, y1)
(10) P{x1 < X ≤ x2, y1 < Y ≤ y2} = F_X,Y(x2, y2) − F_X,Y(x1, y2) − F_X,Y(x2, y1) + F_X,Y(x1, y1)
      (the corner term F_X,Y(x1, y1) is subtracted twice by the two preceding terms, so it is added back once — consider the intersection)

Marginal distribution functions:
F_X,Y(x, ∞) = F_X(x),  F_X,Y(∞, y) = F_Y(y)
(since {Y ≤ ∞} is the whole sample space S, intersecting S with {X ≤ x} leaves just {X ≤ x})

◈ Example 5.1) Draw a picture to understand it!
P{X > a and Y > b} = P{X > a ∩ Y > b}
                   = 1 − P{X ≤ a ∪ Y ≤ b}
                   = 1 − F_X(a) − F_Y(b) + F_X,Y(a, b)
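Property (10) can be sanity-checked numerically. A hedged sketch, assuming two independent Exp(1) variables (an assumption made here only so the joint CDF factors and the rectangle probability has a closed form to compare against):

```python
import math

# Joint CDF of two independent Exp(1) random variables: F(x, y) = F_X(x) F_Y(y).
def F(x, y):
    Fx = 1 - math.exp(-x) if x > 0 else 0.0
    Fy = 1 - math.exp(-y) if y > 0 else 0.0
    return Fx * Fy

# Property (10): rectangle probability via inclusion-exclusion on the joint CDF.
def rect_prob(x1, x2, y1, y2):
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# Direct closed form for independent exponentials, for comparison.
x1, x2, y1, y2 = 0.5, 1.5, 0.2, 2.0
direct = (math.exp(-x1) - math.exp(-x2)) * (math.exp(-y1) - math.exp(-y2))
```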

5.3 Discrete Random Variables

■ Discrete Random Variables


◈ Joint PMF (joint probability mass function): p_X,Y(x, y) = P[X = x, Y = y]

◈ Properties of the joint PMF
1) A PMF can be neither negative nor greater than 1: 0 ≤ p_X,Y(x, y) ≤ 1
2) The joint PMF sums to 1: Σ_x Σ_y p_X,Y(x, y) = 1
3) Summing the joint PMF up to a and b gives the CDF: Σ_{x≤a} Σ_{y≤b} p_X,Y(x, y) = F_X,Y(a, b)

◈ Example 5.2) Given the following joint PMF, find k.

p_X,Y(x, y) = k(2x + y),  x = 1, 2;  y = 1, 2
(cell values: p(1,1) = 3k, p(1,2) = 4k, p(2,1) = 5k, p(2,2) = 6k)

- The total must satisfy 18k = 1, so k = 1/18.
- Marginal PMFs?
p_X(x) = {7/18 at x = 1, 11/18 at x = 2}
p_Y(y) = {8/18 at y = 1, 10/18 at y = 2}
- Are X and Y independent?
At x = 1, y = 1: p_X,Y(1,1) = 3/18, while p_X(1) p_Y(1) = (7/18)(8/18) ≠ 3/18, so X and Y are dependent.
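Example 5.2 can be reproduced with exact rational arithmetic; a small sketch:

```python
from fractions import Fraction

# Joint PMF p_{X,Y}(x, y) = k(2x + y) on x, y in {1, 2}; k follows from total mass 1.
support = [(x, y) for x in (1, 2) for y in (1, 2)]
k = Fraction(1, sum(2 * x + y for x, y in support))   # 18k = 1  ->  k = 1/18

pmf = {(x, y): k * (2 * x + y) for x, y in support}
p_X = {x: pmf[(x, 1)] + pmf[(x, 2)] for x in (1, 2)}  # marginal of X
p_Y = {y: pmf[(1, y)] + pmf[(2, y)] for y in (1, 2)}  # marginal of Y

# Dependence: at least one cell differs from the product of its marginals.
independent = all(pmf[(x, y)] == p_X[x] * p_Y[y] for x, y in support)
```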

◈ Example 5.4) Draw a picture to understand it!




◈ Example) Discrete random variables
- SJ (joint sample space): (1,1), (2,1), and (3,3)
- Probabilities: P(1,1) = 0.2, P(2,1) = 0.3, and P(3,3) = 0.5
Find F_X,Y(x, y).

F_X,Y(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) u(x − x_n) u(y − y_m)
            = P(1,1) u(x − 1) u(y − 1) + P(2,1) u(x − 2) u(y − 1) + P(3,3) u(x − 3) u(y − 3)

f_X,Y(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) δ(x − x_n) δ(y − y_m)
            = P(1,1) δ(x − 1) δ(y − 1) + P(2,1) δ(x − 2) δ(y − 1) + P(3,3) δ(x − 3) δ(y − 3)
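The step-function form of F_X,Y above evaluates directly in code; a minimal sketch:

```python
# Point masses of the example and the unit step u(t) (right-continuous convention).
masses = {(1, 1): 0.2, (2, 1): 0.3, (3, 3): 0.5}

def u(t):
    return 1.0 if t >= 0 else 0.0

# F_{X,Y}(x, y) = sum over point masses of P(x_n, y_m) u(x - x_n) u(y - y_m).
def F(x, y):
    return sum(p * u(x - xn) * u(y - ym) for (xn, ym), p in masses.items())
```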

■ Marginal Distribution Functions

◈ The distribution function of one R.V. can be obtained by setting the argument of the other variable to infinity.

◈ Example) From the previous example:

F_X,Y(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) u(x − x_n) u(y − y_m)
            = P(1,1) u(x − 1) u(y − 1) + P(2,1) u(x − 2) u(y − 1) + P(3,3) u(x − 3) u(y − 3)

F_X(x) = F_X,Y(x, ∞) = P(1,1) u(x − 1) + P(2,1) u(x − 2) + P(3,3) u(x − 3)

F_Y(y) = F_X,Y(∞, y) = P(1,1) u(y − 1) + P(2,1) u(y − 1) + P(3,3) u(y − 3)
                     = 0.2 u(y − 1) + 0.3 u(y − 1) + 0.5 u(y − 3) = 0.5 u(y − 1) + 0.5 u(y − 3)

5.4 Continuous Random Variables

■ Joint Density Function


◈ The joint density is defined as the second (mixed) partial derivative of the joint distribution function; the concept extends to Nth-order derivatives:

f_X,Y(x, y) = ∂²F_X,Y(x, y) / ∂x∂y    (joint density function)

For discrete random variables it is defined using delta functions:

f_X,Y(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) δ(x − x_n) δ(y − y_m)

◈ Properties of the joint density function

(1) f_X,Y(x, y) ≥ 0
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_X,Y(x, y) dx dy = 1
(3) F_X,Y(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_X,Y(x, y) dy dx
(4) F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_X,Y(x, y) dy dx
    F_Y(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} f_X,Y(x, y) dx dy
(5) P{x1 < X ≤ x2, y1 < Y ≤ y2} = ∫_{x1}^{x2} ∫_{y1}^{y2} f_X,Y(x, y) dy dx
(6) f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y) dy
    f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx

◈ Example) For positive b, what value makes the following function a valid PDF?

g(x, y) = { b e^{−x} cos(y),  0 ≤ x ≤ 2 and 0 ≤ y ≤ π/2
          { 0,                all other x and y

Apply the two requirements
(1) f_X,Y(x, y) ≥ 0
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_X,Y(x, y) dx dy = 1:

∫_0^{π/2} ∫_0^2 b e^{−x} cos(y) dx dy = b [∫_0^2 e^{−x} dx] [∫_0^{π/2} cos(y) dy]
                                      = b (1 − e^{−2}) · 1 = 1

b = 1 / (1 − e^{−2})
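The value of b can be confirmed with a numerical double integral; a rough sketch using a midpoint Riemann sum (the grid size is an arbitrary choice):

```python
import math

# Claimed normalization constant for g(x, y) = b e^{-x} cos(y) on [0, 2] x [0, pi/2].
b = 1 / (1 - math.exp(-2))

# Midpoint Riemann sum of the density over its support; should be close to 1.
n = 300
dx, dy = 2 / n, (math.pi / 2) / n
total = sum(
    b * math.exp(-(i + 0.5) * dx) * math.cos((j + 0.5) * dy) * dx * dy
    for i in range(n)
    for j in range(n)
)
```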

■ Marginal Probability Density Function


◈ Marginal probability density functions (property (6)):

f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx

◈ Example 5.5) Find the marginal PDFs of the following joint PDF, and use them to decide whether X and Y are independent.

f_X,Y(x, y) = u(x) u(y) e^{−(x+y)}

f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y) dy = u(x) e^{−x} ∫_0^{∞} e^{−y} dy = u(x) e^{−x}

f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx = u(y) e^{−y} ∫_0^{∞} e^{−x} dx = u(y) e^{−y}

Since f_X(x) f_Y(y) = u(x) u(y) e^{−(x+y)} = f_X,Y(x, y), X and Y are independent.

■ Statistical Independence
◈ What does statistical independence mean for events?

P(A | B) = P(A),  P(B | A) = P(B)
P(A ∩ B) = P(A) P(B)

When events are independent, probability problems are often greatly simplified.

◈ For random variables:

P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}

In terms of the CDF and PDF: F_X,Y(x, y) = F_X(x) F_Y(y),  f_X,Y(x, y) = f_X(x) f_Y(y)

◈ Example) The following joint PDF is not independent:

f_X,Y(x, y) = u(x) u(y) x e^{−x(y+1)}

f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y) dy = u(x) x e^{−x} ∫_0^{∞} e^{−xy} dy = u(x) e^{−x}
f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx = u(y) ∫_0^{∞} x e^{−x(y+1)} dx = u(y) / (y + 1)²

f_X(x) f_Y(y) = u(x) u(y) e^{−x} / (y + 1)² ≠ f_X,Y(x, y)

◈ Example) The following joint PDF is independent:

f_X,Y(x, y) = (1/12) u(x) u(y) e^{−(x/4) − (y/3)}

f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y) dy = (1/12) u(x) e^{−(x/4)} ∫_0^{∞} e^{−(y/3)} dy = (1/4) u(x) e^{−(x/4)}
f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx = (1/12) u(y) e^{−(y/3)} ∫_0^{∞} e^{−(x/4)} dx = (1/3) u(y) e^{−(y/3)}

f_X(x) f_Y(y) = (1/12) u(x) u(y) e^{−(x/4) − (y/3)} = f_X,Y(x, y)
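The dependent example can be checked pointwise; a small sketch comparing the joint PDF with the product of its marginals at (1, 1):

```python
import math

def u(t):
    return 1.0 if t >= 0 else 0.0

# Dependent example: f_{X,Y}(x, y) = u(x) u(y) x e^{-x(y+1)}.
def f_joint(x, y):
    return u(x) * u(y) * x * math.exp(-x * (y + 1))

def f_X(x):                      # marginal derived in the text
    return u(x) * math.exp(-x)

def f_Y(y):                      # marginal derived in the text
    return u(y) / (y + 1) ** 2

joint_val = f_joint(1.0, 1.0)    # e^{-2}
prod_val = f_X(1.0) * f_Y(1.0)   # e^{-1} / 4
```

The two values differ, so the joint PDF cannot be the product of its marginals.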

5.5 Determining Probabilities from a Joint CDF


◈ Example 5.7) Draw a picture to understand it!



5.6 Conditional Distributions

■ 5.6.1 Conditional PMF for Discrete Random Variables

◈ Conditional probability

P(A | B) = P(A ∩ B) / P(B)

The probability of an event A may depend on a second event B.
If A and B are mutually exclusive, P(A | B) = 0.

◈ Conditional PMF

p_Y|X(y | x) = P[X = x, Y = y] / P[X = x] = p_X,Y(x, y) / p_X(x)
p_X|Y(x | y) = P[X = x, Y = y] / P[Y = y] = p_X,Y(x, y) / p_Y(y)

◈ Example 5.8)  p_X,Y(x, y) = (1/18)(2x + y),  x = 1, 2;  y = 1, 2
(cell values: 3/18, 4/18, 5/18, 6/18)

p_X(x) = Σ_y (1/18)(2x + y) = (1/18) Σ_{y=1}^{2} (2x + y) = (1/18)(4x + 3),  x = 1, 2

p_Y(y) = Σ_x (1/18)(2x + y) = (1/18) Σ_{x=1}^{2} (2x + y) = (1/18)(2y + 6),  y = 1, 2

p_Y|X(y | x) = p_X,Y(x, y) / p_X(x) = (2x + y) / (4x + 3)
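Example 5.8's conditional PMF can be tabulated exactly; a brief sketch:

```python
from fractions import Fraction

# Example 5.8: p_{X,Y}(x, y) = (1/18)(2x + y) for x, y in {1, 2}.
pmf = {(x, y): Fraction(2 * x + y, 18) for x in (1, 2) for y in (1, 2)}
p_X = {x: pmf[(x, 1)] + pmf[(x, 2)] for x in (1, 2)}   # equals (1/18)(4x + 3)

# Conditional PMF p_{Y|X}(y | x) = p_{X,Y}(x, y) / p_X(x) = (2x + y) / (4x + 3).
def p_Y_given_X(y, x):
    return pmf[(x, y)] / p_X[x]
```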

■ 5.6.2 Conditional PDF for Continuous Random Variables

◈ Conditional PDF

f_Y|X(y | x) = f_X,Y(x, y) / f_X(x)
f_X|Y(x | y) = f_X,Y(x, y) / f_Y(y)

◈ Example 5.9)

f_X,Y(x, y) = { x e^{−x(y+1)},  0 ≤ x < ∞; 0 ≤ y < ∞
             { 0, otherwise

f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y) dy = x e^{−x} ∫_0^{∞} e^{−xy} dy = x e^{−x} [−e^{−xy}/x]_0^{∞} = e^{−x}

f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx = ∫_0^{∞} x e^{−x(y+1)} dx

Integration by parts is needed: u = x, dv = e^{−x(y+1)} dx, so du = dx, v = −e^{−x(y+1)} / (y + 1):

f_Y(y) = [uv]_0^{∞} − ∫_0^{∞} v du = [−x e^{−x(y+1)} / (y + 1)]_0^{∞} + ∫_0^{∞} e^{−x(y+1)} / (y + 1) dx
       = 0 + [−e^{−x(y+1)} / (y + 1)²]_0^{∞} = 1 / (y + 1)²

f_Y|X(y | x) = f_X,Y(x, y) / f_X(x) = x e^{−x(y+1)} / e^{−x} = x e^{−xy}
f_X|Y(x | y) = f_X,Y(x, y) / f_Y(y) = x (y + 1)² e^{−x(y+1)}
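The integration-by-parts result f_Y(y) = 1/(y + 1)² can be cross-checked with a crude numerical integral (the truncation point and grid size are arbitrary choices):

```python
import math

# Numerically integrate x e^{-x(y+1)} over x to approximate f_Y(y).
def f_Y_numeric(y, upper=30.0, n=20000):
    dx = upper / n
    return sum(
        x * math.exp(-x * (y + 1)) * dx
        for x in ((i + 0.5) * dx for i in range(n))   # midpoint rule
    )
```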

■ 5.6.3 Conditional Means and Variances

◈ Conditional mean: μ_Y|X = E[Y | X = x]

◈ Conditional variance: σ²_Y|X = E[(Y − μ_Y|X)² | X = x] = E[Y² | X = x] − (E[Y | X = x])²

◈ Example 5.10) Find the conditional mean E[X | Y = y].

f_X,Y(x, y) = { e^{−(x/y)} e^{−y} / y,  0 ≤ x < ∞; 0 ≤ y < ∞
             { 0, otherwise

f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x, y) dx = ∫_0^{∞} (e^{−x/y} e^{−y} / y) dx = (e^{−y} / y) ∫_0^{∞} e^{−x/y} dx = e^{−y}

f_X|Y(x | y) = f_X,Y(x, y) / f_Y(y) = e^{−(x/y)} e^{−y} / (y e^{−y}) = e^{−(x/y)} / y

E[X | Y = y] = ∫_0^{∞} x f_X|Y(x | y) dx = (1/y) ∫_0^{∞} x e^{−x/y} dx

Integration by parts is needed: u = x, dv = e^{−x/y} dx → du = dx, v = −y e^{−x/y}:

(1/y) { [−xy e^{−x/y}]_0^{∞} + y ∫_0^{∞} e^{−x/y} dx } = (1/y)(0 + y · y) = y
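The result E[X | Y = y] = y can likewise be checked numerically; a sketch (truncation and grid sizes are arbitrary):

```python
import math

# Mean of the conditional density f_{X|Y}(x | y) = e^{-x/y} / y; should equal y.
def cond_mean(y, upper_mult=40.0, n=20000):
    upper = upper_mult * y            # truncate well into the exponential tail
    dx = upper / n
    return sum(
        x * math.exp(-x / y) / y * dx
        for x in ((i + 0.5) * dx for i in range(n))   # midpoint rule
    )
```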

■ 5.6.4 Simple Rule for Independence

◈ When are X and Y independent?
X and Y are independent when their joint PDF, over a rectangular region in Cartesian coordinates, has the form

f_X,Y(x, y) = constant × (x-factor) × (y-factor),  a ≤ x ≤ b, c ≤ y ≤ d

◈ Example 5.12) Is the following joint PDF independent?

f_X,Y(x, y) = (1/2) x³ y,  0 ≤ x ≤ 2, 0 ≤ y ≤ 1

Yes: it factors over a rectangle,

f_X,Y(x, y) = f_X(x) f_Y(y) = (x³/4)(2y) = (1/2) x³ y,  0 ≤ x ≤ 2, 0 ≤ y ≤ 1
5.7 Covariance and Correlation Coefficient

◈ Definition of the covariance of two random variables X and Y:

Cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)]
          = E[XY − μ_Y X − μ_X Y + μ_X μ_Y]
          = E[XY] − μ_X μ_Y

If X and Y are independent, the covariance is zero.

◈ Correlation coefficient
A measure of how good a prediction of the value of one of the two random variables can be formed based on an observed value of the other:

ρ_XY = Cov(X, Y) / √(Var(X) Var(Y)) = σ_XY / (σ_X σ_Y)

ρ_XY takes values from −1 to 1.
What do positive and negative correlation mean? (Positive: the variables tend to deviate from their means in the same direction; negative: in opposite directions.)

◈ Example 5.13) Check whether X and Y are independent:

f_X,Y(x, y) = { 25 e^{−5y},  0 ≤ x ≤ 0.2; 0 ≤ y < ∞
             { 0, otherwise
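In Example 5.13 the joint PDF factors as f_X(x) = 5 on [0, 0.2] times f_Y(y) = 5e^{−5y}, so X and Y are independent and their correlation should be near zero. A Monte Carlo sketch (sample size and seed are arbitrary choices):

```python
import math
import random

random.seed(0)

# X ~ Uniform(0, 0.2) and Y ~ Exp(rate 5), drawn independently as in Example 5.13.
N = 100_000
xs = [random.uniform(0, 0.2) for _ in range(N)]
ys = [random.expovariate(5) for _ in range(N)]

# Sample covariance and correlation coefficient.
mx = sum(xs) / N
my = sum(ys) / N
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / N)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / N)
rho = cov / (sx * sy)
```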

◈ Example 5.14) Each person waits 5 minutes and then leaves.

- Hans's arrival time X vs. Ann's arrival time Y
f_X(x) = 1/60, 0 ≤ x ≤ 60;  f_Y(y) = 1/30, 15 ≤ y ≤ 45

- Probability that they meet: P[|X − Y| ≤ 5]
- Probability that Ann arrives before Hans: P[Y ≤ X]

[Figure: the (X = Hans, Y = Ann) plane, with the lines Y = X + 5 and Y = X − 5 bounding the meeting strip E. Regions: A — Hans arrives before minute 10; B — Hans arrives between minutes 10 and 40 and Ann arrives more than 5 minutes after him; C — Hans arrives between minutes 20 and 50 and Ann arrives more than 5 minutes before him; D — Hans arrives after minute 50.]
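Both probabilities in Example 5.14 can be estimated by Monte Carlo simulation; a sketch (sample size and seed are arbitrary). Under the strip geometry above, the estimates should land near 300/1800 = 1/6 for meeting and 900/1800 = 1/2 for Ann arriving first:

```python
import random

random.seed(1)

# Example 5.14: Hans ~ Uniform(0, 60), Ann ~ Uniform(15, 45), arrivals independent.
N = 200_000
meet = before = 0
for _ in range(N):
    x = random.uniform(0, 60)     # Hans's arrival time (minutes)
    y = random.uniform(15, 45)    # Ann's arrival time (minutes)
    if abs(x - y) <= 5:           # each waits 5 minutes: they meet iff |X - Y| <= 5
        meet += 1
    if y <= x:                    # Ann arrives before (or with) Hans
        before += 1

p_meet = meet / N
p_before = before / N
```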
