
CS109A Notes for Lecture 2/23/96

Probability Space

Set of points, each with an attached probability (a nonnegative real number), such that the sum of the probabilities is 1.
We simplify in two ways:
a) The number of points is finite, say n.
b) The probability of each point is the same, 1/n.

Experiments

An experiment is the selection of a point in a probability space.
Under our "equally likely" assumption, all points have the same chance of being chosen.

Events

An event E is any set of points in a probability space.
PROB(E), the probability of an event E, is the fraction of the points that are in E.
Example: The event "a 4 is dealt" contains 4 of the 52 points, so its probability is 4/52 = 1/13.
The event "a heart is dealt" contains 13 points, so its probability is 13/52 = 1/4.
Generally, computing the probability of an event E involves two counts: the number of points in the entire probability space and the number of points in E.
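For concreteness, here is a small Python sketch (my own illustration, not part of the notes) of this probability space and the two events above:

    # Build the probability space: 52 equally likely points, one per card.
    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['hearts', 'spades', 'diamonds', 'clubs']
    space = [(r, s) for r in ranks for s in suits]   # 52 points

    def prob(event):
        """PROB(E): fraction of the points of the space that are in E."""
        return len(event) / len(space)

    four_dealt = [p for p in space if p[0] == '4']       # "a 4 is dealt"
    heart_dealt = [p for p in space if p[1] == 'hearts'] # "a heart is dealt"
    print(prob(four_dealt))    # 4/52  = 1/13
    print(prob(heart_dealt))   # 13/52 = 1/4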

Conditional Probability

Given that the outcome of an experiment is known to be in some event E, what is the probability that the outcome is also in some other event F?
Known as the conditional probability of F given E, written PROB(F/E).
PROB(F/E) = the fraction of the points in E that are also in F.

Example:

E = "a number card is selected" = 36 points, corresponding to ranks 2-10.
F = "the card is less than 7" = 24 points, corresponding to ranks A-6 (counting an ace as 1).
Of the 36 points in E, 20 are also in F (those for ranks 2-6).
Thus, PROB(F/E) = 20/36 = 5/9.
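The same counts can be checked directly; this is a sketch of mine using the same deck representation as the earlier snippet:

    # Rebuild the 52-point space: one (rank, suit) pair per card.
    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['hearts', 'spades', 'diamonds', 'clubs']
    space = [(r, s) for r in ranks for s in suits]

    number_cards = [p for p in space if p[0] not in ('A', 'J', 'Q', 'K')]        # E: ranks 2-10, 36 points
    less_than_7 = [p for p in space if p[0] in ('A', '2', '3', '4', '5', '6')]   # F: 24 points

    in_both = [p for p in number_cards if p in less_than_7]   # 20 points
    print(len(in_both) / len(number_cards))                   # PROB(F/E) = 20/36 = 5/9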

Independent Events

F is independent of E if PROB(F/E) = PROB(F).
Intuitively, whether F occurs does not depend on whether E occurs.
Example: (Cards) E = "suit is hearts." F = "rank is 5."
PROB(F/E) = 1/13, while PROB(F) = 4/52 = 1/13, so F is independent of E.
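A quick check of this example (again my own sketch, reusing the deck representation):

    # E = "suit is hearts", F = "rank is 5"; check PROB(F/E) = PROB(F).
    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['hearts', 'spades', 'diamonds', 'clubs']
    space = [(r, s) for r in ranks for s in suits]

    hearts = [p for p in space if p[1] == 'hearts']   # event E, 13 points
    fives = [p for p in space if p[0] == '5']         # event F, 4 points

    prob_f_given_e = len([p for p in hearts if p in fives]) / len(hearts)   # 1/13
    prob_f = len(fives) / len(space)                                        # 4/52 = 1/13
    print(prob_f_given_e, prob_f)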

Independence Goes Both Ways

If F is independent of E, then E is independent of F. Consider dividing the points of the space into four groups:
n1 = the points in neither E nor F,
n2 = the points in E but not F,
n3 = the points in both E and F,
n4 = the points in F but not E.
Independence of F from E says PROB(F/E) = PROB(F), that is,
n3/(n2+n3) = (n3+n4)/(n1+n2+n3+n4)
Swap the left denominator with the right numerator; this preserves the truth of the equality:
n3/(n3+n4) = (n2+n3)/(n1+n2+n3+n4)
The left side is PROB(E/F); the right side is PROB(E), which shows E is independent of F.
Example: Continuing the cards example above, PROB(E/F) = 1/4; PROB(E) = 13/52 = 1/4.
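As a sanity check (my own illustration), the four group counts for the hearts/rank-5 example bear this out:

    # Group counts for E = "suit is hearts", F = "rank is 5" in a 52-card deck.
    n1, n2, n3, n4 = 36, 12, 1, 3    # neither, E only, both, F only

    prob_f_given_e = n3 / (n2 + n3)                    # 1/13
    prob_f = (n3 + n4) / (n1 + n2 + n3 + n4)           # 4/52 = 1/13
    prob_e_given_f = n3 / (n3 + n4)                    # 1/4
    prob_e = (n2 + n3) / (n1 + n2 + n3 + n4)           # 13/52 = 1/4
    print(prob_f_given_e == prob_f, prob_e_given_f == prob_e)   # True True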

Complement Events

If E is an event, its complement Ē is the event "E does not occur."
PROB(Ē) = 1 - PROB(E).
If F is independent of E, then F is also independent of Ē.

Expected Value

f is some function of the points in a probability space.
EV(f) = the average, over all points p, of f(p).
Example: Space = the 52 cards. f = Blackjack value (Ace = 11; pictures = 10).
EV(f) = (4·11 + 4·2 + 4·3 + ··· + 4·9 + 16·10)/52 ≈ 7.31.
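A direct computation of this average (a sketch of mine, not from the notes):

    # Blackjack value of each rank; 4 cards per rank.
    def blackjack_value(rank):
        if rank == 'A':
            return 11
        if rank in ('J', 'Q', 'K'):
            return 10
        return int(rank)

    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    values = [blackjack_value(r) for r in ranks for _ in range(4)]   # one entry per card
    print(sum(values) / len(values))   # EV(f) = 380/52, about 7.31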

Randomized Algorithms

Instead of an exact answer from a slow algorithm, we may be able to get a "close" answer fast from one that takes random "guesses."
Example: Video data compression uses a technique (MPEG) involving matching sections of one frame to sections of the previous frame.
e.g., pieces of a moving car match themselves, displaced somewhat from the previous frame.
In our gross simplification, imagine that "sections" are n-by-n squares of black-or-white pixels, and we only ask whether this square matches exactly any of the squares of the previous frame.
To test exactly requires comparing n^2 pixels, an O(n^2) process.
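A minimal sketch of the exact test, assuming (my representation, not the notes') that each square is given as an n-by-n list of 0/1 pixel values:

    # Exact test: compare every one of the n^2 corresponding pixel pairs.
    def squares_match_exact(a, b):
        n = len(a)
        for i in range(n):
            for j in range(n):
                if a[i][j] != b[i][j]:
                    return False      # mismatch found
        return True                   # all n^2 comparisons succeeded: O(n^2) work

    a = [[0] * 8 for _ in range(8)]
    b = [[0] * 8 for _ in range(8)]
    print(squares_match_exact(a, b))  # True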

A Randomized Algorithm for Matching

Suppose we pick corresponding pixels in the two squares to compare at random. What is the expected number of comparisons needed before we discover that two unrelated squares are different?
Assumption: if the squares are unrelated, the probability is 1/2 that any pair of corresponding pixels match.
Dubious: think of a cloudless sky with a small plane.
If so, half the time we need only one comparison, half of the remaining half we need two, and so on.
Let f = the number of comparisons made. Then
EV(f) = 1(1/2) + 2(1/4) + 3(1/8) + ··· = sum over i ≥ 1 of i/2^i = sum over j ≥ 0 of 1/2^j (a tricky "triangular" argument explained in class) = 2.
Note the formula for the expected value involves computing the fraction of points in the probability space leading to each value of f; e.g., half the points have f(p) = 1.
Thus, the average running time is O(1), vs. O(n^2) for the exact algorithm.
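A small simulation of mine, under the same dubious assumption that each pixel pair of unrelated squares matches independently with probability 1/2, suggests the average is indeed about 2 comparisons:

    import random

    def comparisons_until_mismatch():
        """Compare random corresponding pixels of two unrelated squares;
        each comparison matches with probability 1/2."""
        count = 1
        while random.random() < 0.5:   # the pixels happen to match; keep comparing
            count += 1
        return count

    trials = 100_000
    avg = sum(comparisons_until_mismatch() for _ in range(trials)) / trials
    print(avg)   # close to 2, the expected value EV(f)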

Even More Randomness

We can devise a "Monte-Carlo" algorithm that doesn't always give the correct answer (it may say "they match" when they don't), but takes O(1) time rather than O(n^2) even for correct matches.
Key idea: Make no more than 20 tests. If they all succeed, say the squares match.
The probability of all 20 matching even though the squares are unrelated is 2^(-20), or about one in a million.
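A sketch of that idea (the 20-test cutoff is from the notes; the pixel representation is the same assumption as in the earlier sketch):

    import random

    def squares_match_monte_carlo(a, b, tests=20):
        """Say "match" unless one of 20 randomly chosen pixel pairs differs.
        For unrelated squares, a false "match" occurs with probability about 2^(-20)."""
        n = len(a)
        for _ in range(tests):
            i, j = random.randrange(n), random.randrange(n)
            if a[i][j] != b[i][j]:
                return False   # definitely not a match
        return True            # probably a match; O(1) comparisons either way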
