2. Independence and Bernoulli Trials
(Euler, Ramanujan and Bernoulli Numbers)

Independence: Events A and B are independent if

$P(AB) = P(A)P(B).$   (2-1)

It is easy to show that A, B independent implies $A, \bar{B}$; $\bar{A}, B$; $\bar{A}, \bar{B}$ are all independent pairs. For example, $\bar{A}B \cup AB = (\bar{A} \cup A)B = B$ and $\bar{A}B \cap AB = \phi$, so that

$P(B) = P(\bar{A}B \cup AB) = P(\bar{A}B) + P(AB) = P(\bar{A}B) + P(A)P(B)$

or

$P(\bar{A}B) = P(B) - P(A)P(B) = (1 - P(A))P(B) = P(\bar{A})P(B),$

i.e., $\bar{A}$ and B are independent events.
As an application, let $A_p$ and $A_q$ represent the events

$A_p =$ "the prime p divides the number N"

and

$A_q =$ "the prime q divides the number N".

Then from (1-4)

$P\{A_p\} = \frac{1}{p}, \quad P\{A_q\} = \frac{1}{q}.$

Also

$P\{A_p A_q\} = P\{\text{"pq divides N"}\} = \frac{1}{pq} = P\{A_p\}\,P\{A_q\}.$   (2-2)

Hence it follows that $A_p$ and $A_q$ are independent events!
If P(A) = 0, then since the event $AB \subset A$ always, we have

$P(AB) \le P(A) = 0 \;\Rightarrow\; P(AB) = 0,$

and (2-1) is always satisfied. Thus the event of zero probability is independent of every other event!

Independent events obviously cannot be mutually exclusive, since $P(A) > 0$, $P(B) > 0$ and A, B independent implies $P(AB) > 0$. Thus if A and B are independent, the event AB cannot be the null set.

More generally, a family of events $\{A_i\}$ are said to be independent, if for every finite subcollection $A_{i_1}, A_{i_2}, \cdots, A_{i_n}$ we have

$P\left(\bigcap_{k=1}^{n} A_{i_k}\right) = \prod_{k=1}^{n} P(A_{i_k}).$   (2-3)
Let

$A = A_1 \cup A_2 \cup A_3 \cup \cdots \cup A_n,$   (2-4)

a union of n independent events. Then by De Morgan's law

$\bar{A} = \bar{A}_1 \bar{A}_2 \cdots \bar{A}_n$

and using their independence

$P(\bar{A}) = P(\bar{A}_1 \bar{A}_2 \cdots \bar{A}_n) = \prod_{i=1}^{n} P(\bar{A}_i) = \prod_{i=1}^{n} (1 - P(A_i)).$   (2-5)

Thus for any A as in (2-4)

$P(A) = 1 - P(\bar{A}) = 1 - \prod_{i=1}^{n} (1 - P(A_i)),$   (2-6)

a useful result.

We can use these results to solve an interesting number theory problem.
Example 2.1: Two integers M and N are chosen at random. What is the probability that they are relatively prime to each other?

Solution: Since M and N are chosen at random, whether p divides M or not does not depend on the other number N. Thus we have

$P\{\text{"p divides both M and N"}\} = P\{\text{"p divides M"}\}\, P\{\text{"p divides N"}\} = \frac{1}{p^2},$

where we have used (1-4). Also from (1-10)

$P\{\text{"p does not divide both M and N"}\} = 1 - P\{\text{"p divides both M and N"}\} = 1 - \frac{1}{p^2}.$

Observe that M and N are relatively prime if and only if there exists no prime p that divides both M and N.
Hence

"M and N are relatively prime" $= \bar{X}_2 \bar{X}_3 \bar{X}_5 \cdots,$

where $X_p$ represents the event

$X_p =$ "p divides both M and N".

Hence using (2-2) and (2-5)

$P\{\text{"M and N are relatively prime"}\} = \prod_{p\,\text{prime}} (1 - P(X_p)) = \prod_{p\,\text{prime}} \left(1 - \frac{1}{p^2}\right) = \frac{1}{\sum_{k=1}^{\infty} 1/k^2} = \frac{1}{\pi^2/6} = \frac{6}{\pi^2} \approx 0.6079,$

where we have used Euler's identity¹

$\sum_{k=1}^{\infty} \frac{1}{k^s} = \prod_{p\,\text{prime}} \left(1 - \frac{1}{p^s}\right)^{-1}.$

¹ See the Appendix for a proof of Euler's identity by Ramanujan.

The same argument can be used to compute the probability that an integer chosen at random is square free. Since the event

"An integer chosen at random is square free" $= \bigcap_{p\,\text{prime}} \{\text{"}p^2 \text{ does not divide } N\text{"}\},$

using (2-5) we have

$P\{\text{"An integer chosen at random is square free"}\} = \prod_{p\,\text{prime}} P\{p^2 \text{ does not divide } N\} = \prod_{p\,\text{prime}} \left(1 - \frac{1}{p^2}\right) = \frac{1}{\pi^2/6} = \frac{6}{\pi^2}.$
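Both of these results are easy to examine numerically. The short Python sketch below is only an illustration (the prime cutoff, the ranges N and M, and the helper names are arbitrary choices, not part of the derivation); it compares a truncated Euler product, a direct count of coprime pairs, and a sieve count of square-free integers against $6/\pi^2 \approx 0.6079$.

    from math import gcd, pi, isqrt

    def primes_up_to(n):
        """Simple sieve of Eratosthenes."""
        flags = [True] * (n + 1)
        flags[0] = flags[1] = False
        for p in range(2, isqrt(n) + 1):
            if flags[p]:
                flags[p*p::p] = [False] * len(flags[p*p::p])
        return [p for p, is_prime in enumerate(flags) if is_prime]

    # Truncated Euler product  prod_p (1 - 1/p^2)  ->  6/pi^2
    prod = 1.0
    for p in primes_up_to(10_000):
        prod *= 1.0 - 1.0 / (p * p)
    print(prod, 6 / pi**2)            # both approximately 0.6079

    # Direct count of coprime pairs (m, n) with 1 <= m, n <= N
    N = 1_000
    coprime_pairs = sum(1 for m in range(1, N + 1)
                          for n in range(1, N + 1) if gcd(m, n) == 1)
    print(coprime_pairs / N**2)       # close to 6/pi^2

    # Fraction of square-free integers up to M, sieving out multiples of p^2
    M = 100_000
    square_free = [True] * (M + 1)
    for p in primes_up_to(isqrt(M)):
        square_free[p*p::p*p] = [False] * len(square_free[p*p::p*p])
    print(sum(square_free[1:]) / M)   # again close to 6/pi^2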
Note: To add an interesting twist to the square free number problem, Ramanujan has shown through elementary but clever arguments that the inverses of the nth powers of all square free numbers add to $S_n / S_{2n}$, where (see (2-E))

$S_n = \sum_{k=1}^{\infty} \frac{1}{k^n}.$

Thus the sum of the inverses of the squares of the square free numbers is given by

$1 + \frac{1}{2^2} + \frac{1}{3^2} + \frac{1}{5^2} + \frac{1}{6^2} + \frac{1}{7^2} + \frac{1}{10^2} + \frac{1}{11^2} + \frac{1}{13^2} + \frac{1}{14^2} + \cdots = \frac{S_2}{S_4} = \frac{\pi^2/6}{\pi^4/90} = \frac{15}{\pi^2} \approx 1.5198.$
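As a quick numerical check of this claim (an illustrative sketch; the truncation at 50,000 is an arbitrary choice), one can sum $1/k^2$ over the square-free integers directly:

    from math import pi

    def is_square_free(n):
        """True if no perfect square larger than 1 divides n."""
        d = 2
        while d * d <= n:
            if n % (d * d) == 0:
                return False
            d += 1
        return True

    partial = sum(1.0 / k**2 for k in range(1, 50_000) if is_square_free(k))
    print(partial)          # about 1.5198 (the tail beyond 50,000 is below 2e-5)
    print(15 / pi**2)       # 1.5198...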
Example 2.2: Three switches connected in parallel operate independently. Each switch remains closed with probability p. (a) Find the probability of receiving an input signal at the output. (b) Find the probability that switch $S_1$ is open given that an input signal is received at the output.

[Fig. 2.1: switches $s_1$, $s_2$, $s_3$ connected in parallel between the input and the output.]

Solution: a. Let $A_i =$ "Switch $S_i$ is closed." Then $P(A_i) = p$, $i = 1, 2, 3$. Since the switches operate independently, we have

$P(A_i A_j) = P(A_i)P(A_j); \quad P(A_1 A_2 A_3) = P(A_1)P(A_2)P(A_3).$
Let R = "input signal is received at the output." For the event R to occur either switch 1 or switch 2 or switch 3 must remain closed, i.e.,

$R = A_1 \cup A_2 \cup A_3.$   (2-7)

Using (2-3)-(2-6),

$P(R) = P(A_1 \cup A_2 \cup A_3) = 1 - (1 - p)^3 = 3p - 3p^2 + p^3.$   (2-8)

We can also derive (2-8) in a different manner. Since any event and its complement form a trivial partition, we can always write

$P(R) = P(R \mid A_1)P(A_1) + P(R \mid \bar{A}_1)P(\bar{A}_1).$   (2-9)

But $P(R \mid A_1) = 1$ and $P(R \mid \bar{A}_1) = P(A_2 \cup A_3) = 2p - p^2$, and using these in (2-9) we obtain

$P(R) = p + (2p - p^2)(1 - p) = 3p - 3p^2 + p^3,$   (2-10)

which agrees with (2-8).
Note that the events $A_1$, $A_2$, $A_3$ do not form a partition, since they are not mutually exclusive. Obviously any two or all three switches can be closed (or open) simultaneously. Moreover, $P(A_1) + P(A_2) + P(A_3) \ne 1$.

b. We need $P(\bar{A}_1 \mid R)$. From Bayes' theorem

$P(\bar{A}_1 \mid R) = \frac{P(R \mid \bar{A}_1)P(\bar{A}_1)}{P(R)} = \frac{(2p - p^2)(1 - p)}{3p - 3p^2 + p^3} = \frac{2p - 3p^2 + p^3}{3p - 3p^2 + p^3}.$   (2-11)

Because of the symmetry of the switches, we also have

$P(\bar{A}_1 \mid R) = P(\bar{A}_2 \mid R) = P(\bar{A}_3 \mid R).$
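Since there are only $2^3 = 8$ open/closed configurations, (2-8) and (2-11) can also be double-checked by brute-force enumeration. The sketch below is illustrative only (the function name, the 0/1 encoding of open/closed, and the sample value p = 0.4 are arbitrary choices):

    from itertools import product

    def switch_example(p):
        """Enumerate the 2^3 switch states of Example 2.2 and return
        P(R) and P(S1 open | R)."""
        q = 1.0 - p
        P_R = 0.0              # input signal reaches the output
        P_R_and_A1bar = 0.0    # ... and switch S1 is open
        for s1, s2, s3 in product([0, 1], repeat=3):   # 1 = closed, 0 = open
            prob = (p if s1 else q) * (p if s2 else q) * (p if s3 else q)
            if s1 or s2 or s3:                         # R = A1 U A2 U A3
                P_R += prob
                if not s1:
                    P_R_and_A1bar += prob
        return P_R, P_R_and_A1bar / P_R

    p = 0.4
    P_R, P_A1bar_given_R = switch_example(p)
    print(P_R, 3*p - 3*p**2 + p**3)                        # matches (2-8)
    print(P_A1bar_given_R,
          (2*p - 3*p**2 + p**3) / (3*p - 3*p**2 + p**3))   # matches (2-11)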
Repeated Trials

Consider two independent experiments with associated probability models $(\Omega_1, F_1, P_1)$ and $(\Omega_2, F_2, P_2)$. Let $\xi \in \Omega_1$, $\eta \in \Omega_2$ represent elementary events. A joint performance of the two experiments produces an elementary event $\omega = (\xi, \eta)$. How to characterize an appropriate probability to this combined event?

Towards this, consider the Cartesian product space $\Omega = \Omega_1 \times \Omega_2$ generated from $\Omega_1$ and $\Omega_2$ such that if $\xi \in \Omega_1$ and $\eta \in \Omega_2$, then every $\omega$ in $\Omega$ is an ordered pair of the form $\omega = (\xi, \eta)$. To arrive at a probability model we need to define the combined trio $(\Omega, F, P)$.
Suppose $A \in F_1$ and $B \in F_2$. Then $A \times B$ is the set of all pairs $(\xi, \eta)$, where $\xi \in A$ and $\eta \in B$. Any such subset of $\Omega$ appears to be a legitimate event for the combined experiment. Let F denote the field composed of all such subsets $A \times B$ together with their unions and complements. In this combined experiment, the probabilities of the events $A \times \Omega_2$ and $\Omega_1 \times B$ are such that

$P(A \times \Omega_2) = P_1(A), \quad P(\Omega_1 \times B) = P_2(B).$   (2-12)

Moreover, the events $A \times \Omega_2$ and $\Omega_1 \times B$ are independent for any $A \in F_1$ and $B \in F_2$. Since

$(A \times \Omega_2) \cap (\Omega_1 \times B) = A \times B,$   (2-13)

we conclude using (2-12) that
$P(A \times B) = P(A \times \Omega_2)\, P(\Omega_1 \times B) = P_1(A)\, P_2(B)$   (2-14)

for all $A \in F_1$ and $B \in F_2$. The assignment in (2-14) extends to a unique probability measure $P(\cdot) = P_1(\cdot) \times P_2(\cdot)$ on the sets in F and defines the combined trio $(\Omega, F, P)$.

Generalization: Given n experiments $\Omega_1, \Omega_2, \cdots, \Omega_n$ and their associated $F_i$ and $P_i$, $i = 1, 2, \cdots, n$, let

$\Omega = \Omega_1 \times \Omega_2 \times \cdots \times \Omega_n$   (2-15)

represent their Cartesian product whose elementary events are the ordered n-tuples $\xi_1, \xi_2, \cdots, \xi_n$, where $\xi_i \in \Omega_i$. Events in this combined space are of the form

$A_1 \times A_2 \times \cdots \times A_n,$   (2-16)

where $A_i \in F_i$, and their unions and intersections.
If all these n experiments are independent, and $P_i(A_i)$ is the probability of the event $A_i$ in $F_i$, then as before

$P(A_1 \times A_2 \times \cdots \times A_n) = P_1(A_1)\, P_2(A_2) \cdots P_n(A_n).$   (2-17)

Example 2.3: An event A has probability p of occurring in a single trial. Find the probability that A occurs exactly k times, $k \le n$, in n trials.

Solution: Let $(\Omega, F, P)$ be the probability model for a single trial. The outcome of n experiments is an n-tuple

$\omega = \{\xi_1, \xi_2, \cdots, \xi_n\} \in \Omega_0,$   (2-18)

where every $\xi_i \in \Omega$ and $\Omega_0 = \Omega \times \Omega \times \cdots \times \Omega$ as in (2-15). The event A occurs at trial # i, if $\xi_i \in A$. Suppose A occurs exactly k times in $\omega$.
Then k of the $\xi_i$ belong to A, say $\xi_{i_1}, \xi_{i_2}, \cdots, \xi_{i_k}$, and the remaining $n - k$ are contained in its complement $\bar{A}$. Using (2-17), the probability of occurrence of such an $\omega$ is given by

$P_0(\omega) = P(\{\xi_1, \xi_2, \cdots, \xi_n\}) = P(\{\xi_{i_1}\})\, P(\{\xi_{i_2}\}) \cdots P(\{\xi_{i_n}\}) = \underbrace{P(A)P(A)\cdots P(A)}_{k}\,\underbrace{P(\bar{A})P(\bar{A})\cdots P(\bar{A})}_{n-k} = p^k q^{n-k}.$   (2-19)

However the k occurrences of A can occur in any particular location inside $\omega$. Let $\omega_1, \omega_2, \cdots, \omega_N$ represent all such events in which A occurs exactly k times. Then

"A occurs exactly k times in n trials" $= \omega_1 \cup \omega_2 \cup \cdots \cup \omega_N.$   (2-20)

But all these $\omega_i$'s are mutually exclusive, and equiprobable.
Thus

$P(\text{"A occurs exactly k times in n trials"}) = \sum_{i=1}^{N} P_0(\omega_i) = N\, P_0(\omega) = N\, p^k q^{n-k},$   (2-21)

where we have used (2-19). Recall that, starting with n possible choices, the first object can be chosen n different ways, and for every such choice the second one in $(n-1)$ ways, ..., and the kth one in $(n-k+1)$ ways, and this gives the total number of choices for k objects out of n to be $n(n-1)\cdots(n-k+1)$. But this includes the $k!$ choices among the k objects that are indistinguishable for identical objects. As a result

$N = \frac{n(n-1)\cdots(n-k+1)}{k!} = \frac{n!}{(n-k)!\,k!} \triangleq \binom{n}{k}$   (2-22)
represents the number of combinations, or choices of n identical objects taken k at a time. Using (2-22) in (2-21), we get

$P_n(k) = P(\text{"A occurs exactly k times in n trials"}) = \binom{n}{k} p^k q^{n-k}, \quad k = 0, 1, 2, \cdots, n,$   (2-23)

a formula due to Bernoulli.

Independent repeated experiments of this nature, where the outcome is either a success (= A) or a failure (= $\bar{A}$), are characterized as Bernoulli trials, and the probability of k successes in n trials is given by (2-23), where p represents the probability of success in any one trial.
Example 2.4: Toss a coin n times. Obtain the probability of getting k heads in n trials.

Solution: We may identify "head" with "success" (A) and let $p = P(H)$. In that case (2-23) gives the desired probability.

Example 2.5: Consider rolling a fair die eight times. Find the probability that either 3 or 4 shows up five times.

Solution: In this case we can identify

A = "success" = {either 3 or 4} $= \{f_3\} \cup \{f_4\}.$

Thus

$P(A) = P(f_3) + P(f_4) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3},$

and the desired probability is given by (2-23) with $n = 8$, $k = 5$ and $p = 1/3$. Notice that this is similar to a biased coin problem.
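Formula (2-23) translates directly into a few lines of Python (an illustrative sketch; the helper name P_n simply mirrors the notation above). Applied to the coin of Example 2.4 and the die of Example 2.5:

    from math import comb

    def P_n(k, n, p):
        """Probability (2-23) of exactly k successes in n Bernoulli trials."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    # Example 2.4: k = 5 heads in n = 10 tosses of a fair coin
    print(P_n(5, 10, 0.5))      # C(10,5)/2^10 = 0.2461...

    # Example 2.5: "3 or 4" exactly five times in eight rolls of a fair die
    print(P_n(5, 8, 1/3))       # C(8,5)(1/3)^5(2/3)^3 = 448/6561 = 0.0683...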
Bernoulli trial: consists of repeated independent and identical experiments, each of which has only two outcomes A or $\bar{A}$, with $P(A) = p$ and $P(\bar{A}) = q$. The probability of exactly k occurrences of A in n such trials is given by (2-23).

Let

$X_k =$ "exactly k occurrences of A in n trials", $\quad k = 0, 1, 2, \cdots, n.$   (2-24)

Since the number of occurrences of A in n trials must be an integer, either $X_0$ or $X_1$ or $X_2$ or $\cdots$ or $X_n$ must occur in such an experiment. Thus

$P(X_0 \cup X_1 \cup \cdots \cup X_n) = 1.$   (2-25)

But $X_i$, $X_j$ are mutually exclusive. Thus
$P(X_0 \cup X_1 \cup \cdots \cup X_n) = \sum_{k=0}^{n} P(X_k) = \sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k}.$   (2-26)

From the relation

$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k},$   (2-27)

(2-26) equals $(p + q)^n = 1$, and it agrees with (2-25).

For a given n and p, what is the most likely value of k? From Fig. 2.2, the most probable value of k is that number which maximizes $P_n(k)$ in (2-23).

[Fig. 2.2: plot of $P_n(k)$ versus k for $n = 12$, $p = 1/2$.]

To obtain this value, consider the ratio
$\frac{P_n(k)}{P_n(k-1)} = \frac{n!\,p^k q^{n-k}/[(n-k)!\,k!]}{n!\,p^{k-1} q^{n-k+1}/[(n-k+1)!\,(k-1)!]} = \frac{n-k+1}{k}\cdot\frac{p}{q}.$   (2-28)

Thus $P_n(k) \ge P_n(k-1)$ if $k(1-p) \le (n-k+1)p$, or $k \le (n+1)p$. Thus $P_n(k)$ as a function of k increases until

$k = (n+1)p$   (2-29)

if it is an integer, or $k_{\max}$, the largest integer less than $(n+1)p$, and (2-29) represents the most likely number of successes (or heads) in n trials.

Example 2.6: In a Bernoulli experiment with n trials, find the probability that the number of occurrences of A is between $k_1$ and $k_2$.
Solution: With $X_i$, $i = 0, 1, 2, \cdots, n$, as defined in (2-24), clearly they are mutually exclusive events. Thus

$P(\text{"Occurrences of A is between } k_1 \text{ and } k_2\text{"}) = P(X_{k_1} \cup X_{k_1+1} \cup \cdots \cup X_{k_2}) = \sum_{k=k_1}^{k_2} P(X_k) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k}.$   (2-30)
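For moderate n, (2-29) and (2-30) can be evaluated directly. The sketch below is illustrative (the function names are arbitrary, and the sample values n = 12, p = 1/2 are chosen to match Fig. 2.2); exact rational arithmetic avoids any round-off in the binomial sum.

    from fractions import Fraction
    from math import comb

    def prob_between(k1, k2, n, p):
        """(2-30): probability that the number of successes lies between k1 and k2."""
        p = Fraction(p)
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k1, k2 + 1))

    def most_likely_k(n, p):
        """(2-29): the most likely number of successes, (n+1)p rounded down."""
        return int((n + 1) * Fraction(p))

    n, p = 12, Fraction(1, 2)
    print(most_likely_k(n, p))                # 6, the peak visible in Fig. 2.2
    print(float(prob_between(4, 8, n, p)))    # P{4 <= k <= 8} for n = 12, p = 1/2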
Example 2.7: Suppose 5,000 components are ordered. The probability that a part is defective equals 0.1. What is the probability that the total number of defective parts does not exceed 400?

Solution: Let

$Y_k =$ "k parts are defective among 5,000 components".

Using (2-30), the desired probability is given by

$P(Y_0 \cup Y_1 \cup \cdots \cup Y_{400}) = \sum_{k=0}^{400} P(Y_k) = \sum_{k=0}^{400} \binom{5000}{k} (0.1)^k (0.9)^{5000-k}.$   (2-31)

Equation (2-31) has too many terms to compute. Clearly, we need a technique to compute the above term in a more efficient manner.

From (2-29), the most likely number of successes $k_{\max}$ in n trials satisfies

$(n+1)p - 1 \le k_{\max} \le (n+1)p$   (2-32)

or

$p - \frac{q}{n} \le \frac{k_{\max}}{n} \le p + \frac{p}{n},$   (2-33)
so that

$\lim_{n \to \infty} \frac{k_{\max}}{n} = p.$   (2-34)

From (2-34), as $n \to \infty$, the ratio of the most probable number of successes (A) to the total number of trials in a Bernoulli experiment tends to p, the probability of occurrence of A in a single trial. Notice that (2-34) connects the results of an actual experiment ($k_{\max}/n$) to the axiomatic definition of p. In this context, it is possible to obtain a more general result as follows:

Bernoulli's theorem: Let A denote an event whose probability of occurrence in a single trial is p. If k denotes the number of occurrences of A in n independent trials, then

$P\left\{\left|\frac{k}{n} - p\right| \ge \epsilon\right\} \le \frac{pq}{n\epsilon^2}.$   (2-35)
Equation (2-35) states that the frequency definition of probability of an event ($k/n$) and its axiomatic definition (p) can be made compatible to any degree of accuracy.

Proof: To prove Bernoulli's theorem, we need two identities. Note that with $P_n(k)$ as in (2-23), direct computation gives

$\sum_{k=0}^{n} k\,P_n(k) = \sum_{k=1}^{n} k\,\frac{n!}{(n-k)!\,k!}\, p^k q^{n-k} = \sum_{k=1}^{n} \frac{n!}{(n-k)!\,(k-1)!}\, p^k q^{n-k} = np \sum_{i=0}^{n-1} \frac{(n-1)!}{(n-1-i)!\,i!}\, p^i q^{n-1-i} = np\,(p+q)^{n-1} = np.$   (2-36)

Proceeding in a similar manner, it can be shown that

$\sum_{k=0}^{n} k^2 P_n(k) = \sum_{k=1}^{n} k\,\frac{n!}{(n-k)!\,(k-1)!}\, p^k q^{n-k} = \sum_{k=2}^{n} \frac{n!}{(n-k)!\,(k-2)!}\, p^k q^{n-k} + \sum_{k=1}^{n} \frac{n!}{(n-k)!\,(k-1)!}\, p^k q^{n-k} = n(n-1)p^2 + np = n^2 p^2 + npq.$   (2-37)
Returning to (2-35), note that

$\left|\frac{k}{n} - p\right| \ge \epsilon$ is equivalent to $(k - np)^2 \ge n^2\epsilon^2,$   (2-38)

which in turn is equivalent to

$\sum_{k=0}^{n} (k - np)^2 P_n(k) \ge n^2\epsilon^2 \sum_{|k-np| \ge n\epsilon} P_n(k) = n^2\epsilon^2\, P\left\{\left|\frac{k}{n} - p\right| \ge \epsilon\right\}.$   (2-39)

Using (2-36)-(2-37), the left side of (2-39) can be expanded to give

$\sum_{k=0}^{n} (k - np)^2 P_n(k) = \sum_{k=0}^{n} k^2 P_n(k) - 2np \sum_{k=0}^{n} k\,P_n(k) + n^2 p^2 = n^2 p^2 + npq - 2np \cdot np + n^2 p^2 = npq.$   (2-40)

Alternatively, the left side of (2-39) can be expressed as

$\sum_{k=0}^{n} (k - np)^2 P_n(k) = \sum_{|k-np| < n\epsilon} (k - np)^2 P_n(k) + \sum_{|k-np| \ge n\epsilon} (k - np)^2 P_n(k) \ge \sum_{|k-np| \ge n\epsilon} (k - np)^2 P_n(k) \ge n^2\epsilon^2 \sum_{|k-np| \ge n\epsilon} P_n(k) = n^2\epsilon^2\, P\{|k - np| \ge n\epsilon\}.$   (2-41)
28
Using (2-40) in (2-41), we get the desired result

Note that for a given can be made arbitrarily
small by letting n become large. Thus for very large n, we
can make the fractional occurrence (relative frequency)
of the event A as close to the actual probability p of the
event A in a single trial. Thus the theorem states that the
probability of event A from the axiomatic framework can be
computed from the relative frequency definition quite
accurately, provided the number of experiments are large
enough. Since is the most likely value of k in n trials,
from the above discussion, as the plots of tends
to concentrate more and more around in (2-32).
.
2
c
c
n
pq
p
n
k
P <
|
|
.
|

\
|
)
`

>
(2-42)
2
/ , 0 c c n pq >
n
k
max
k
, n
) (k P
n
max
k
PILLAI
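The bound (2-42) is easy to see in simulation. The sketch below is illustrative only (p = 0.3, ε = 0.05, the run count and the seed are arbitrary choices): the observed fraction of runs with $|k/n - p| \ge \epsilon$ stays below $pq/(n\epsilon^2)$, and shrinks as n grows.

    import random

    def deviation_frequency(n, p, eps, runs=500, seed=1):
        """Estimate P{|k/n - p| >= eps} for n Bernoulli(p) trials and
        return it together with the bound pq/(n eps^2) of (2-42)."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(runs):
            k = sum(rng.random() < p for _ in range(n))   # one run of n trials
            if abs(k / n - p) >= eps:
                hits += 1
        bound = p * (1 - p) / (n * eps**2)
        return hits / runs, bound

    for n in (100, 1000, 10000):
        print(n, deviation_frequency(n, p=0.3, eps=0.05))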
Next we present an example that illustrates the usefulness of simple textbook examples to practical problems of interest.

Example 2.8: Day-trading strategy: A box contains n randomly numbered balls (not 1 through n but arbitrary numbers, including numbers greater than n). Suppose a fraction of those balls, say $m = np$, $p < 1$, are initially drawn one by one with replacement while noting the numbers on those balls. The drawing is allowed to continue until a ball is drawn with a number larger than the first m numbers. Determine the fraction p to be initially drawn, so as to maximize the probability of drawing the largest among the n numbers using this strategy.

Solution: Let
$X_k =$ "the $(k+1)$st drawn ball has the largest number among all n balls, and the largest among the first k balls is in the group of the first m balls", $k > m$.   (2-43)

Note that $X_k$ is of the form $A \cap B$, where

A = "the largest among the first k balls is in the group of the first m balls drawn"

and

B = "the $(k+1)$st ball has the largest number among all n balls".

Notice that A and B are independent events, and hence

$P(X_k) = P(A)\,P(B) = \frac{m}{k}\cdot\frac{1}{n} = \frac{np}{nk} = \frac{p}{k},$   (2-44)

where $m = np$ represents the fraction of balls to be initially drawn. This gives

P(selected ball has the largest number among all balls)

$= \sum_{k=m}^{n-1} P(X_k) = \sum_{k=m}^{n-1} \frac{p}{k} \approx p \int_{np}^{n} \frac{1}{k}\,dk = p \ln\frac{n}{np} = p \ln\frac{1}{p} = -p \ln p.$   (2-45)
Maximization of the desired probability in (2-45) with respect to p gives

$\frac{d}{dp}(-p \ln p) = -(1 + \ln p) = 0$

or

$\ln p = -1, \quad p = e^{-1} \approx 0.3679.$   (2-46)

From (2-45), the maximum value for the desired probability of drawing the largest number equals $-p \ln p\,\big|_{p = e^{-1}} = e^{-1} \approx 0.3679$ as well.

Interestingly, the above strategy can be used to play the stock market.

Suppose one gets into the market and decides to stay up to 100 days. The stock values fluctuate day by day, and the important question is when to get out?

According to the above strategy, one should get out
at the first opportunity after 37 days, when the stock value exceeds the maximum among the first 37 days. In that case the probability of hitting the top value over 100 days for the stock is also about 37%. Of course, the above argument assumes that the stock values over the period of interest are randomly fluctuating without exhibiting any other trend. Interestingly, such is the case if we consider shorter time frames such as intra-day trading.

In summary, if one must day-trade, then a possible strategy might be to get in at 9.30 AM, and get out any time after 12 noon (9.30 AM + 0.3679 × 6.5 hrs = 11.54 AM, to be precise) at the first peak that exceeds the peak value between 9.30 AM and 12 noon. In that case chances are about 37% that one hits the absolute top value for that day! (Disclaimer: trade at your own risk.)
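The optimum in (2-46) can also be seen by simulation. In the sketch below the drawing is modeled as scanning a random ordering of n distinct numbers (a modeling choice made for the simulation; the run count and the three trial fractions are likewise arbitrary); the empirical success rate peaks near p = 1/e ≈ 0.37, as (2-45)-(2-46) predict.

    import random

    def strategy_success_rate(n, p, runs=20000, seed=1):
        """Fraction of runs in which the rule 'skip the first m = np draws,
        then stop at the first number exceeding all of them' stops on the
        overall largest number."""
        rng = random.Random(seed)
        m = int(n * p)
        wins = 0
        for _ in range(runs):
            balls = list(range(n))
            rng.shuffle(balls)                # order in which the balls appear
            best_seen = max(balls[:m])        # largest among the first m draws
            for x in balls[m:]:
                if x > best_seen:             # first later draw beating the first m
                    wins += (x == n - 1)      # did we stop on the overall maximum?
                    break
        return wins / runs

    for p in (0.2, 0.3679, 0.6):
        print(p, strategy_success_rate(n=100, p=p))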
We conclude this lecture with a variation of the game of craps discussed in Example 3-16, Text.

Example 2.9: Game of craps using biased dice: From Example 3-16, Text, the probability of winning the game of craps is 0.492929 for the player. Thus the game is slightly advantageous to the house. This conclusion of course assumes that the two dice in question are perfect cubes. Suppose that is not the case.

Let us assume that the two dice are slightly loaded in such a manner that the faces 1, 2 and 3 appear with probability $\frac{1}{6} - \epsilon$ and the faces 4, 5 and 6 appear with probability $\frac{1}{6} + \epsilon$, $\epsilon > 0$, for each die. If T represents the combined total for the two dice (following Text notation), we get
$p_4 = P\{T = 4\} = P\{(1,3), (2,2), (3,1)\} = 3\left(\frac{1}{6} - \epsilon\right)^2$

$p_5 = P\{T = 5\} = P\{(1,4), (2,3), (3,2), (4,1)\} = 2\left(\frac{1}{36} - \epsilon^2\right) + 2\left(\frac{1}{6} - \epsilon\right)^2$

$p_6 = P\{T = 6\} = P\{(1,5), (2,4), (3,3), (4,2), (5,1)\} = 4\left(\frac{1}{36} - \epsilon^2\right) + \left(\frac{1}{6} - \epsilon\right)^2$

$p_7 = P\{T = 7\} = P\{(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)\} = 6\left(\frac{1}{36} - \epsilon^2\right)$

$p_8 = P\{T = 8\} = P\{(2,6), (3,5), (4,4), (5,3), (6,2)\} = 4\left(\frac{1}{36} - \epsilon^2\right) + \left(\frac{1}{6} + \epsilon\right)^2$

$p_9 = P\{T = 9\} = P\{(3,6), (4,5), (5,4), (6,3)\} = 2\left(\frac{1}{36} - \epsilon^2\right) + 2\left(\frac{1}{6} + \epsilon\right)^2$

$p_{10} = P\{T = 10\} = P\{(4,6), (5,5), (6,4)\} = 3\left(\frac{1}{6} + \epsilon\right)^2$

$p_{11} = P\{T = 11\} = P\{(5,6), (6,5)\} = 2\left(\frac{1}{6} + \epsilon\right)^2.$

(Note that (1,3) above represents the event "the first die shows face 1 and the second die shows face 3," etc.)

For $\epsilon = 0.01$, we get the following Table:
T = k            4        5        6        7        8        9        10       11
p_k = P{T = k}   0.0736   0.1044   0.1353   0.1661   0.1419   0.1178   0.0936   0.0624

This gives the probability of a win on the first throw to be (use (3-56), Text)

$P_1 = P(T = 7) + P(T = 11) = 0.2285,$   (2-47)

and the probability of a win by throwing a carry-over to be (use (3-58)-(3-59), Text)

$P_2 = \sum_{k=4,\,k \ne 7}^{10} \frac{p_k^2}{p_k + p_7} = 0.2717.$   (2-48)

Thus

$P\{\text{winning the game}\} = P_1 + P_2 = 0.5002.$   (2-49)

Although perfect dice give rise to an unfavorable game,
a slight loading of the dice turns the fortunes around in favor of the player! (Not an exciting conclusion as far as the casinos are concerned.)

Even if we let the two dice have different loading factors $\epsilon_1$ and $\epsilon_2$ (for the situation described above), similar conclusions do follow. For example, $\epsilon_1 = 0.01$ and $\epsilon_2 = 0.005$ gives (show this)

$P\{\text{winning the game}\} = 0.5015.$   (2-50)

Once again the game is in favor of the player!

Although the advantage is very modest in each play, from Bernoulli's theorem the cumulative effect can be quite significant when a large number of games are played. All the more reason for the casinos to keep the dice in perfect shape.
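The table and (2-47)-(2-49) can be reproduced with a short computation. The sketch below is illustrative (the function name is arbitrary, and it implements the single-ε loading described above; other loadings can be explored by editing the face probabilities):

    def craps_win_probability(eps):
        """P{winning at craps} when faces 1-3 have probability 1/6 - eps and
        faces 4-6 have probability 1/6 + eps on each die (Example 2.9)."""
        face = {f: 1/6 - eps for f in (1, 2, 3)}
        face.update({f: 1/6 + eps for f in (4, 5, 6)})
        p = {t: 0.0 for t in range(2, 13)}      # distribution of the total T
        for i in range(1, 7):
            for j in range(1, 7):
                p[i + j] += face[i] * face[j]
        P1 = p[7] + p[11]                       # win on the first throw, (2-47)
        P2 = sum(p[k]**2 / (p[k] + p[7])        # win by carrying the point, (2-48)
                 for k in (4, 5, 6, 8, 9, 10))
        return P1, P2, P1 + P2

    print(craps_win_probability(0.0))    # fair dice: total 0.4929..., as quoted above
    print(craps_win_probability(0.01))   # loaded dice: (0.2285, 0.2717, 0.5002), cf. (2-49)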
In summary, small chance variations in each game
of craps can lead to significant counter-intuitive changes
when a large number of games are played. What appears
to be a favorable game for the house may indeed become
an unfavorable game, and when played repeatedly can lead
to unpleasant outcomes.
Appendix: Euler's Identity

S. Ramanujan in one of his early papers (J. of Indian Math. Soc.; V, 1913) starts with the clever observation that if $a_2, a_3, a_5, a_7, a_{11}, \cdots$ are numbers less than unity, where the subscripts $2, 3, 5, 7, 11, \cdots$ are the series of prime numbers, then¹

$\frac{1}{(1 - a_2)(1 - a_3)(1 - a_5)(1 - a_7)\cdots} = 1 + a_2 + a_3 + a_2 a_2 + a_5 + a_2 a_3 + a_7 + a_2 a_2 a_2 + a_3 a_3 + \cdots.$   (2-A)

Notice that the terms in (2-A) are arranged in such a way that the products obtained by multiplying the subscripts are the series of all natural numbers $2, 3, 4, 5, 6, 7, 8, 9, \cdots$. Clearly, (2-A) follows by observing that the natural numbers

¹ The relation (2-A) is ancient.
are formed by multiplying primes and their powers.

Ramanujan uses (2-A) to derive a variety of interesting identities, including the Euler identity that follows by letting $a_2 = 1/2^s,\ a_3 = 1/3^s,\ a_5 = 1/5^s, \cdots$ in (2-A). This gives the Euler identity

$\prod_{p\,\text{prime}} \left(1 - \frac{1}{p^s}\right)^{-1} = \sum_{n=1}^{\infty} 1/n^s.$   (2-B)

The sum on the right side in (2-B) can be related to the Bernoulli numbers (for s even).

Bernoulli numbers are positive rational numbers defined through the power series expansion of the even function $\frac{x}{2}\cot(x/2)$. Thus if we write

$\frac{x}{2}\cot(x/2) = 1 - B_1 \frac{x^2}{2!} - B_2 \frac{x^4}{4!} - B_3 \frac{x^6}{6!} - \cdots,$   (2-C)

then

$B_1 = \frac{1}{6},\ B_2 = \frac{1}{30},\ B_3 = \frac{1}{42},\ B_4 = \frac{1}{30},\ B_5 = \frac{5}{66},\ \cdots.$
By direct manipulation of (2-C) we also obtain

$\frac{x}{e^x - 1} = 1 - \frac{x}{2} + B_1 \frac{x^2}{2!} - B_2 \frac{x^4}{4!} + B_3 \frac{x^6}{6!} - \cdots,$   (2-D)

so that the Bernoulli numbers may be defined through (2-D) as well. Further

$B_n = 4n \int_0^{\infty} \frac{x^{2n-1}}{e^{2\pi x} - 1}\,dx = 4n \int_0^{\infty} x^{2n-1}\left(e^{-2\pi x} + e^{-4\pi x} + \cdots\right) dx = \frac{2\,(2n)!}{(2\pi)^{2n}}\left(\frac{1}{1^{2n}} + \frac{1}{2^{2n}} + \frac{1}{3^{2n}} + \frac{1}{4^{2n}} + \cdots\right),$

which gives

$\sum_{k=1}^{\infty} 1/k^2 = \frac{\pi^2}{6}; \quad \sum_{k=1}^{\infty} 1/k^4 = \frac{\pi^4}{90},$ etc.

Thus¹

$S_{2n} = \sum_{k=1}^{\infty} 1/k^{2n} = \frac{(2\pi)^{2n} B_n}{2\,(2n)!}.$   (2-E)

¹ The series $\sum_{k=1}^{\infty} 1/k^2$ can be summed using the Fourier series expansion of a periodic ramp signal as well.
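A numerical check of (2-E) is immediate (an illustrative sketch; it uses only the five Bernoulli numbers quoted after (2-C), and the truncation point of the partial sums is arbitrary):

    from math import factorial, pi

    B = {1: 1/6, 2: 1/30, 3: 1/42, 4: 1/30, 5: 5/66}   # B_1 ... B_5 as listed above

    def S(two_n, terms=200_000):
        """Partial sum of sum_{k>=1} 1/k^(2n)."""
        return sum(1.0 / k**two_n for k in range(1, terms))

    for n in B:
        closed_form = (2 * pi)**(2 * n) * B[n] / (2 * factorial(2 * n))   # (2-E)
        print(2 * n, closed_form, S(2 * n))    # e.g. 2n = 2 gives pi^2/6 = 1.6449...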