Probability Distributions
UNDERSTANDING PROBABILITIES
EXPERIMENT A process that can lead to more than one outcome.
RANDOM EXPERIMENT If an experiment, when repeated under identical conditions, does not produce the
same outcome every time, but the outcome of any trial is one of several possible outcomes, then it is
called a random experiment or a probabilistic experiment.
ELEMENTARY EVENT If a random experiment is performed, then each of its outcomes is known as an
elementary event.
SAMPLE SPACE The set of all possible outcomes of a random experiment is called the sample space
associated with it.
EVENT A subset of the sample space associated with a random experiment is called an event.
OCCURRENCE OF AN EVENT An event associated with a random experiment is said to occur if any one of
the elementary events belonging to it is the outcome of the experiment.
Corresponding to every event A associated with a random experiment, we define an event "not A", denoted by Ā,
which occurs if and only if A does not occur.
CERTAIN (OR SURE) EVENT An event associated with a random experiment is called a certain event if it
always occurs whenever the experiment is performed.
IMPOSSIBLE EVENT An event associated with a random experiment is called an impossible event if it never
occurs whenever the experiment is performed.
COMPOUND EVENT An event associated with a random experiment is a compound event, if it is the disjoint
union of two or more elementary events.
MUTUALLY EXCLUSIVE EVENTS Two or more events associated with a random experiment are said to
be mutually exclusive or incompatible events if the occurrence of any one of them prevents the occurrence of all
the others, i.e. if no two of them can occur simultaneously in the same trial.
EXHAUSTIVE EVENTS Two or more events associated with a random experiment are exhaustive if their
union is the sample space.
FAVOURABLE ELEMENTARY EVENTS Let S be the sample space associated with a random experiment
and A be an event associated with the experiment. Then, elementary events belonging to A are known as
favourable elementary events to the event A.
Thus, an elementary event E is favourable to an event A if the occurrence of E ensures the occurrence of
event A.
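As a small illustrative sketch of these terms, assuming (purely for illustration) the experiment of rolling a single die once:

# Illustrative sketch: one roll of a die (an assumed example, not from the definitions above)
S = {1, 2, 3, 4, 5, 6}       # sample space; each outcome is an elementary event
A = {2, 4, 6}                # event "an even number is rolled" (a compound event)
not_A = S - A                # the event "not A" (Ā) = {1, 3, 5}
print(A.isdisjoint(not_A))   # True: A and Ā are mutually exclusive
print((A | not_A) == S)      # True: A and Ā are exhaustive
print(sorted(A))             # 2, 4 and 6 are the elementary events favourable to A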
RANDOM VARIABLE
Example 1:
For instance, having assigned the probability 1/36 to each element of the sample space of
Figure 1, we immediately find that the random variable X, the total rolled with the pair of
dice, takes on the value 9 with probability 4/36; as described in Section 1, the event X = 9 contains
four of the equally likely elements of the sample space.
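As a quick check of this probability, the 36 equally likely outcomes can be enumerated directly; a minimal sketch in Python:

# Enumerate the 36 equally likely outcomes of rolling two dice and count
# those whose total is 9 (a direct check of the 4/36 quoted above).
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))    # 36 ordered pairs
favourable = [o for o in outcomes if sum(o) == 9]  # (3,6), (4,5), (5,4), (6,3)
print(favourable, Fraction(len(favourable), len(outcomes)))  # 4 outcomes, 4/36 = 1/9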
Figure 1: Rolling two dice at the same time.
Rather than tabulating, it is usually preferable to give a formula, that
is, to express the probabilities by means of a function whose
values, f(x), equal P(X = x) for each x within the range of the random variable X.
For instance, for the total rolled with a pair of dice we could write
f(x) = (6 − |x − 7|)/36 for x = 2, 3, ..., 12.
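As a quick check of this formula against the count above: f(9) = (6 − |9 − 7|)/36 = (6 − 2)/36 = 4/36, in agreement with the four favourable outcomes found earlier.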
Task:
Find the distribution function of the random variable W of Example 1 and plot its graph.
Solution
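A minimal sketch of one way to carry this out, assuming W here denotes the dice total of Example 1 (the only random variable defined above): accumulate the probabilities to obtain F(w) = P(W ≤ w) and draw its step graph.

# Minimal sketch (assumption: W is the dice total of Example 1).
# Build the distribution function F(w) = P(W <= w) and plot its step graph.
from itertools import product
from fractions import Fraction
import matplotlib.pyplot as plt

totals = [a + b for a, b in product(range(1, 7), repeat=2)]
pmf = {w: Fraction(totals.count(w), 36) for w in range(2, 13)}

cdf, running = {}, Fraction(0)
for w in range(2, 13):
    running += pmf[w]
    cdf[w] = running             # F(2) = 1/36, ..., F(12) = 1

plt.step(list(cdf), [float(v) for v in cdf.values()], where='post')
plt.xlabel('w'); plt.ylabel('F(w)'); plt.title('Distribution function of W')
plt.show()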
Obtaining the Values of a Probability Distribution from its Distribution Function
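For a discrete random variable whose possible values are ordered x1 < x2 < ..., the individual probabilities can be recovered from the distribution function by differencing:
f(x1) = F(x1) and f(xi) = F(xi) − F(xi−1) for i = 2, 3, ...
For the dice total of Example 1, for instance, f(4) = F(4) − F(3) = 6/36 − 3/36 = 3/36.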
So far we have considered only a random variable whose values were the totals rolled with a pair of dice.
Closer to life, an experiment may consist of randomly choosing some of the 300 students attending an
elementary school:
the principal may be interested in their I.Q.'s,
the school nurse in their weights,
teachers in the number of days they have been absent, and so forth.
THE BIVARIATE CASE
If X and Y are discrete random variables, we write the probability that X will take on the value x and Y will take
on the value y as P(X = x, Y = y). Thus, P(X = x, Y = y) is the probability of the intersection of the events X = x and
Y = y.
Actually, as in the univariate case, it is generally preferable to represent
probabilities such as these by means of a formula. In other words, it is
preferable to express the probabilities by means of a function with the
values f (x, y) = P(X = x,Y = y) for any pair of values (x, y) within the
range of the random variables X and Y. For instance, for the two
random variables of Example 12 we can write
It is often useful to know the probability that the values of two random variables are less than or equal to some real numbers x and y; this probability is given by the joint distribution function F(x, y) = P(X ≤ x, Y ≤ y).
Suppose now that A and B are the events X = x and Y = y, so that we can write P(X = x, Y = y) = P(A ∩ B).
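A minimal sketch of these ideas in Python, using a small hypothetical joint distribution (illustrative values only; Example 12 is not reproduced here):

# Hypothetical joint distribution f(x, y) = P(X = x, Y = y), stored as a dict.
from fractions import Fraction

f = {(0, 0): Fraction(1, 6), (0, 1): Fraction(2, 6),
     (1, 0): Fraction(2, 6), (1, 1): Fraction(1, 6)}

def F(x, y):
    # Joint distribution function F(x, y) = P(X <= x, Y <= y)
    return sum(p for (a, b), p in f.items() if a <= x and b <= y)

print(sum(f.values()))  # the probabilities sum to 1
print(F(0, 1))          # P(X <= 0, Y <= 1) = 1/6 + 2/6 = 1/2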
When X and Y are continuous random variables, the probability distributions are replaced by probability densities.
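In that case probabilities are obtained by integration rather than summation: for any region A of the xy-plane, P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy.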
SPECIAL PROBABILITY DISTRIBUTIONS
Coin tossed 3 times
Sample space: 2^3 = 8 equally likely outcomes: {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
X is the random variable counting the number of heads
P(X = 2) = 3/8, from the three favourable outcomes {HHT, HTH, THH}
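A quick enumeration confirming this count:

# List the 8 equally likely outcomes of three coin tosses and count heads.
from itertools import product
from fractions import Fraction

outcomes = list(product('HT', repeat=3))                # 2**3 = 8 outcomes
two_heads = [o for o in outcomes if o.count('H') == 2]  # HHT, HTH, THH
print(Fraction(len(two_heads), len(outcomes)))          # 3/8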
Discrete Distribution:
Finite or countably many outcomes:
Rolling a die, picking a card
Continuous Distribution:
Uncountably many outcomes:
Recording time, measuring distance
Parameters (for a single fair die roll):
Mean = 3.5 and variance = 105/36 ≈ 2.92
On its own the expected value is hard to interpret here: 3.5 can never actually be rolled,
it is of little use for predicting an individual roll,
and it gives little real intuition about the outcomes.
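A short check of these numbers, assuming they refer to a single fair die roll:

# Mean and variance of one fair die roll: 7/2 = 3.5 and 35/12 = 105/36.
from fractions import Fraction

p = Fraction(1, 6)
mean = sum(x * p for x in range(1, 7))                 # 7/2
var = sum((x - mean) ** 2 * p for x in range(1, 7))    # 35/12
print(mean, var)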
THE BERNOULLI DISTRIBUTION
Clearly, the number of ways in which we can select the x trials on which there is to be a success is nCx = n!/(x!(n − x)!).
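Multiplying this count by the probability of each particular sequence of successes and failures gives the binomial probabilities b(x; n, θ) = nCx θ^x (1 − θ)^(n − x); a minimal sketch, using a fair coin purely as an assumed example:

# Binomial probability of x successes in n Bernoulli trials with success
# probability theta: C(n, x) * theta**x * (1 - theta)**(n - x).
from math import comb

def binomial_pmf(x, n, theta):
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

print(binomial_pmf(2, 3, 0.5))   # three fair-coin tosses: P(X = 2) = 0.375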