Probability NST Notes

1. Probability is defined as the chances of occurrence of an event. Trials are experiments repeated under the same conditions, with possible outcomes called events. 2. Exhaustive events are all possible outcomes of a trial. Mutually exclusive events cannot both occur in the same trial. Favorable events ensure the occurrence of a specified event. Equally likely events have no preference for any outcome. 3. Independent events are unaffected by other events. Probability is calculated as favorable cases over total cases. A certain event has 100% probability, while an impossible event has 0% probability.

Uploaded by Adarsh Shukla

1. Probability: The chance of occurrence of an event is known as probability.
2. Trials & Events: If an experiment is repeated under the same conditions and results in any one of several possible outcomes, the experiment is called a trial and the possible outcomes are known as events. For example: tossing a coin is a trial, and the turning up of a head or a tail is an event.
3. Exhaustive Events: The set of all possible outcomes of a trial is known as the exhaustive events, and their total number is the number of exhaustive cases. For example: in the toss of a coin there are two exhaustive events, head and tail.
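The idea of exhaustive events can be sketched by simply enumerating the sample space; the variable names below are illustrative, not from the notes:

```python
# Enumerating the exhaustive events (sample space) of a coin toss and a die throw.
coin = {"Head", "Tail"}         # 2 exhaustive events
die = {1, 2, 3, 4, 5, 6}        # 6 exhaustive events
print(len(coin), len(die))      # prints: 2 6
```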
4. Mutually Exclusive Events: Events are said to be mutually exclusive or incompatible if the happening of any one of them rules out the happening of the others in the same trial. For example: in the toss of a coin, the events head and tail are mutually exclusive, because if the outcome is a head, getting a tail in the same trial is ruled out.

5. Favorable Events: The cases which ensure the happening of an event are said to be favorable to it. The number of favorable cases is the number of possible outcomes in which the specified event occurs.

6. Equally Likely Events: Events are said to be equally likely if there is no reason to expect any one of them in preference to any other. For example: when a card is drawn from a well-shuffled pack, any of the 52 cards may appear, so all 52 outcomes are equally likely.
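Since the 52 outcomes are equally likely, each card has probability 1/52. A minimal sketch (the deck construction here is illustrative):

```python
from fractions import Fraction

# Build a 52-card deck as rank-suit pairs and compute the probability
# of any single card under the equally-likely assumption.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["Spades", "Hearts", "Diamonds", "Clubs"]
deck = [(r, s) for r in ranks for s in suits]
p_any_card = Fraction(1, len(deck))
print(p_any_card)   # prints: 1/52
```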

7. Independent Events: Two or more events are said to be independent if the happening or non-happening of any one of them is not affected by the happening or non-happening of the others. For example: if a card is drawn from a well-shuffled pack and replaced before a second card is drawn, the result of the second draw is independent of the first.
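The draw-with-replacement example can be checked by simulation: independence predicts P(both draws are hearts) = (13/52) × (13/52) = 1/16. The card encoding below is an assumption made for brevity:

```python
import random

# Drawing with replacement: the second draw is independent of the first.
# Encode the deck as 0..51 and (illustratively) treat 0..12 as hearts.
random.seed(0)
deck = list(range(52))
trials = 100_000
both_hearts = sum(
    1 for _ in range(trials)
    if random.choice(deck) < 13 and random.choice(deck) < 13  # card replaced between draws
)
print(both_hearts / trials)   # close to 1/16 = 0.0625
```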

Note:
1. If a trial has 'n' exhaustive, mutually exclusive and equally likely cases and 'm' of them are favorable to an event 'E', then the probability of happening of 'E' is given by
   P(E) = favorable no. of cases / exhaustive no. of cases = m/n = p.
2. The no. of unfavorable cases is n - m.
3. The probability that the event will not happen is given by
   q = P(E^c) = (n - m)/n = 1 - m/n = 1 - p.
4. Since the event either happens or does not happen, p + q = 1; neither p nor q can exceed 1.
5. If P(E) = 1, 'E' is called a certain event, and the chance of its occurrence is 100%.
6. If P(E) = 0, 'E' is called an impossible event.
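A minimal sketch of Notes 1-4, using the (illustrative) event "rolling an even number on a fair die", where n = 6 and m = 3:

```python
from fractions import Fraction

n, m = 6, 3                 # 6 exhaustive cases, 3 favorable (2, 4, 6)
p = Fraction(m, n)          # P(E) = m/n
q = Fraction(n - m, n)      # P(E^c) = (n - m)/n = 1 - p
print(p, q, p + q)          # prints: 1/2 1/2 1
```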

8. Random Variables: If a real variable 'X' is associated with the outcome of a random experiment, then the value which 'X' takes depends on chance, and 'X' is called a random variable or a variate. For example: the number turning up in the throw of a die. If the random variable takes a finite (or countably infinite) set of values, it is called a discrete variable. If the random variable assumes an uncountably infinite number of values, it is called a continuous variable.
 Discrete Probability Distribution: Suppose a discrete variate 'X' is the outcome of some experiment. If the probability that 'X' takes the value xi is pi, then the distribution is written
P(X = xi) = pi or P(xi); for i = 1, 2, 3, ...
Here P(xi) ≥ 0 for all values of i, and
∑ P(xi) = 1.
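The two conditions above can be sketched with the distribution of a fair die (the dictionary representation is an illustrative choice):

```python
from fractions import Fraction

# Distribution of a fair die as a mapping xi -> P(xi).
dist = {x: Fraction(1, 6) for x in range(1, 7)}

# Check the two defining conditions of a discrete probability distribution.
assert all(p >= 0 for p in dist.values())   # P(xi) >= 0 for all i
assert sum(dist.values()) == 1              # sum of P(xi) = 1
```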
 Cumulative Distribution Function: The function F(x) of the discrete variable 'X' is called its distribution function, where
F(x) = P(X ≤ x) = ∑ P(xi), the sum being taken over all i with xi ≤ x.
As x runs through all values, F(x) increases from 0 to 1.
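The cumulative distribution function can be sketched for the same fair-die distribution, summing P(xi) over all xi ≤ x:

```python
from fractions import Fraction

# Fair-die distribution (illustrative example).
dist = {x: Fraction(1, 6) for x in range(1, 7)}

def F(x):
    """Cumulative distribution: F(x) = P(X <= x) = sum of P(xi) for xi <= x."""
    return sum(p for xi, p in dist.items() if xi <= x)

print(F(3), F(6))   # prints: 1/2 1
```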
