Revision PDF
Covers chapters 2 to 11.
5 problems, each carrying 20 marks.
- Must show all calculations involved
- Must define any symbols or notations used
- Must explain any assumptions made
Expected value:
E(X) = Σ_{i=1..n} x_i p_i   (discrete)
E(X) = ∫ x f(x) dx   (continuous)
Binomial Distribution
(p + q)^n = p^n + n p^(n-1) q + ... + nCr p^(n-r) q^r + ... + q^n
where nCr = n! / (r! (n - r)!)
The coefficients nCr follow Pascal's triangle for n = 1, 2, 3, 4, 5, ...; e.g. (p + q)^1 = 1p + 1q, and for n = 5 the coefficients are 1 5 10 10 5 1.
Individual probabilities for n = 4, p = 0.9, q = 0.1:
p^4       = (0.9)^4            = 0.6561
4 p^3 q   = 4 (0.9)^3 (0.1)    = 0.2916
6 p^2 q^2 = 6 (0.9)^2 (0.1)^2  = 0.0486
4 p q^3   = 4 (0.9) (0.1)^3    = 0.0036
q^4       = (0.1)^4            = 0.0001
Total                          = 1
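A short Python sketch (using the same n = 4, p = 0.9 values as above) reproduces these terms and confirms they sum to 1:

    from math import comb

    n, p = 4, 0.9
    q = 1 - p
    # term r: C(n, r) * p^(n-r) * q^r = probability that exactly r of the 4 components fail
    terms = [comb(n, r) * p**(n - r) * q**r for r in range(n + 1)]
    print(terms)       # [0.6561, 0.2916, 0.0486, 0.0036, 0.0001]
    print(sum(terms))  # ~1.0: all terms of (p + q)^n sum to 1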
Series System:
Rs = R1 · R2   (product rule of reliability)

Parallel System:
Qs = Q1 · Q2   (product rule of unreliability)
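A minimal Python sketch of the two product rules, with assumed component reliabilities R1 = 0.95 and R2 = 0.90:

    # Assumed component reliabilities for illustration
    R1, R2 = 0.95, 0.90
    Q1, Q2 = 1 - R1, 1 - R2

    R_series = R1 * R2        # series: product rule of reliability -> 0.855
    Q_parallel = Q1 * Q2      # parallel: product rule of unreliability -> 0.005
    R_parallel = 1 - Q_parallel

    print(R_series, R_parallel)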
Evaluation Techniques:
- Tree diagrams
- Event Trees
- Fault Trees
Multi-failure modes:
- State enumeration method
- Conditional Probability Method
Probability Distributions
Failure density function, f(t)
Failure distribution function, Q(t): probability of failure by time t
Survivor function, R(t) = 1 - Q(t)
Hazard rate, λ(t) = f(t)/R(t)
Shape of the reliability functions: the bath-tub curve
For a constant hazard rate λ: R(t) = e^(-λt) and Q(t) = 1 - e^(-λt)
Standard normal variate, z = (x - μ)/σ
Probability of x failures in time t:  P_x(t) = (λt)^x e^(-λt) / x!
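A small Python sketch of the exponential and Poisson expressions above, assuming illustrative values λ = 0.001 failures/hour and t = 500 hours:

    from math import exp, factorial

    lam = 0.001   # assumed constant hazard rate, failures per hour
    t = 500.0     # operating time, hours

    R = exp(-lam * t)   # survivor function R(t)
    Q = 1 - R           # failure distribution function Q(t)

    def p_x(x, lam, t):
        # Poisson: probability of exactly x failures in time t
        return (lam * t) ** x * exp(-lam * t) / factorial(x)

    print(R, Q)            # ~0.6065, ~0.3935
    print(p_x(0, lam, t))  # equals R(t), as expected for zero failures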
Series System:
Rs(t) = Π_{i=1..n} e^(-λi t) = e^(-(λ1 + λ2 + ... + λn) t)

Parallel System:
Qs(t) = Π_{i=1..n} [1 - e^(-λi t)]
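A quick Python check of the time-dependent series and parallel expressions, with assumed rates λ1 = 0.002/h and λ2 = 0.005/h:

    from math import exp

    lams = [0.002, 0.005]  # assumed failure rates per hour
    t = 100.0              # mission time, hours

    R_series = exp(-sum(lams) * t)       # Rs(t) = e^(-(lam1 + lam2) t)

    Q_parallel = 1.0
    for lam in lams:
        Q_parallel *= 1 - exp(-lam * t)  # Qs(t) = product of [1 - e^(-lam_i t)]
    R_parallel = 1 - Q_parallel

    print(R_series, R_parallel)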
For n identical components, [R(t) + Q(t)]^n = Σ_{r=0..n} nCr R(t)^(n-r) Q(t)^r = 1, and the individual terms give the probabilities of exactly r failed components.
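For example, summing the terms with m or more surviving components gives the reliability of a partially redundant (m-out-of-n) set of identical components; a minimal Python sketch (values assumed for illustration):

    from math import comb

    def at_least_m_of_n(n, m, R):
        # Sum the binomial terms with m or more components surviving
        Q = 1 - R
        return sum(comb(n, k) * R**k * Q**(n - k) for k in range(m, n + 1))

    print(at_least_m_of_n(4, 2, 0.9))  # ~0.9963 for a 2-out-of-4 system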
Probability of x failures in time t:  P_x(t) = (λt)^x e^(-λt) / x!

Combined random and wear-out failures (component of age T):
R(t) = e^(-λt) × R_w(T + t) / R_w(T), where R_w is the wear-out survivor function.
Markov Model
Markov approach requires: lack of memory (future behaviour depends only on the present state) and stationary transition probabilities.
- discrete (time or space): Discrete Markov Chain
- continuous (time): Continuous Markov Process
Discrete Markov Chain:
Cumulative states
State-space diagram for a two-component repairable system: Both Up; 1 Up, 1 Down; Both Down. (Further example states: Failed; Repaired but not installed.)
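A minimal Python sketch of a discrete Markov chain for a single repairable component, with assumed one-step Up/Down transition probabilities (0.1 failure, 0.8 repair), iterated to its limiting state probabilities:

    # States: 0 = Up, 1 = Down; assumed one-step transition probabilities
    P = [[0.9, 0.1],   # from Up:   stay 0.9, fail 0.1
         [0.8, 0.2]]   # from Down: repair 0.8, stay 0.2

    probs = [1.0, 0.0]    # start with the component Up
    for _ in range(100):  # repeated multiplication approaches the limiting values
        probs = [sum(probs[i] * P[i][j] for i in range(2)) for j in range(2)]

    print(probs)  # limiting state probabilities, ~[0.889, 0.111]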
Series system of repairable components:
λs = Σ λi
Us = Σ λi ri
rs = Us / λs = (Σ λi ri) / (Σ λi)

Parallel system (two repairable components):
λp = λ1 λ2 (r1 + r2)
rp = r1 r2 / (r1 + r2)
Up = λp · rp
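A short Python sketch of these steady-state formulas, with assumed failure rates and repair times:

    lam1, r1 = 0.5, 10 / 8760   # assumed: 0.5 failures/yr, 10 h repair (in years)
    lam2, r2 = 1.0, 20 / 8760   # assumed: 1.0 failures/yr, 20 h repair (in years)

    # Series
    lam_s = lam1 + lam2
    U_s = lam1 * r1 + lam2 * r2
    r_s = U_s / lam_s

    # Parallel (two components)
    lam_p = lam1 * lam2 * (r1 + r2)
    r_p = (r1 * r2) / (r1 + r2)
    U_p = lam_p * r_p

    print(lam_s, r_s, U_s)
    print(lam_p, r_p, U_p)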
State-space diagram: each component is labelled U (up) or D (down), giving states such as 1U 2D and 1D 2D.