
S-38.3143 Queueing Theory, II/2007

Exercises
Virtamo / Penttinen

10th December 2007

Contents
1 Discrete and Continuous Distributions
2 Distributions and Markov Chains
3 Birth&death- and Poisson-processes, Little's result
4 Loss Systems
5 Queuing Systems
6 Priority queues, queuing networks

AB TEKNILLINEN KORKEAKOULU
Helsinki University of Technology
Department of Electrical and Communications Engineering
Networking Laboratory
P.O. Box 3000
FIN-02015 HUT
Exercise 1 DISCRETE AND CONTINUOUS DISTRIBUTIONS

EX 1: Discrete and Continuous Distributions

1. There are two coins, one of which is normal while the other has heads on both sides. One of
the coins is chosen at random and tossed m times, and each time the result is heads. What is
the probability that the chosen coin is the normal one? Calculate also the numerical value for
m = 1, 2, 3.

Solution:

Denote the possible events with the following symbols:


X the normal coin is chosen, P{X} = 1/2
Y the fake coin is chosen, P{Y } = 1/2
Cm tossing the chosen coin m times gives m heads in a row
Then the conditional probabilities for getting m heads in a row are

P{Cm|X} = (1/2)^m,    P{Cm|Y} = 1.

The quantity asked for is P{X|Cm}, so we can apply Bayes' formula:

P{Aj|B} = P{B|Aj} P{Aj} / Σi P{B|Ai} P{Ai},

where ∪i Ai = S and Ai ∩ Aj = ∅ for all i ≠ j. So in this case we get

P{X|Cm} = P{Cm|X} P{X} / (P{Cm|X} P{X} + P{Cm|Y} P{Y})
        = (1/2)^m · (1/2) / ((1/2)^m · (1/2) + 1 · (1/2)) = 1/(2^m + 1).

For m = 1, 2, 3 we get

m   P{X|Cm}
1   1/3
2   1/5
3   1/9
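The result 1/(2^m + 1) is easy to sanity-check with a small Monte Carlo simulation. The sketch below is not part of the original solution; the function name and trial count are free choices:

```python
import random

def p_normal_given_heads(m: int, trials: int = 200_000) -> float:
    """Estimate P{normal coin | m heads in a row} by simulation."""
    rng = random.Random(1)
    normal_and_heads = 0
    all_heads = 0
    for _ in range(trials):
        normal = rng.random() < 0.5                # choose one of the two coins
        # the two-headed coin always shows heads; the normal one w.p. 1/2 per toss
        heads = all(rng.random() < 0.5 for _ in range(m)) if normal else True
        if heads:
            all_heads += 1
            normal_and_heads += normal
    return normal_and_heads / all_heads

for m in (1, 2, 3):
    print(m, p_normal_given_heads(m), 1 / (2**m + 1))
```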
2. In a Bernoulli trial, 1 is obtained with probability p and 0 with probability q = 1 − p. The value
of p is unknown, but it is known to be drawn from a uniform distribution on (0, 1), i.e. all values
in this interval are a priori equally likely. The Bernoulli trial is repeated in order to estimate the
real value of p. In the trials, result 0 occurs n0 times and result 1 occurs n1 times. What is the
posterior distribution (Bayes) of p given the trials? Where is the maximum of the distribution?

Solution:

Let n = n0 + n1. The probability that after n trials we have obtained result 1 exactly n1 times is

P{n1|p} = (n choose n1) p^n1 (1 − p)^(n−n1).

Denote by f(p) the conditional pdf of p and by f0(p) the a priori pdf of p. The Bayes formula
gives

f(p) = P{n1|p} f0(p) / ∫₀¹ P{n1|p} f0(p) dp = p^n1 (1 − p)^(n−n1) / ∫₀¹ p^n1 (1 − p)^(n−n1) dp,

as f0(p) = 1 (uniform distribution). Denoting C(n1, n) = ∫₀¹ p^n1 (1 − p)^(n−n1) dp, we have

f(p) = p^n1 (1 − p)^(n−n1) / C(n1, n).


The maximum of the distribution is obtained by setting the derivative to zero:

f'(p) = (1/C) [ n1 p^(n1−1) (1 − p)^n0 − n0 p^n1 (1 − p)^(n0−1) ],

f'(p*) = 0  ⇒  n1 (1 − p*) = n0 p*  ⇒  p* = n1/(n0 + n1).

Next we deduce C(n1, n):

Σ_{n1=0}^{n} (n choose n1) C(n1, n) z^n1 = ∫₀¹ Σ_{n1=0}^{n} (n choose n1) (zp)^n1 (1 − p)^(n−n1) dp
  = ∫₀¹ (zp + (1 − p))^n dp = ∫₀¹ ((z − 1)p + 1)^n dp
  = [ ((z − 1)p + 1)^(n+1) / ((n + 1)(z − 1)) ]₀¹
  = (z^(n+1) − 1) / ((n + 1)(z − 1)) = (1/(n + 1)) Σ_{n1=0}^{n} z^n1,

and comparing the coefficients of z^n1 one sees that

C(n1, n) = 1 / ((n + 1) (n choose n1)) = n1!(n − n1)! / (n + 1)!.

Hence, the conditional distribution of p is

f(p) = (n + 1) (n choose n1) p^n1 (1 − p)^n0.

Furthermore, the expected value of p, on the condition that n1 is known, is

E[p|n1] = ∫₀¹ p f(p) dp = (n + 1) (n choose n1) ∫₀¹ p^(n1+1) (1 − p)^(n−n1) dp,

where the remaining integral equals C(n1 + 1, n + 1), so that

E[p|n1] = (n + 1) (n choose n1) / ((n + 2) (n+1 choose n1+1))
        = (n + 1) n! (n1 + 1)! (n − n1)! / ((n + 2) n1! (n − n1)! (n + 1)!) = (n1 + 1)/(n + 2).
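The closed form E[p|n1] = (n1 + 1)/(n + 2) can be verified by integrating p·f(p) numerically; the counts below (n1 = 7, n0 = 3) are an arbitrary illustration, not part of the exercise:

```python
from math import comb

def posterior_mean(n1: int, n0: int, steps: int = 100_000) -> float:
    """E[p|n1] under a uniform prior: midpoint-rule integral of p * f(p)."""
    n = n0 + n1
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        p = (i + 0.5) * h
        # f(p) = (n+1) * binom(n, n1) * p^n1 * (1-p)^n0
        total += p * (n + 1) * comb(n, n1) * p**n1 * (1 - p)**n0 * h
    return total

print(posterior_mean(7, 3), (7 + 1) / (10 + 2))   # both ≈ 2/3
```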

3. Apply the conditioning rules

E [X] = E [E [X|Y ]]
V [X] = E [V [X|Y ]] + V [E [X|Y ]]

to the case X = X1 + . . . + XN, where the Xi are independent and identically distributed
(i.i.d.) random variables with mean m and variance σ², and N is a positive integer-valued
random variable with mean n and variance ν². Hint: Condition on the value of N.

Solution:

By applying the given conditioning rules we get

E[X] = E[E[X|N]] = E[N m] = nm,

V[X] = E[V[X|N]] + V[E[X|N]] = E[σ² N] + V[m N] = nσ² + m²ν².
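A quick simulation illustrates the two formulas. The distributions below (Xi exponential with mean 2, N geometric with mean 2) are assumptions made only for this example; they give E[X] = nm = 4 and V[X] = nσ² + m²ν² = 2·4 + 4·2 = 16:

```python
import random

def random_sum_moments(trials: int = 200_000, seed: int = 2):
    """Sample mean and variance of X = X1 + ... + XN, where
    Xi ~ Exp(mean 2) (m = 2, sigma^2 = 4) and N ~ Geom(1/2) on {1, 2, ...}
    (n = 2, nu^2 = 2)."""
    rng = random.Random(seed)
    xs = []
    for _ in range(trials):
        n = 1
        while rng.random() < 0.5:                  # N = k with prob (1/2)^k
            n += 1
        xs.append(sum(rng.expovariate(0.5) for _ in range(n)))
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    return mean, var

mean, var = random_sum_moments()
print(mean, var)   # close to 4 and 16
```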


4. Five different applications are transmitting packets in a LAN. For each application i, i =
1, . . . , 5, measurements are carried out to determine the mean, mi, and the standard deviation,
σi, of the packet lengths. The measurements also provide a classification of the packets,
giving the proportion of packets belonging to each application i (denoted by pi). The results of
the measurements are summarized in the following table. Compute the mean and the standard
deviation of the lengths of all packets in the network.

application   pi     mi    σi
1             0.30   100   10
2             0.15   120   12
3             0.40   200   20
4             0.10    75    5
5             0.05   300   25

Solution:

Let the random variable X denote the packet length and Y the application to which the packet
belongs. Apply the chain rules of expectation and variance:

E[X] = E[E[X|Y]] = Σi pi mi = 150.5

V[X] = E[V[X|Y]] + V[E[X|Y]]
     = Σi pi σi² + Σi pi mi² − (Σi pi mi)²
     = 245.35 + 26222.5 − 22650.25 = 3817.6.

Thus the packet length in the network has mean 150.5 and standard deviation √3817.6 ≈ 61.8.
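The same numbers drop out of a few lines of code (a sketch of the computation above, nothing more):

```python
p = [0.30, 0.15, 0.40, 0.10, 0.05]   # proportions p_i
m = [100, 120, 200, 75, 300]         # means m_i
s = [10, 12, 20, 5, 25]              # standard deviations sigma_i

mean = sum(pi * mi for pi, mi in zip(p, m))
var = (sum(pi * si**2 for pi, si in zip(p, s))      # E[V[X|Y]]
       + sum(pi * mi**2 for pi, mi in zip(p, m))    # + E[E[X|Y]^2]
       - mean**2)                                   # - E[X]^2
print(mean, var, var**0.5)   # 150.5, 3817.6, about 61.8
```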
5. Assume that the traffic process in a LAN is stationary, whence periods of equal length are sta-
tistically identical. Denote the amount of traffic (kB) carried in the network in consecutive 10
min periods by X1, X2 and X3 (random variables). In measurements over a long time one has
observed that the variances of the traffic in periods of 10 min, 20 min and 30 min are v1, v2 and
v3, respectively. In other words, v1 = V[X1] = V[X2] = V[X3], v2 = V[X1 + X2] =
V[X2 + X3] and v3 = V[X1 + X2 + X3]. Determine Cov[X1, X2] = Cov[X2, X3] and
Cov[X1, X3] in terms of v1, v2 and v3. Hint: e.g. the former can be obtained by expanding
v2 = V[X1 + X2] = Cov[X1 + X2, X1 + X2].

Solution:

Definition: Cov[A, B] = E[(A − E[A]) · (B − E[B])] = E[AB] − E[A]E[B].
Thus e.g. Cov[A + B, C] = E[AC + BC] − E[A + B]E[C] = Cov[A, C] + Cov[B, C], etc.

(i) Stationarity ⇒ Cov[X1, X2] = Cov[X2, X3]:

v2 = Cov[X1 + X2, X1 + X2] = Cov[X1, X1] + 2 Cov[X1, X2] + Cov[X2, X2]
   = 2v1 + 2 Cov[X1, X2]  ⇒  Cov[X1, X2] = v2/2 − v1.

(ii)

v3 = Cov[X1 + X2 + X3, X1 + X2 + X3]
   = V[X1] + V[X2] + V[X3] + 2 Cov[X1, X2] + 2 Cov[X2, X3] + 2 Cov[X1, X3]
   = 3v1 + 4(v2/2 − v1) + 2 Cov[X1, X3]  ⇒  Cov[X1, X3] = v1/2 − v2 + v3/2.


Hence, measuring variances v1 , v2 and v3 is enough to determine covariances.

6. The length of a packet (in bytes) arriving at a router is assumed to obey a geometric distribution.
The mean packet length is 100 bytes. Each packet is first read into an input buffer. How large
should the input buffer be in order that an arriving packet fits into it with probability 95 % or
higher?

Solution:

Let random variable X denote the length of the packet. The packet length was assumed to obey a
geometric distribution with mean 100, and thus

P{X = n} = (1 − p)n−1 · p = q n−1 · p,

where p = 1/100. Let the input buffer length be N. The probability that a random packet fits
into the buffer is then

P{X ≤ N} = Σ_{i=1}^{N} q^(i−1) p = p Σ_{i=0}^{N−1} q^i = p (1 − q^N)/(1 − q) = 1 − q^N,

from which N can be obtained:

P{X ≤ N} ≥ 0.95
q^N ≤ 0.05
N ln q ≤ ln 0.05    (note ln q < 0, so the inequality direction flips)
N ≥ ln 0.05 / ln q ≈ 298.1,

thus when the buffer length is 299 bytes or more, the arriving packet fits in the buffer with probability
of 95 % or more.
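The bound is easy to wrap into a small helper; the function name and default arguments below are this writer's choices, not part of the exercise:

```python
from math import ceil, log

def buffer_size(mean_len: float = 100.0, fit_prob: float = 0.95) -> int:
    """Smallest N with P{X <= N} = 1 - q^N >= fit_prob,
    for X ~ Geometric with p = 1/mean_len and q = 1 - p."""
    q = 1.0 - 1.0 / mean_len
    return ceil(log(1.0 - fit_prob) / log(q))

print(buffer_size())   # 299 bytes, as derived above
```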


EX 2: Distributions and Markov Chains

1. Let S = X1 + . . . + XN, where the Xi ∼ Exp(µ) are i.i.d. and N is an independent geometrically
distributed random variable, P{N = k} = (1 − p)p^(k−1), k = 1, 2, . . .. Determine the tail
distribution of S, G(x) = P{S > x}.

Solution:

First we derive the generating function of the geometric distribution:

G_N(z) = Σ_{k=1}^{∞} z^k (1 − p) p^(k−1) = ((1 − p)/p) Σ_{k=1}^{∞} (pz)^k
       = ((1 − p)/p) · pz/(1 − pz) = z (1 − p)/(1 − pz).

Similarly, the corresponding transform (Laplace transform) of the exponential distribution is

G_X(s) = ∫₀^∞ µ e^(−µt) e^(−st) dt = ∫₀^∞ µ e^(−(µ+s)t) dt = µ/(µ + s).

The transform of the random sum is

S(s) = G_N(G_X(s)) = (1 − p) (µ/(µ + s)) / (1 − p µ/(µ + s)) = (1 − p)µ/(µ + s − pµ)
     = (1 − p)µ/((1 − p)µ + s).

In other words, the sum S obeys the exponential distribution with parameter (1 − p)µ. Hence, the tail
distribution is

P{S > x} = ∫_x^∞ (1 − p)µ e^(−(1−p)µt) dt = e^(−(1−p)µx).

2. Prove (without using the generating function), that the sum of two Poisson random variables,
N1 ∼ Poisson(a1 ) and N2 ∼ Poisson(a2 ), is also Poisson distributed: (N1 +N2 ) ∼ Poisson(a1 +
a2 ). Prove the same result with the aid of generating functions.

Solution:

Poisson distribution: P{Ni = n} = (ai^n/n!) · e^(−ai).

P{N = n} = P{N1 + N2 = n} = Σ_{j=0}^{n} P{N1 = j} · P{N2 = n − j}
  = Σ_{j=0}^{n} (a1^j/j!) e^(−a1) · (a2^(n−j)/(n − j)!) e^(−a2)
  = (e^(−(a1+a2))/n!) Σ_{j=0}^{n} (n!/(j!(n − j)!)) a1^j a2^(n−j)
  = ((a1 + a2)^n/n!) · e^(−(a1+a2)),    (binomial theorem)

thus the sum N = N1 + N2 is Poisson(a1 + a2).
On the other hand, let X ∼ Poisson(a). Then the generating function of the random variable X is

G_X(z) = E[z^X] = Σ_{j=0}^{∞} z^j (a^j/j!) e^(−a) = e^(−a) Σ_{j=0}^{∞} (az)^j/j! = e^(−a) e^(az) = e^((z−1)a).

Let N (z) denote the generating function of sum N1 + N2 , for which we obtain

N (z) = N1 (z) · N2 (z) = e(z−1)a1 · e(z−1)a2 = e(z−1)(a1 +a2 ) .

Hence, N obeys Poisson distribution with parameter a1 + a2 .
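A direct numeric check of the convolution (the parameter values a1 = 1.3, a2 = 2.1 are arbitrary):

```python
from math import exp, factorial

def poisson_pmf(n: int, a: float) -> float:
    return a**n / factorial(n) * exp(-a)

a1, a2 = 1.3, 2.1
for n in range(12):
    conv = sum(poisson_pmf(j, a1) * poisson_pmf(n - j, a2) for j in range(n + 1))
    assert abs(conv - poisson_pmf(n, a1 + a2)) < 1e-12
print("convolution matches Poisson(a1 + a2) for n = 0..11")
```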


3. Let X be an exponential random variable. Without any computations, tell which one of the
following is correct. Explain your answer.

a) E[X²|X > 1] = E[(X + 1)²]
b) E[X²|X > 1] = E[X²] + 1
c) E[X²|X > 1] = (1 + E[X])²

Solution:

Clearly a) is true: by the memoryless property of the exponential distribution, conditioned on
X > 1 the random variable X is distributed as X + 1.

4. The transition probability matrix of a four-state Markov chain is

        ( 0  0  1/2  1/2 )
    P = ( 1  0   0    0  )
        ( 0  1   0    0  )
        ( 0  1   0    0  )

Draw the state transition diagram of the chain and deduce which states are transient and which
are recurrent.

Solution:

Clearly, all the states of the chain are recurrent (the process returns to each state with
probability 1) and there are no transient states (a state is transient if the probability of
return is less than 1). Formally, let

pn = P{ chain returns to state 3 during 3n steps }.

Thus,

p1 = 1/2,
p2 = 1/2 + (1 − 1/2) · 1/2 = 1/2 + (1/2)²  ⇒  pn = Σ_{i=1}^{n} (1/2)^i,

and at the limit n → ∞ we get

p = (1/2) Σ_{i=0}^{∞} (1/2)^i = (1/2) · 1/(1 − 1/2) = 1.

Thus the chain returns to state 3 with probability 1 and the state is recurrent. Similarly for state 4.
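The recurrence of states 3 and 4 can also be seen empirically; the horizon and trial counts in this sketch are arbitrary:

```python
import random

# Rows of the transition matrix; indices 0..3 stand for states 1..4 in the text.
P = [[0.0, 0.0, 0.5, 0.5],
     [1.0, 0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0]]

def return_fraction(state: int, trials: int = 20_000, horizon: int = 200,
                    seed: int = 4) -> float:
    """Fraction of sample paths that come back to `state` within `horizon` steps."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = state
        for _ in range(horizon):
            s = rng.choices(range(4), weights=P[s])[0]
            if s == state:
                hits += 1
                break
    return hits / trials

print(return_fraction(2), return_fraction(3))   # states 3 and 4 of the text
```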

5. Time to absorption. A three-state Markov chain (with states i = 1, . . . , 3) has the state transition
probability matrix:

        ( 1/4  1/2  1/4 )
    P = ( 1/2  1/4  1/4 )
        (  0    0    1  )

State 3 is an absorbing state. Let Ti denote the average time (number of steps) needed by a system
in state i to reach the absorbing state 3 (T3 = 0). Write equations for Ti, i = 1, 2, based on the
facts that the next transition i → j takes one step and that, due to the Markov property, it takes Tj
steps on average to reach the absorbing state from state j. Solve the equations.


Solution:

From symmetry it follows that T1 = T2, and thus

T1 = p11 · (1 + T1) + p12 · (1 + T2) + p13 · (1 + T3)
   = 1 + p11 T1 + p12 T1      (symmetry, T3 = 0)
   = 1 + (3/4) T1  ⇒  T1 = 4.
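The same answer falls out of the 2×2 linear system (I − Q)T = 1, where Q is the transient part of P; a Cramer's-rule sketch:

```python
# Transient part of P (states 1 and 2); (I - Q) T = (1, 1) gives the mean
# absorption times T1, T2.
Q = [[0.25, 0.50],
     [0.50, 0.25]]

a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c
T1 = (1 * d - b * 1) / det        # Cramer's rule
T2 = (a * 1 - 1 * c) / det
print(T1, T2)   # 4.0 4.0
```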
6. Define the state of the system at the nth trial of an infinite sequence of Bernoulli(p) trials to be
the number of consecutive successful trials preceding and including the current trial, i.e. the state
is the distance to the previous unsuccessful trial. If the nth trial is unsuccessful, then Xn = 0; if it
succeeds but the previous one was unsuccessful, then Xn = 1, etc. a) What is the state space of the
system? b) Argue that Xn forms a Markov chain. c) Write down the transition probability matrix
of the Markov chain (give its structure).

Solution:

(a) Xn can be any integer from 0 to ∞.

(b) The trials are independent:

Xn+1 = { Xn + 1 with probability p,
       { 0      with probability 1 − p,

that is, the state of the system at time n + 1 depends only on the state at time n. This is the Markov
property, and hence the system is a Markov chain.
(c)
        ( 1−p  p  0  0  ··· )
    P = ( 1−p  0  p  0  ··· )
        ( 1−p  0  0  p  ··· )
        (  ⋮   ⋮  ⋮  ⋮  ⋱  )


EX 3: Birth&death- and Poisson-processes, Little’s result

1. Determine the equilibrium probability distribution for birth-death processes (state space i =
0, 1, 2, . . . ) whose transition intensities are a) λi = λ, µi = iµ, b) λi = λ/(i + 1), µi = µ,
where λ and µ are constants.

Solution:

Figure 1: Transition intensities between the states i and i + 1: in case a) rate λ up and
(i + 1)µ down, in case b) rate λ/(i + 1) up and µ down.

(a) From Fig. 1 it can be seen that the local balance condition is

(i + 1)µ π_(i+1) = λ π_i  ⇒  π_(i+1) = (1/(i + 1)) (λ/µ) π_i = (a/(i + 1)) π_i,   where a = λ/µ.

That is,

π1 = a π0,
π2 = (a/2) π1 = (a²/2!) π0,
...
πi = (a^i/i!) π0.

Furthermore, Σi πi = 1, i.e.

Σi πi = π0 Σi a^i/i! = π0 e^a  ⇒  π0 = e^(−a).

So

πi = (a^i/i!) e^(−a)

(that is, a Poisson distribution with parameter a = λ/µ).
(b) In this case we obtain

(λ/(i + 1)) π_i = µ π_(i+1)  ⇔  π_(i+1) = (a/(i + 1)) π_i,

that is, the equilibrium distribution is the same as in (a).
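Numerically, iterating the balance recursion and normalizing reproduces the Poisson weights; λ = 2.5, µ = 1 and the truncation level are arbitrary choices for this sketch:

```python
from math import exp, factorial

def bd_equilibrium(lam: float, mu: float, states: int = 40) -> list:
    """Equilibrium probabilities of the case-a) process (birth rate lam,
    death rate i*mu), truncated to `states` states and normalized."""
    pi = [1.0]
    for i in range(states - 1):
        pi.append(pi[-1] * lam / ((i + 1) * mu))   # pi_{i+1} = a/(i+1) * pi_i
    norm = sum(pi)
    return [x / norm for x in pi]

a = 2.5
pi = bd_equilibrium(2.5, 1.0)
print(pi[0], exp(-a))   # both ≈ e^{-2.5}
```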

2. In a game audio signals arrive in the interval (0, T ) according to a Poisson process with rate λ,
where T > 1/λ. The player wins only if there will be at least one audio signal in that interval
and he pushes a button (only one push allowed) upon the last of the signals. The player uses the
following strategy: he pushes the button upon the arrival of the first (if any) signal after a fixed
time s ≤ T .
a) What is the probability that the player wins?
b) What value of s maximizes the probability of winning, and what is the probability in this
case?


Solution:

The player wins exactly when there is exactly one arrival during (s, T) (what happened during (0, s) has
no effect here). Let τ = T − s; then the number of arrivals in (s, T) obeys the Poisson distribution with
parameter a = λτ. Here 0 ≤ τ ≤ T, i.e. 0 ≤ a ≤ λT, where λT > 1.

a) pv = P{N(T) − N(s) = 1} = (a¹/1!) e^(−a) = a e^(−a), where a = λ(T − s).
1!
b) The maximum can be found by differentiating with respect to a:

(d/da) pv(a) = e^(−a) − a e^(−a) = (1 − a) e^(−a).

The derivative clearly has exactly one root, a = 1, which is also the maximum of the function (the
function is first strictly increasing and then strictly decreasing).

a = 1  ⇒  λ(T − s) = 1  ⇒  s = T − 1/λ,

which also lies inside the allowed interval. The maximum probability of winning is hence 1/e.
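A brute-force check over a grid of button times s (with arbitrary λ = 2, T = 3, chosen only for this sketch) agrees with s* = T − 1/λ and the winning probability 1/e:

```python
from math import exp

def win_prob(s: float, lam: float, T: float) -> float:
    """P{exactly one arrival in (s, T)} = a e^{-a}, with a = lam * (T - s)."""
    a = lam * (T - s)
    return a * exp(-a)

lam, T = 2.0, 3.0
grid = [i * T / 10_000 for i in range(10_001)]
s_best = max(grid, key=lambda s: win_prob(s, lam, T))
print(s_best, win_prob(s_best, lam, T))   # ≈ 2.5 (= T - 1/lam) and ≈ 1/e
```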

3. It has been observed that in the interval (0, t) one arrival has occurred from a Poisson process,
i.e. N (0, t) = 1. Prove that conditioned on this information, the arrival time τ is uniformly
distributed in (0, t). Hint: determine the conditional cumulative distribution function of the
arrival time P{τ ≤ s | one arrival during (0, t)}.

Solution:

P{τ ≤ s | N(0, t) = 1} = P{τ ≤ s and N(0, t) = 1} / P{N(0, t) = 1}
                       = P{N(0, s) = 1 and N(s, t) = 0} / P{N(0, t) = 1}
                       = [ ((λs)¹/1!) e^(−λs) · ((λ(t − s))⁰/0!) e^(−λ(t−s)) ] / [ ((λt)¹/1!) e^(−λt) ]
                       = s/t.

Thus, τ ∼ Uniform(0, t).

4. Customers arrive at the system according to a Poisson process with intensity λ. Each cus-
tomer brings in a revenue Y (independently of the other customers), which is assumed to be an
integer with the distribution pi = P{Y = i}, i = 1, 2, . . .. Let Xt denote the total revenue gained
during the time interval (0, t).
a) Derive expressions for E[Xt] and V[Xt].
b) Deduce that Xt ∼ E1 + 2E2 + 3E3 + · · · , where the Ei are independent random variables
with the distributions Ei ∼ Poisson(pi λt).

Solution:

Denote
Xt = total revenue on the interval (0, t),
Nt = number of customers on the interval (0, t), Nt ∼ Poisson(λt).

(a) Clearly, the system revenue is a random sum

Xt = Y1 + . . . + YNt ,


where the Yi are i.i.d. random variables. Conditioning on the value of Nt we can apply the chain
rules for expectation and variance:
 
E[Xt] = E[E[Xt|Nt]] = E[Nt E[Y]] = E[Nt] E[Y] = λt E[Y].

Variance:

V[Xt] = V[E[Xt|Nt]] + E[V[Xt|Nt]] = V[Nt E[Y]] + E[Nt V[Y]]
      = V[Nt] E[Y]² + E[Nt] V[Y] = λt (E[Y]² + V[Y]) = λt E[Y²].

Figure 2: Random splitting of the Poisson process: the arrival stream of rate λ is split into
streams of rates p1 λ, p2 λ, . . .

(b) On the interval (0, t) the customers arrive according to the Poisson process Nt. Random splitting¹
of this process with probabilities pi leads to new independent Poisson processes Ei with the
respective parameters pi · λt. The customers of "class i" each brought in the revenue i. Thus,

Xt = Σi i · Ei.

5. A million (10⁶) data packets per second arrive at a network from different sources. The lengths
of the routes, defined by the source and destination addresses, vary considerably. The time a packet
spends in the network depends on the length of the route, but also on the congestion of the net-
work. The time a packet spends in the network is assumed to have the following distribution:
1 ms (90 %), 10 ms (7 %), 100 ms (3 %). How many packets are there in the network on average?

Solution:

transport delay   proportion   arrival intensity
1 ms              90 %         0.9 · 10⁶ /s
10 ms             7 %          0.07 · 10⁶ /s
100 ms            3 %          0.03 · 10⁶ /s

So the arrival intensity and the average transport delay are known for each traffic class. Next
Little's result, N = λT, can be applied:

N1 = 1 ms · 0.9 · 10⁶/s = 900,
N2 = 10 ms · 0.07 · 10⁶/s = 700,
N3 = 100 ms · 0.03 · 10⁶/s = 3000,

⇒ N = N1 + N2 + N3 = 4600.
¹ The process is first split into two processes with probabilities p1 and 1 − p1, after which the latter
is split again with probability p2/Σ_{i≥2} pi, etc.
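In code, Little's result per traffic class is a one-liner sum (a restatement of the arithmetic above):

```python
# N = sum over classes of lambda_i * T_i (Little's result per traffic class).
rates_per_s = [0.9e6, 0.07e6, 0.03e6]   # packet arrival intensities
delays_s = [1e-3, 10e-3, 100e-3]        # mean times spent in the network
N = sum(lam * T for lam, T in zip(rates_per_s, delays_s))
print(N)   # 4600.0
```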


6. The states of some irreducible Markov process, whose steady-state probabilities πi are assumed
to be known, can be partitioned into two disjoint sets, A = {1, 2, . . . , n} and B = {n + 1, n +
2, . . .}, so that the transition rates between the sets are non-zero only from state n to state n + 1
and in the opposite direction, i.e. qn,n+1 = λ and qn+1,n = µ.
Using Little's result, write down an equation for the average time from the transition
n → n + 1 to the moment the system returns back to set A (transition n + 1 → n), i.e. for the
average time the system spends in set B.

Solution:

Define a new system to which a "customer" arrives when the original system moves from state n to
state n + 1, i.e. a transition A → B is interpreted as an arrival of a customer in a new system.
Similarly, the transition B → A in the original system is interpreted as a departure of the customer
from the new system.

Little's formula: N = λ′ · T, where λ′ is the arrival rate of customers.
In the new system the average number of customers is clearly

N = 1 · P{B} = Σ_{i=n+1}^{∞} πi.

Similarly, the average sojourn time T in the new system is the time from event A → B to event B → A,
i.e. the quantity asked for. The arrival intensity of the customers, λ′, is

λ′ = λ πn.

Thus,

T = N/λ′ = Σ_{i=n+1}^{∞} πi / (λ πn) = (1 − Σ_{i=1}^{n} πi) / (λ πn).


EX 4: Loss Systems

1. Consider a car-inspection center where cars arrive at a rate of one every 50 seconds and wait
for an average of 15 minutes (inclusive of inspection time) to receive their inspections. After an
inspection, 20 % of the car owners stay back an average of 10 minutes having their meals at
the center’s cafeteria. What is the average number of cars within the premises of the inspection
center (inclusive of the cafeteria)?

Solution:

Little’s theorem: N = λW .
The average time spent in the system is W = 15min + 0.2 · 10min = 17min, and thus,
1 6
N= · 17min = · 17min ≈ 20.4.
50s 5min
2. Consider a communications link with unbounded capacity. At t = 0 the link is empty and
calls start arriving according to Poisson process with the intensity λ. The call holding times
are assumed to be independent and exponentially distributed with the mean 1/µ. Consider the
number of on-going calls Nt at the time instant t ≥ 0. Determine the distribution of Nt and,
in particular, its mean as a function of time t. Hint: The number of on-going calls equals the
number of arrivals between (0, t) from a certain inhomogeneous Poisson process, which can be
obtained from the original Poisson process with a suitable random selection.

Solution:

Figure 3: Random selection from a Poisson process.

The original process is a Poisson process with the intensity λ. We make a random selection of the
original process by selecting calls with the probability that they are still in the system at the time t. In
other words,

p(x) = P{call arriving at time x is still in the system at time t} = e−µ(t−x) .

The resulting inhomogeneous Poisson process² has the intensity

λ₂(x) = λ p(x) = λ e^(−µ(t−x)).

According to the lecture notes, the number of arrivals on an interval (0, t) in an inhomogeneous Poisson
process obeys the Poisson distribution with the parameter

a(t) = ∫₀ᵗ λ e^(−µ(t−x)) dx = λ e^(−µt) [e^(µx)/µ]₀ᵗ = (λ/µ) e^(−µt) (e^(µt) − 1) = (λ/µ)(1 − e^(−µt)).

The mean of the distribution is the parameter itself, so the mean number of active calls at time t is
(λ/µ)(1 − e^(−µt)), which has the limit value λ/µ, as it should.

² The arrival intensity is a deterministic function of time.


3. A mail order company has 3 persons serving incoming calls. The calls arrive according to a
Poisson process with intensity of 1/min and the average call duration is 2 min.

a) What is the probability that an incoming call is blocked, when blocked calls are lost?
b) Is it profitable to hire a 4th person if the total expenses per person are 100 €/h and the
average revenue per order is 20 €?

Solution:

The system is Erlang’s loss system with 3 or 4 servers (M/M/K/K-system, where K = 3 or 4). The
offered load a is a = λ · x = 1/min · 2 min = 2. Because we are considering a Poisson process, the
call blocking and time blocking are the same.

a) The probability of state 3 (K = 3) can be obtained by calculating, or approximately from a graph:

E(3, a) = (a³/3!) / (1 + a/1! + a²/2! + a³/3!) = 4/19 ≈ 0.211

b) In this case K = 4. The fourth person is useful when there are 4 persons in service. The
probability of the system being in state 4 is

E(4, a) = (a⁴/4!) / (1 + a/1! + a²/2! + a³/3! + a⁴/4!) = 2/21 ≈ 0.095

The blocked traffic decreases, i.e. the served traffic grows, by the amount λ · (E(3, a) − E(4, a)).
Thus, the net change is

Δ = λ · (E(3, a) − E(4, a)) · 20 € − 100 €/h ≈ 38.3 €/h.

Hence, the 4th person is worth hiring.

4. Use recursive Erlang’s blocking formula to calculate E(n, 6) for n = 0, . . . , 6.

Solution:

Recursive Erlang blocking formula: E(n, a) = a · E(n − 1, a) / (n + a · E(n − 1, a)), with E(0, a) = 1.
Applying the recursion with a = 6 gives:

E(0, 6) = 1,
E(1, 6) = 6/(1 + 6) = 6/7,
E(2, 6) = (6 · 6/7)/(2 + 6 · 6/7) = 18/25,
E(3, 6) = (6 · 18/25)/(3 + 6 · 18/25) = 36/61,
E(4, 6) = (6 · 36/61)/(4 + 6 · 36/61) = 54/115,
E(5, 6) = (6 · 54/115)/(5 + 6 · 54/115) = 324/899,
E(6, 6) = (6 · 324/899)/(6 + 6 · 324/899) = 324/1223.

5. Consider Erlang’s loss system with n servers, where the intensity of the offered traffic is a.

a) Show by direct calculations based on the steady state probabilities, that the mean number of
customers in the system N is equal to (1−E(n, a))a. Explain this by using Little’s theorem.
b) By using Little’s theorem, deduce the mean time the system stays at state N = n at a time.
(Hint: How often the system moves to state N = n? The mean number of customers in
a given state is the same as steady state probability of the same state; the system either is
in that particular state or not.) Deduce the same result directly from the properties of the
exponential distribution.


Solution:

a) For the steady state distribution of Erlang's blocking system it holds that

πk = (a^k/k!) / Σ_{j=0}^{n} a^j/j!

For N one gets

N = Σ_{k=0}^{n} k πk = Σ_{k=0}^{n} k (a^k/k!) / Σ_{j=0}^{n} a^j/j!
  = Σ_{k=1}^{n} (a^k/(k − 1)!) / Σ_{j=0}^{n} a^j/j! = a Σ_{k=0}^{n−1} (a^k/k!) / Σ_{j=0}^{n} a^j/j!
  = (a Σ_{k=0}^{n} a^k/k! − a^(n+1)/n!) / Σ_{j=0}^{n} a^j/j!
  = a − a · (a^n/n!) / Σ_{j=0}^{n} a^j/j! = a (1 − E(n, a)).

b) Define first a new stochastic process Yt as follows. Let Xt denote the state of the original system at
time t and let Yt denote the "occupancy" of state n:

Yt = 1, when Xt = n; Yt = 0 otherwise.

Then a transition to state n in the original process Xt corresponds to an "arrival" in Yt, and similarly
a transition away from state n corresponds to a departure in Yt. In equilibrium the "arrival rate" into
state n is λ_Y = πn−1 λ. On the other hand, the average "occupancy" of process Yt is

N_Y = 0 · (1 − πn) + 1 · πn = πn.

Figure 4: Transitions to and from state n.

For this we can apply Little's result, N = λW:

πn = πn−1 λ · W_Y

W_Y = πn / (πn−1 λ) = [(a^n/n!) / (a^(n−1)/(n−1)!)] · (1/λ) = (a/n)(1/λ) = 1/(nµ).

On the other hand, the sojourn time in state n is exponentially distributed with parameter nµ (there
are no higher states), and thus the mean sojourn time must equal W_Y = 1/(nµ).

6. Consider 2 × 1 and 4 × 2 concentrators, where calls arrive at each input port according to
independent Poisson processes with intensity γ. The mean call holding time is denoted by 1/µ
and the offered load per input by â = γ/µ = 0.1. Compare in these two concentrators the probabilities
that a call arriving at a free input port is blocked because all the output ports are busy.

Solution:

The system corresponds to an Engset’s system with n sources and s servers, 0 ≤ s ≤ n. Let πj [n],
j = 0, . . . , s, denote the steady state probabilities. Similarly, let πj∗ [n], j = 0, . . . , s, denote the state
probabilities seen by an arriving customer.


Time blocking in Engset's system is generally the same as the steady state probability of state s,

πs[n] = (n choose s) â^s / Σ_{k=0}^{s} (n choose k) â^k
      = (n choose s) p^s (1 − p)^(n−s) / Σ_{k=0}^{s} (n choose k) p^k (1 − p)^(n−k),   where p = â/(1 + â).

Similarly, for call blocking it holds that

πs*[n] = πs[n − 1] = (n−1 choose s) â^s / Σ_{k=0}^{s} (n−1 choose k) â^k.

Substituting n and s (p ≈ 0.091) gives

n   s   time blocking   call blocking
2   1   16.7 %          9.1 %
4   2   4.1 %           2.3 %
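The table values can be reproduced directly from the formulas; the function names below are this sketch's own:

```python
from math import comb

def engset_time_blocking(n: int, s: int, a_hat: float) -> float:
    """pi_s[n]: steady-state probability of state s in an Engset system."""
    denom = sum(comb(n, k) * a_hat**k for k in range(s + 1))
    return comb(n, s) * a_hat**s / denom

def engset_call_blocking(n: int, s: int, a_hat: float) -> float:
    """Blocking seen by an arriving call: the same formula with n - 1 sources."""
    return engset_time_blocking(n - 1, s, a_hat)

for n, s in ((2, 1), (4, 2)):
    print(n, s, engset_time_blocking(n, s, 0.1), engset_call_blocking(n, s, 0.1))
```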


EX 5: Queuing Systems

1. Assume that there is always one doctor at a clinic (24 hours a day) and that customers arrive
according to a Poisson process with intensity λ. The service time of each customer obeys an
exponential distribution with mean 30 minutes. Determine the maximum arrival intensity that
allows 95% of the customers to be served within 72 hours from (the first) arrival.

Solution:

For the M/M/1 queue we know that (see lectures)

P{W > t} = ρ · e^(−(µ−λ)t),

which now (with µ = 2 per hour and t = 72 h) yields the constraint

P{W > 72} = (λ/2) · e^(−(2−λ)·72) ≤ 5/100
⇔ λ e^(72λ) ≤ e^(144)/10.

The maximum λ is obtained from the corresponding equality, which can be solved numerically (it may
be wise to take the logarithm of both sides first). The solution is

λmax ≈ 1.9587 customers per hour.
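Taking logarithms gives ln λ + 72λ = 144 − ln 10, which a simple bisection solves (any root finder would do):

```python
from math import log

target = 144 - log(10)          # solve ln(lam) + 72*lam = 144 - ln 10
lo, hi = 0.1, 2.0               # the root lies below mu = 2
for _ in range(80):
    mid = (lo + hi) / 2
    if log(mid) + 72 * mid < target:
        lo = mid
    else:
        hi = mid
print(lo)   # ≈ 1.9587 customers per hour
```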

2. Show that in an M/G/1-LIFO queue the equilibrium distribution of the queue length, πn =
(1 − ρ)ρ^n, is insensitive to the holding time distribution. Hint: Reason that a customer who
upon arrival finds j − 1 customers in the system will be served exclusively when the system is in
state N = j. Thus, the time the system spends in the state N = j is completely composed of the
full service times of the customers who arrive to the system in the state N = j − 1. How many
such arrivals occur in a long interval of time T?

Solution:

During a long time interval T there are on average λπj−1 T arrivals into the state j (arrivals that find
the system in state j − 1). Because those customers are served only when the system is in state j, the
system spends in that state on average the time λπj−1 T E[S], where E[S] is the average service time
of one customer. This must equal T πj, and it follows that

T πj = λπj−1 T E[S]  ⇒  πj = ρ πj−1,

which is the same recursion as for the ordinary M/M/1 queue.

3. Consider an M/M/1/K queue with states 0, 1, . . . , K. Find the probability Pn that the queue,
which initially is in state n, becomes empty before overflowing. Hint: Add an imaginary state
K + 1 to the system; a transition to the state K + 1 corresponds to the queue overflowing. We
have P0 = 1 and PK+1 = 0. For states n = 1, . . . , K, write the probability Pn in terms of
Pn−1 and Pn+1. Solve the equations.

Solution:

State n is followed by state n − 1 with probability µ/(λ + µ) and by state n + 1 with probability
λ/(λ + µ). By the Markov property, conditioning on the next state yields

Pn = (µ/(λ + µ)) Pn−1 + (λ/(λ + µ)) Pn+1  ⇔  λ(Pn − Pn+1) = µ(Pn−1 − Pn).


Defining Dn = Pn − Pn+1 allows writing this in the form

λ Dn = µ Dn−1  ⇒  Dn−1 = ρ Dn  ⇒  D_(K−i) = ρ^i DK.

On the other hand, we have

Σ_{i=0}^{K−n} D_(K−i) = (PK − PK+1) + (PK−1 − PK) + . . . + (Pn+1 − Pn+2) + (Pn − Pn+1)
                      = Pn − PK+1 = Pn,

so that

Pn = DK Σ_{i=0}^{K−n} ρ^i,

which, written for P0, allows determining DK:

DK Σ_{i=0}^{K} ρ^i = P0 = 1  ⇔  DK = (1 − ρ)/(1 − ρ^(K+1)) for ρ ≠ 1,  DK = 1/(K + 1) for ρ = 1.

4. Customers arrive at a two-server system according to a Poisson process having rate λ = 5/min.
An arrival finding server 1 free will begin service with that server. An arrival finding server 1
busy and server 2 free will enter service with server 2. An arrival finding both servers busy goes
away. Once a customer is served by either server, he departs the system. The service times of the
servers are exponential with rates µ1 = 4/min and µ2 = 2/min. a) What is the average time an
entering customer spends in the system? b) What proportion of time is server 2 busy?

Solution:

From the state diagram one obtains the following balance equations (states: 0 empty, 1 = only
server 1 busy, 2 = only server 2 busy, 3 = both busy):

2(π2 + π3) = 5π1     (5.1)
4(π1 + π3) = 5(π0 + π2)     (5.2)
2π2 + 4π1 = 5π0     (5.3)

Furthermore, the normalization condition,

π0 + π1 + π2 + π3 = 1     (5.4)

By combining (5.2) and (5.4) one gets³

4(π1 + π3) = 5(1 − (π1 + π3))
π1 + π3 = 5/9  ⇒  π0 + π2 = 4/9.

Similarly (5.1) gives

2(π2 + π3) = 5π1
2 − 2(π0 + π1) = 5π1  ⇒  2π0 + 7π1 = 2,

³ Server 1 by itself behaves like a normal M/M/1/1 system.


and from (5.3) and (5.4),

2π2 + 4π1 = 5π0
2(π0 + π2) + 4π1 = 7π0
8/9 + 4π1 = 7π0
16 + 72π1 = 126π0
16 + 72π1 = 126 − 441π1  ⇒  π1 = 110/513.

From this it follows that

π0 = 128/513,  π2 = 100/513  and  π3 = 175/513.

a) The average time spent in the system (customers entering in states 0 or 2 are served by server 1,
those entering in state 1 by server 2):

T = [ ((128 + 100)/513) · 1/4 + (110/513) · 1/2 ] / [ (128 + 100)/513 + 110/513 ]
  = (57 + 55)/(228 + 110) = 112/338 = 56/169 ≈ 0.33 min.

b) The proportion of time server 2 is busy:

π2 + π3 = 275/513 ≈ 0.536.
5. Customers arrive at an M/Erlang(k, µ)/1 system according to a Poisson process with rate λ.
Find the mean waiting and sojourn times of a customer in the system.

Solution:

An Erlang(k, µ)-distributed random variable is a sum of k Exp(µ) random variables. Thus,

E[X] = k/µ,
V[X] = k/µ²,
E[X²] = V[X] + E[X]² = k/µ² + k²/µ² = (k + k²)/µ².

In this case we have an M/G/1 queue, so we can apply the Pollaczek-Khinchin mean value formulas.
The offered load is ρ = λ E[X] = kλ/µ, and thus the mean sojourn time is

T = E[X] + λ E[X²]/(2(1 − ρ)) = k/µ + λ k(k + 1)/µ² / (2(1 − kλ/µ))
  = (k/µ) (1 + λ(k + 1)/(2(µ − kλ))).

Similarly, the mean waiting time is

W = λ E[X²]/(2(1 − ρ)) = λ k(k + 1)/µ² / (2(1 − kλ/µ)) = λk(k + 1)/(2µ(µ − kλ)).

6. If in a single-server system each customer has to pay a fee to the system according to some rule,
then the average revenue rate of the system = λ · (average fee), where λ is the mean rate of
arriving customers.
Apply this to the M/G/1 system with the following charging rule: each customer in the system
pays at a rate equal to the customer's remaining service time. What is the average fee?
Show, by equating the above average revenue rate with the average charging rate (time charging),
that

E[W] = λ (E[X] E[W] + E[X²]/2),

where W and X are the waiting and service times. Solve for E[W]. What is this result?


Solution:

Let

W = unfinished work in the queue when a customer arrives, and
U = unfinished work in the queue at an arbitrary moment.

We have an M/G/1 queue and the fee rate equals the remaining service time. The average fee is

E[X]·W + E[X²]/2,

where X and W are independent random variables (during the whole waiting time the fee rate is constant and equal to the service time, and during service the fee rate drops linearly to zero). Thus the average revenue rate is λ(E[X]·W + E[X²]/2). On the other hand, the average charging rate equals the mean unfinished work in the system, which is the same as the mean virtual waiting time, i.e. the waiting time of a customer who would arrive at the given moment of time. It follows from the PASTA property that the real waiting times obey the same distribution, i.e. U ∼ W and E[U] = W. Thus,

W = E[U] = λ(E[X]·W + E[X²]/2),

from which it can be deduced that

W(1 − λE[X]) = λE[X²]/2  ⇒  W = λE[X²]/(2(1 − λE[X])) = λE[X²]/(2(1 − ρ)),

which is the Pollaczek-Khinchin mean value formula.
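The derived formula can also be checked by simulation; a sketch (not part of the original solution) that estimates W for the M/M/1 special case with the Lindley recursion W_{n+1} = max(0, W_n + S_n − A_{n+1}) and compares it with the P-K value:

```python
import random

def pk_wait(lam, es, es2):
    """P-K mean waiting time W = lam * E[X^2] / (2 * (1 - rho))."""
    return lam * es2 / (2 * (1 - lam * es))

def lindley_mm1(lam, mu, n, seed=1):
    """Mean waiting time of an M/M/1 queue estimated from n customers."""
    rng = random.Random(seed)
    w = total = 0.0
    for _ in range(n):
        total += w
        s = rng.expovariate(mu)    # this customer's service time
        a = rng.expovariate(lam)   # interarrival time to the next customer
        w = max(0.0, w + s - a)    # Lindley recursion
    return total / n

lam, mu = 0.5, 1.0
exact = pk_wait(lam, 1 / mu, 2 / mu**2)   # exponential: E[X^2] = 2/mu^2
estimate = lindley_mm1(lam, mu, 200_000)
```

For λ = 0.5, µ = 1 the exact value is W = 1.0, and the simulated mean should land close to it.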


EX 6: Priority queues, queuing networks

1. Carloads of customers arrive at a single-server station in accordance with a Poisson process with rate 4 per hour. The service times are exponentially distributed with mean 3 min. If each carload contains either 1, 2, or 3 customers with respective probabilities 1/4, 1/2, 1/4, compute the average customer waiting time in the queue. Hint: The waiting time of the first customer of each group can be obtained from an appropriate M/G/1 queue. Consider separately the "internal" waiting time in the group.

Solution:

Consider first the whole batch as one unit; the waiting time inside the batch is added afterwards. For the whole batch one can apply the M/G/1 queue model. It was given that

λ = 4/h = 1/15 per min, and 1/µ = 3 min.

The service time of a batch is

S = X1 + . . . + XN, where N is 1, 2 or 3.

For the number of customers, N, in one group it holds that

E[N] = 1 · (1/4) + 2 · (1/2) + 3 · (1/4) = 2,
V[N] = E[(N − E[N])²] = 1 · (1/4) + 0 · (1/2) + 1 · (1/4) = 1/2,

from which, by using the conditioning rule (tower property), one gets for the service time S that

E[S] = E[E[S|N]] = E[N · 3 min] = 6 min,
V[S] = E[V[S|N]] + V[E[S|N]] = E[N · 9 min²] + V[N · 3 min] = 18 min² + 9/2 min² = 45/2 min²,
E[S²] = V[S] + E[S]² = 45/2 min² + 36 min² = 117/2 min².
The average waiting time of a batch can be obtained by using the Pollaczek-Khinchin formula,

E[Wg] = λE[S²] / (2(1 − λE[S])) = ((1/15) · (117/2)) / (2(1 − 6/15)) min = (117/30)/(6/5) min = 13/4 min = 3 min 15 s.

In addition, a customer sometimes has to wait within its own batch. This is easiest to obtain by determining the total waiting time inside one batch and then dividing it by the average size of the batch:

E[Wc] = (1/4 · 0 + 1/2 · 1 + 1/4 · 3) · 3 min / 2 = 15/8 min = 1 min 52.5 s.

(Note: when the batch size is 3, the total waiting time inside the batch is (0 + 1 + 2) · 3 min = 9 min.)
By adding the waiting times together one gets the total average waiting time of a customer,

E[W] = E[Wg] + E[Wc] = 13/4 min + 15/8 min = 41/8 min = 5 min 7.5 s.
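The same numbers can be reproduced with exact rational arithmetic; a sketch following the steps above (variable names are mine):

```python
from fractions import Fraction as F

p = {1: F(1, 4), 2: F(1, 2), 3: F(1, 4)}   # batch-size distribution
x = F(3)                                   # mean service time, minutes
lam = F(1, 15)                             # batch arrival rate, 1/min

en = sum(n * pn for n, pn in p.items())               # E[N] = 2
vn = sum((n - en) ** 2 * pn for n, pn in p.items())   # V[N] = 1/2

es = en * x                                # E[S] = 6 min
vs = en * x**2 + vn * x**2                 # E[V[S|N]] + V[E[S|N]] = 45/2
es2 = vs + es**2                           # E[S^2] = 117/2 min^2

wg = lam * es2 / (2 * (1 - lam * es))      # P-K batch wait = 13/4 min
# Customer j (j = 1..n) of a batch first waits j-1 service times; divide
# the batch total by the mean batch size to get the per-customer mean.
wc = sum(pn * sum(range(n)) for n, pn in p.items()) * x / en
w = wg + wc                                # total = 41/8 min = 5 min 7.5 s
```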

2. Persons arrive at a copying machine according to a Poisson process with rate 1/min. The number
of copies to be made by each person is uniformly distributed between 1 and 10. Each copy takes
3 s. Find the average waiting time in the queue when

a) Each person uses the machine on a first-come first-served basis.


b) Persons with no more than 2 copies to make are given non-preemptive priority over other
persons.


Solution:

a) We apply the P-K formula. Now S = X · 3 s, where X is the number of copies needed. We have

E[S] = E[X] · 3 s = (10 + 1)/2 · 3 s = 33/2 s,
E[S²] = E[X²] · 9 s² = (1/10) · Σ_{i=1}^{10} i² · 9 s² = (77 · 9)/2 s².

The load of the system is

ρ = λ · E[S] = (1/60) · (33/2) = 11/40,

and the average waiting time of a customer is W = R/(1 − ρ), where R = λE[S²]/2 = 77 · 9/(2 · 2 · 60) s = 231/80 s is the average amount of unfinished work (residual service time) in the server. Thus,

W = (231/80) / (1 − 11/40) s = 231/58 s ≈ 3.98 s.

b) Customers form two priority classes (m = 0, n = 2 and m = 2, n = 8):

            λk            E[Sk]                      ρk
  class 1   1/300 per s   (9/2) s                    3/200
  class 2   4/300 per s   3 · (2 + 9/2) s = 39/2 s   156/600 = 13/50

In the prioritized system, the class-specific mean waiting times are

W1 = R/(1 − ρ1) ≈ 2.93 s,
W2 = R/((1 − ρ1)(1 − ρ1 − ρ2)) = W1 · 1/(1 − ρ1 − ρ2) ≈ 4.04 s.

(R is the same as above; the order in which customers are served does not affect the average residual work.) The average waiting time is

W = (λ1·W1 + λ2·W2)/(λ1 + λ2) ≈ 3.82 s.
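Both parts can be verified with exact arithmetic; a sketch (variable names are mine):

```python
from fractions import Fraction as F

lam = F(1, 60)                             # persons per second
c = 3                                      # seconds per copy

es = F(sum(range(1, 11)), 10) * c          # E[S] = 33/2 s
es2 = F(sum(i * i for i in range(1, 11)), 10) * c * c  # E[S^2] = 693/2 s^2
R = lam * es2 / 2                          # residual work = 231/80 s

# a) FCFS:
w_fcfs = R / (1 - lam * es)                # = 231/58 s

# b) Non-preemptive priority; class 1 = 1..2 copies, class 2 = 3..10 copies.
lam1, lam2 = lam * F(2, 10), lam * F(8, 10)
s1 = F(1 + 2, 2) * c                       # E[S1] = 9/2 s
s2 = F(sum(range(3, 11)), 8) * c           # E[S2] = 39/2 s
rho1, rho2 = lam1 * s1, lam2 * s2
w1 = R / (1 - rho1)
w2 = R / ((1 - rho1) * (1 - rho1 - rho2))
w_mean = (lam1 * w1 + lam2 * w2) / (lam1 + lam2)
```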

3. Consider a priority queue with two classes and preemptive resume priority. Customers arrive according to two independent Poisson processes with intensities λ1 and λ2. Service times in both classes are independent and exponentially distributed with a common mean 1/µ. Determine the mean sojourn times T1 and T2 for both classes.

Solution:

For a preemptive priority queue it holds that (cf. lecture notes)

T1 = ((1 − ρ1)·E[S1] + R1) / (1 − ρ1),
T2 = ((1 − ρ1 − ρ2)·E[S2] + R2) / ((1 − ρ1)(1 − ρ1 − ρ2)),

21
S-38.3143 Queueing Theory, II/2007 Virtamo / Penttinen

where Rk = (1/2) · Σ_{i=1}^{k} λi · E[Si²]. In this case,

E[S1] = E[S2] = 1/µ,     ρ1 = λ1/µ,
E[S1²] = E[S2²] = 2/µ²,  ρ2 = λ2/µ,

and thus,

R1 = (1/2) · λ1 · 2/µ² = λ1/µ²,
R2 = (1/2) · (λ1 · 2/µ² + λ2 · 2/µ²) = (λ1 + λ2)/µ²,

from which one obtains

T1 = ((1 − λ1/µ)(1/µ) + λ1/µ²) / (1 − λ1/µ) = ((µ − λ1 + λ1)/µ²) · (µ/(µ − λ1)) = 1/(µ − λ1),

T2 = ((1 − (λ1 + λ2)/µ)(1/µ) + (λ1 + λ2)/µ²) / ((1 − λ1/µ)(1 − (λ1 + λ2)/µ))
   = ((µ − λ1 − λ2 + λ1 + λ2)/µ²) · (µ²/((µ − λ1)(µ − λ1 − λ2)))
   = µ/((µ − λ1)(µ − λ1 − λ2)).

Note that T1 is equal to the mean sojourn time in an M/M/1 queue, as it should be: under preemptive priority the high-priority class does not see the low-priority customers at all.
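The formulas can be checked numerically; a sketch (function name is mine) that verifies both the M/M/1 reduction for T1 and the closed form for T2:

```python
def preemptive_resume_means(lam1, lam2, mu):
    """Mean sojourn times (T1, T2) for a two-class preemptive-resume
    priority queue with Exp(mu) service in both classes."""
    s, s2 = 1 / mu, 2 / mu**2          # E[S], E[S^2] of Exp(mu)
    rho1, rho2 = lam1 / mu, lam2 / mu
    r1 = lam1 * s2 / 2                 # R1 = (1/2) lam1 E[S^2]
    r2 = (lam1 + lam2) * s2 / 2        # R2 = (1/2)(lam1 + lam2) E[S^2]
    t1 = ((1 - rho1) * s + r1) / (1 - rho1)
    t2 = ((1 - rho1 - rho2) * s + r2) / ((1 - rho1) * (1 - rho1 - rho2))
    return t1, t2

# Example: mu = 3, lam1 = lam2 = 1 gives T1 = 1/2 and T2 = 3/2.
t1, t2 = preemptive_resume_means(1.0, 1.0, 3.0)
```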

4. The Pollaczek-Khinchin formula for the Laplace transform of the waiting time W is

W*(s) = s(1 − ρ) / (s − λ + λS*(s)),

where S*(s) is the Laplace transform of the service time S and ρ = λE[S]. Using this result, rederive the P-K mean value formula for the waiting time.

Solution:

The mean of the random variable W is

W = − (d/ds) W*(s) |s=0 .

Similarly, for the service time S it holds that

E[S] = − (d/ds) S*(s) |s=0   and   E[S²] = (d²/ds²) S*(s) |s=0 .

The Taylor series of S*(s) around s = 0 is

S*(s) ≈ S*(0) + (d/ds) S*(s)|s=0 · s + (1/2) (d²/ds²) S*(s)|s=0 · s² = 1 − E[S]·s + (1/2) E[S²]·s².

Thus,

W*(s) = s(1 − ρ) / (s − λ + λS*(s)) = s(1 − ρ) / (s − λ + λ(1 − E[S]·s + (1/2) E[S²]·s²))
      = s(1 − ρ) / (s(1 − ρ) + (1/2) λE[S²]·s²)
      = 1 / (1 + (1/2)·(λE[S²]/(1 − ρ))·s),

from which it follows that

W = − (d/ds) W*(s) |s=0 = (1/2) · λE[S²]/(1 − ρ).
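The differentiation step can be checked numerically for exponential service, where S*(s) = µ/(µ + s); a sketch (names are mine) using a central finite difference at s = 0:

```python
def w_star(s, lam, mu):
    """P-K waiting-time transform for exponential service,
    S*(s) = mu/(mu + s)."""
    rho = lam / mu
    return s * (1 - rho) / (s - lam + lam * mu / (mu + s))

def mean_wait_from_transform(lam, mu, h=1e-4):
    """W = -dW*/ds at s = 0, approximated by a central difference."""
    return -(w_star(h, lam, mu) - w_star(-h, lam, mu)) / (2 * h)

lam, mu = 1.0, 2.0
w_num = mean_wait_from_transform(lam, mu)
w_pk = lam * (2 / mu**2) / (2 * (1 - lam / mu))   # P-K: lam*E[S^2]/(2(1-rho))
```

For λ = 1, µ = 2 both routes give W = ρ/(µ − λ) = 0.5.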


5. Queues 1 and 2 of the open Jackson queueing network depicted in the figure receive Poissonian
arrival streams with rates 2 and 1 (customers/s). Service times are exponentially distributed
with the given rates (customers/s). Calculate a) customer streams through each of the queues, b)
average occupancies of the queues and the average total number of customers in the network, c)
mean delays in the network of customers arriving at queues 1 and 2 as well as the delay of an
arriving customer chosen at random.
[Figure: queue 1 (rate 3, external arrivals at rate 2) and queue 2 (rate 6, external arrivals at rate 1) both feed queue 3 (rate 10); departures from queue 3 leave the network with probability 1/2 and return to queue 2 with probability 1/2.]

Solution:

a) Let the traffic stream fed back into the network be x customers per second. Since the feedback is half of the stream through queue 3, the stream through queue 3 is 2x, of which the other half, x, leaves the network. The flows into and out of the network must be equal, so x = 2 + 1 = 3. Hence,

λ1 = 2, λ2 = 4, λ3 = 6   ⇒   ρ1 = 2/3, ρ2 = 2/3, ρ3 = 3/5.
b) Jackson's theorem states that the queues of the network behave as if they were independent and each offered Poissonian traffic, i.e.

P{N = n} = Π_i P{Ni = ni},  where  P{Ni = ni} = (1 − ρi)·ρi^ni.

In this problem,

P{N1 = i} = P{N2 = i} = (1/3)·(2/3)^i,
P{N3 = i} = (2/5)·(3/5)^i,

so the average numbers of customers in the queues are

N1 = N2 = ρ1/(1 − ρ1) = 2,
N3 = ρ3/(1 − ρ3) = 3/2,
⇒ N = N1 + N2 + N3 = 11/2.

c) The average sojourn times at each network node are

Ti = Ni/λi   ⇒   T1 = 1, T2 = 1/2, T3 = 1/4.

For a customer arriving at queue i, the total remaining delay in the network is Ti,d, for which it holds that

Ti,d = Ti + Σ_j qij·Tj,d,

where qij is the probability that a customer leaving queue i enters queue j. Denote (T1,d, T2,d, T3,d) = (a, b, c). Then the set of linear equations can be written as

a = 1 + c
b = 1/2 + c    ⇒   T1,d = 2, T2,d = 3/2, T3,d = 1.
c = 1/4 + b/2


Furthermore, a randomly chosen arriving customer stays in the network on average (by Little's result)

T = N/λ = (11/2)/3 = 11/6.
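The whole exercise can be reproduced programmatically; a sketch (not from the original solution) that solves the traffic and downstream-delay equations by fixed-point iteration, with 0-based indices 0-2 for queues 1-3:

```python
mu = [3.0, 6.0, 10.0]        # service rates of queues 1, 2, 3
gamma = [2.0, 1.0, 0.0]      # external arrival rates
q = [[0.0, 0.0, 1.0],        # queue 1 -> queue 3
     [0.0, 0.0, 1.0],        # queue 2 -> queue 3
     [0.0, 0.5, 0.0]]        # queue 3 -> queue 2 w.p. 1/2, leaves w.p. 1/2

n = 3
# a) Traffic equations lam_i = gamma_i + sum_j lam_j q_ji; the iteration
#    is a contraction (feedback factor 1/2), so it converges quickly.
lam = gamma[:]
for _ in range(200):
    lam = [gamma[i] + sum(lam[j] * q[j][i] for j in range(n)) for i in range(n)]

rho = [lam[i] / mu[i] for i in range(n)]
# b) By Jackson's theorem each queue behaves like an independent M/M/1 queue.
occ = [r / (1 - r) for r in rho]
n_total = sum(occ)                       # = 11/2
# c) Node sojourn times and downstream delays T_{i,d} = T_i + sum_j q_ij T_{j,d}.
t = [occ[i] / lam[i] for i in range(n)]
td = t[:]
for _ in range(200):
    td = [t[i] + sum(q[i][j] * td[j] for j in range(n)) for i in range(n)]
t_random = n_total / sum(gamma)          # Little: 11/6
```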
6. Consider a cyclic closed network consisting of two queues. The service times in the queues are
exponentially distributed with parameters µ1 and µ2 . There are three customers circulating in
the network. a) Draw the state transition diagram of the network (four states). b) Determine
the equilibrium probabilities and calculate the mean queue lengths. c) Calculate the customer
stream in the network (e.g. the customer stream departing from queue 1). d) Rederive the results
of b) and c) by means of the mean value analysis (MVA).

Solution:

[Figure 5: State diagram of the system. States (3,0), (2,1), (1,2), (0,3), where (n1, n2) gives the numbers of customers in queues 1 and 2; transitions to the right occur at rate µ1 and to the left at rate µ2.]

a) State diagram of the system is depicted in Fig. 5.


b) Denote ρ = µ1/µ2 and let πk be the equilibrium probability of the state in which queue 2 holds k customers. In steady state it holds that

µ1·π0 = µ2·π1   ⇒   π1 = ρ·π0,
µ1·π1 = µ2·π2   ⇒   π2 = ρ²·π0,
µ1·π2 = µ2·π3   ⇒   π3 = ρ³·π0.

Normalization:

(1 + ρ + ρ² + ρ³)·π0 = 1   ⇒   π0 = 1/(1 + ρ + ρ² + ρ³).
The average queue lengths are

N1 = (3 + 2ρ + ρ²)·π0 = (3 + 2ρ + ρ²)/(1 + ρ + ρ² + ρ³),
N2 = (ρ + 2ρ² + 3ρ³)·π0 = (ρ + 2ρ² + 3ρ³)/(1 + ρ + ρ² + ρ³).   (N1 + N2 = 3)

c) The traffic flow is

λ = (π0 + π1 + π2)·µ1 = µ1 · (1 + ρ + ρ²)/(1 + ρ + ρ² + ρ³) = µ1 · ((1 − ρ³)/(1 − ρ)) · ((1 − ρ)/(1 − ρ⁴)) = µ1 · (1 − ρ³)/(1 − ρ⁴).
d) Mean value analysis (MVA):

Ti[k] = (1 + Ni[k − 1])/µi,
Ni[k] = k · λi·Ti[k] / Σ_j λj·Tj[k],
λi[k] = Ni[k]/Ti[k].

Here the visit ratios λi are equal and cancel, and 1/µ2 = ρ/µ1, so

N[0] = [0, 0]
T[1] = (1/µ1)·[1, ρ]
N[1] = [1/(1 + ρ), ρ/(1 + ρ)]

T[2] = (1/µ1)·[(2 + ρ)/(1 + ρ), (ρ + 2ρ²)/(1 + ρ)]
N[2] = 2·[(2 + ρ)/(2 + 2ρ + 2ρ²), (ρ + 2ρ²)/(2 + 2ρ + 2ρ²)] = [(2 + ρ)/(1 + ρ + ρ²), (ρ + 2ρ²)/(1 + ρ + ρ²)]

T[3] = (1/(µ1(1 + ρ + ρ²)))·[3 + 2ρ + ρ², ρ + 2ρ² + 3ρ³]
N[3] = 3·[(3 + 2ρ + ρ²)/(3 + 3ρ + 3ρ² + 3ρ³), (ρ + 2ρ² + 3ρ³)/(3 + 3ρ + 3ρ² + 3ρ³)] = [3 + 2ρ + ρ², ρ + 2ρ² + 3ρ³]·π0


Similarly, the traffic flow becomes

λ = λ1[3] = N1[3]/T1[3] = ((3 + 2ρ + ρ²)/(1 + ρ + ρ² + ρ³)) · (µ1(1 + ρ + ρ²)/(3 + 2ρ + ρ²)) = µ1 · (1 + ρ + ρ²)/(1 + ρ + ρ² + ρ³) = µ1 · (1 − ρ³)/(1 − ρ⁴).
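The MVA recursion and the closed-form results of parts b)-c) can be cross-checked exactly; a sketch with example rates µ1 = 2, µ2 = 4 (so ρ = 1/2):

```python
from fractions import Fraction as F

def mva_cyclic(mu1, mu2, k_max):
    """MVA for a closed cyclic network of two exponential queues with equal
    visit ratios; returns (N1, N2, throughput) at population k_max."""
    n1 = n2 = lam = F(0)
    for k in range(1, k_max + 1):
        t1 = (1 + n1) / F(mu1)       # T_i[k] = (1 + N_i[k-1]) / mu_i
        t2 = (1 + n2) / F(mu2)
        lam = k / (t1 + t2)          # throughput (equal visit ratios cancel)
        n1, n2 = lam * t1, lam * t2  # Little's result per queue
    return n1, n2, lam

mu1, mu2 = 2, 4
n1, n2, lam = mva_cyclic(mu1, mu2, 3)

# Closed form from parts b) and c) with rho = mu1/mu2:
rho = F(mu1, mu2)
pi0 = 1 / (1 + rho + rho**2 + rho**3)
n1_exact = (3 + 2 * rho + rho**2) * pi0
lam_exact = mu1 * (1 + rho + rho**2) * pi0
```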

