Homework 3 Solutions
Problem 1.
Method 1: Refer to Figure 3.51 of the textbook.
Suppose the plot starts at point A. With AIAD, the operating point can only move back and forth along line AB, which is parallel to the equal bandwidth share line; its motion has no component along the full bandwidth utilization line, so the difference between the two connections' rates never changes. Hence AIAD generally will not converge to fairness.
With MIMD, the operating point can only move back and forth along line BC, so the ratio of the two connections' rates never changes, and MIMD will not converge to fairness either.
Method 2:
In the paper, convergence to fairness is defined as the fairness index approaching one as time goes to infinity (page 8):

F(x(t)) → 1 as t → ∞
For the linear control policies, the evolution of the fairness index can be written as

\[
F(x(t+1)) = F(x(t)) + \bigl(1 - F(x(t))\bigr)\left(1 - \frac{\sum_i x_i^2(t)}{\sum_i \bigl(c + x_i(t)\bigr)^2}\right)
\]

where c = a/b, with a the additive term and b the multiplicative term, and where convergence to fairness requires

\[
a_I \ge 0,\; b_I \ge 0, \qquad a_D \ge 0,\; 0 \le b_D < 1 .
\]
As we can see, AIAD does not converge to fairness because of the a_D term: an additive decrease subtracts a fixed amount, so a_D < 0 and the requirement a_D ≥ 0 is violated. For MIMD, the control policy is
\[
x_i(t+1) =
\begin{cases}
b_I\, x_i(t), & \text{if } y(t) = 0 \;(\text{increase}) \\
b_D\, x_i(t), & \text{if } y(t) = 1 \;(\text{decrease})
\end{cases}
\]

where b_I > 1 and 0 < b_D < 1.
Although these coefficients satisfy the non-negativity requirement, MIMD has no additive term, i.e. a = 0 and therefore c = a/b = 0. Substituting c = 0 into the fairness equation above makes the correction term vanish, since the sum in the numerator then equals the sum in the denominator. Hence F(x(t+1)) = F(x(t)): MIMD never changes the fairness index, so it does not converge to fairness.
// 10 points each for correct analysis of the AIAD and MIMD //
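As an illustration (an addition, not part of the original solution), the short simulation below tracks Jain's fairness index F(x) = (Σxᵢ)²/(n·Σxᵢ²) for two flows under AIAD, MIMD, and AIMD. The link capacity of 100 units, the starting rates, and the step and scaling parameters are illustrative assumptions; the expected outcome is that AIAD and MIMD remain stuck below a fairness index of 1 while AIMD approaches 1.

    # Sketch: two flows sharing an assumed capacity of 100 units.
    # y(t) = 1 (decrease) whenever the total demand exceeds the capacity.
    CAPACITY = 100.0

    def fairness(x):
        # Jain's fairness index
        return sum(x) ** 2 / (len(x) * sum(v * v for v in x))

    def simulate(increase, decrease, x, steps=200):
        for _ in range(steps):
            if sum(x) > CAPACITY:              # congestion: y(t) = 1
                x = [decrease(v) for v in x]
            else:                              # no congestion: y(t) = 0
                x = [increase(v) for v in x]
        return fairness(x)

    start = [10.0, 60.0]                       # two flows with unequal shares

    print("AIAD:", simulate(lambda v: v + 1.0, lambda v: max(v - 1.0, 0.0), start))
    print("MIMD:", simulate(lambda v: 1.1 * v, lambda v: 0.5 * v, start))
    print("AIMD:", simulate(lambda v: v + 1.0, lambda v: 0.5 * v, start))

With these starting rates, AIAD levels off around 0.8 and MIMD stays at its initial index of about 0.66, while AIMD ends close to 1.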
Problem 2. Concepts: RTT estimation
One example:
Let's suppose that the 4 sample RTTs are 1 sec, 6 sec, 20 sec, and 20 sec. The initial AverageRTT is 4 sec, the initial Deviation is 0, and the EWMA weight x is 0.125.
We are using the following formulas:
AverageRTT(k)=AverageRTT(k-1)*(1-x)+x*SampleRTT(k);
Deviation(k)=(1-x)*Deviation(k-1)+x*|SampleRTT(k)-AverageRTT(k-1)|;
Using method 1:
Timeout=2*AverageRTT(k);
Using method 2:
Timeout=AverageRTT(k)+4*Deviation(k);
Working through the four samples with these formulas gives the timeout values in the sketch below.
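The sketch below (an addition for checking the arithmetic, not part of the original solution) applies the two formulas to the four samples; the starting values and the weight are those from the example above.

    # Sketch: evaluate both timeout rules on the example samples above.
    samples = [1.0, 6.0, 20.0, 20.0]    # SampleRTT values in seconds
    avg, dev, x = 4.0, 0.0, 0.125       # initial AverageRTT, Deviation, EWMA weight

    for k, sample in enumerate(samples, start=1):
        dev = (1 - x) * dev + x * abs(sample - avg)   # Deviation(k) uses AverageRTT(k-1)
        avg = (1 - x) * avg + x * sample              # AverageRTT(k)
        print(f"k={k}: Timeout1={2 * avg:6.2f}  Timeout2={avg + 4 * dev:6.2f}")

Once the 20-second samples arrive, method 1's timeout (about 11.9 s and then 15.4 s) stays well below the observed RTTs and would expire prematurely, while method 2's timeout climbs to about 16.2 s and then 23.7 s.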
Method 1 generally leads to more spurious timeouts than method 2, which uses the deviation. That is because method 2 takes into account how much the sample RTTs deviate from the estimated RTT. Furthermore, method 2 puts more weight on the deviation of the most recent sample RTT from the estimated RTT, and the most recent samples best reflect the near-future behavior of the connection.
In other words, an estimator that uses only the average reflects only the static (central) tendency of the samples, while the deviation captures the dynamic behavior of the samples, such as the estimation errors. So method 2 gives better timeout estimates than method 1. //10 points//
Problem 3.
Here we are assuming that the T links are in series between the client and the server. In this case, a packet incurs (T-1) additional transmission times. Therefore, the time from when the server sends out the first segment until it receives the acknowledgement is:
T*(S/R) + RTT
The latency has four components:
1. 2*RTT: the time to set up the TCP connection and to request the file;
2. O/R: the time to transmit the object on the first link;
3. (T-1)*(S/R): the additional store-and-forward transmission times for the last packet;
4. the sum of the stall times.
**If students missed the third component and only have components 1, 2, and 4 for the latency, DO NOT DEDUCT ANY POINTS.***
Let Q be the number of times the server would stall if the object contained an infinite
number of segments.
Then

\[
Q = \left\lfloor \log_2\!\left( T + \frac{RTT}{S/R} \right) \right\rfloor + 1
\]
Define P = min{Q, K-1}. We then have

\[
\text{Stall time} = \sum_{k=1}^{P} \left( RTT + T\,\frac{S}{R} - 2^{\,k-1}\,\frac{S}{R} \right)
\]

Therefore,

\[
\text{Latency} = \frac{O}{R} + 2\,RTT + (T-1)\,\frac{S}{R} + \sum_{k=1}^{P} \left( RTT + T\,\frac{S}{R} - 2^{\,k-1}\,\frac{S}{R} \right)
             = \frac{O}{R} + 2\,RTT + (T-1)\,\frac{S}{R} + P\,RTT + \bigl( P\,T - (2^{P} - 1) \bigr)\,\frac{S}{R}
\]

(Equation 3.1)
//15 points for equation 3.1, partial credits for correct logic//.
\[
\text{Minimum latency} = \frac{O}{R} + 2\,RTT + (T-1)\,\frac{S}{R}
\]
Table 3.1 gives the values of both the minimum latency and the latency with slow start.

Table 3.1
O            R         RTT       Minimum latency   P   Latency with slow start
100 Kbytes   28 Kbps   0.1 sec   29.3840 sec       3   30.90914 sec
100 Kbytes   10 Mbps   0.1 sec   0.28172 sec       7   0.94227 sec
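For a quick numeric check (an addition, not part of the original solution), the sketch below evaluates Equation 3.1. The segment size S = 536 bytes and T = 5 links are assumed values chosen because they reproduce the first row of Table 3.1; they are not stated in this section.

    import math

    def slow_start_latency(O, R, S, T, rtt):
        # O, S in bits; R in bits/sec; rtt in seconds; T links in series.
        K = math.ceil(O / S)                                  # number of segments
        Q = math.floor(math.log2(T + rtt / (S / R))) + 1      # stalls for an infinite object
        P = min(Q, K - 1)                                     # number of stall periods
        minimum = O / R + 2 * rtt + (T - 1) * (S / R)         # minimum latency
        stall = P * rtt + (P * T - (2 ** P - 1)) * (S / R)    # sum of the stall times
        return minimum, P, minimum + stall

    # First row of Table 3.1 (S = 536 bytes and T = 5 are assumptions):
    print(slow_start_latency(O=100_000 * 8, R=28_000, S=536 * 8, T=5, rtt=0.1))
    # prints approximately (29.384, 3, 30.909)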
Problem 4.
(a):
During the period in which the connection's window grows from (W·MSS)/2 to W·MSS, the number of packets sent is

\[
N = \sum_{k=0}^{W/2}\left(\frac{W}{2} + k\right)
\]
//10 points//
\[
  = \frac{W}{2}\left(\frac{W}{2}+1\right) + \frac{1}{2}\cdot\frac{W}{2}\left(\frac{W}{2}+1\right)
  = \frac{3}{8}W^2 + \frac{3}{4}W
\]
//5 points//
Therefore,

\[
L = \frac{1}{\frac{3}{8}W^2 + \frac{3}{4}W}
\]
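As a sanity check (an addition, not part of the original solution), one can count the packets in a cycle directly and compare with the closed form; W = 20 below is an arbitrary even example value.

    # Count packets sent while the window grows from W/2 to W segments
    # (W/2 + k packets in round k) and compare with 3W^2/8 + 3W/4.
    W = 20
    N_direct = sum(W // 2 + k for k in range(W // 2 + 1))
    N_formula = 3 * W * W / 8 + 3 * W / 4
    print(N_direct, N_formula, 1 / N_formula)   # both give 165; the last value is the loss rate L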
(b).
The total time to transmit all the N packets (including the last one, which is lost) is (W/2 + 1)*RTT. Recall that here we are analyzing the idealized model for the steady-state dynamics of the TCP connection. // 5 points//
\[
B = \frac{MSS}{L\left(\frac{W}{2}+1\right)RTT}
  = \frac{MSS}{\sqrt{L}\;RTT\;\sqrt{L\left(\frac{W}{2}+1\right)^{2}}}
  = m\cdot\frac{MSS}{\sqrt{L}\;RTT}
\]

where

\[
m = \frac{1}{\sqrt{L\left(\frac{W}{2}+1\right)^{2}}}
  = \frac{1}{\sqrt{\dfrac{\frac{W^{2}}{4}+W+1}{\frac{3}{8}W^{2}+\frac{3}{4}W}}}
\]
If W is large, we can neglect the first- and zeroth-order terms in W in the expression for m above. Therefore,
\[
m \approx \frac{1}{\sqrt{\dfrac{\frac{W^{2}}{4}}{\frac{3}{8}W^{2}}}}
  = \sqrt{\frac{3}{2}} \approx 1.22
\]
//10 points//
Therefore,

\[
B \approx 1.22\cdot\frac{MSS}{\sqrt{L}\;RTT}
\]
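To confirm the constant numerically (an addition, not part of the original solution), the snippet below evaluates m = 1/sqrt(L·(W/2+1)²), using L from part (a), for increasingly large W.

    import math

    def m(W):
        L = 1 / (3 * W * W / 8 + 3 * W / 4)            # loss rate from part (a)
        return 1 / math.sqrt(L * (W / 2 + 1) ** 2)     # the factor m from part (b)

    for W in (10, 100, 1000, 10000):
        print(W, round(m(W), 4))
    # m(W) approaches sqrt(3/2) ≈ 1.2247 as W grows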