
assignment due June 27, 2016

CS / MCS 401 Homework 2 grader solutions written by Jānis Lazovskis


maximum points: 30

Questions from CLRS. Questions marked with an asterisk ∗ were not graded.

3.1-2 (p.52) Show that for any real constants a and b, where b > 0, (n + a)^b = Θ(n^b).

We claim that there exists n_0 such that for all n > n_0,

    (1/2)^b · n^b ≤ (n + a)^b ≤ (3/2)^b · n^b.    (1)

This comes from noting that for all large enough n,

    (1/2) n ≤ n + a ≤ (3/2) n.    (2)

Indeed, if n = 2a + ε for some ε > 0, then the above inequality becomes

    a + ε/2 ≤ 3a + ε ≤ 3a + (3/2)ε,

which is immediately true if a ≥ 0, and holds once ε ≥ 4|a| if a < 0. Hence raising all sides of inequality (2) to the positive power b (which preserves the directions of the inequalities, as everything is then positive), we get inequality (1), as desired. Everything is also bounded below by 0 for large enough n, therefore (n + a)^b is Θ(n^b). □
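As a quick numeric sanity check (not part of the graded solution), the Python 3 sketch below tests inequality (1) at a few sample points; the constants a, b and the cutoff 4|a| + 1 are arbitrary illustrative choices.

    # Sanity check for (n + a)^b = Θ(n^b): verify (1/2)^b n^b <= (n + a)^b <= (3/2)^b n^b
    # at a few large values of n, for arbitrarily chosen sample constants a and b.
    for a, b in [(5.0, 2.0), (-3.0, 0.5), (10.0, 3.0)]:
        n0 = 4 * abs(a) + 1                      # past this point the bounds should hold
        for n in (n0, 10 * n0, 1000 * n0):
            lower = (0.5 ** b) * n ** b
            upper = (1.5 ** b) * n ** b
            assert lower <= (n + a) ** b <= upper, (a, b, n)
    print("inequality (1) holds at all sampled points")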

3.1-4 (p.53) Is 2^{n+1} = O(2^n)? Is 2^{2n} = O(2^n)?

The first statement is true. Note that 2^{n+1} = 2 · 2^n, so using the constant c = 2 we see that 2^{n+1} ≤ 2 · 2^n for all n.

The second statement is false. Suppose that 2^{2n} = O(2^n), with constant c. Then for all n ≥ n_0, for some fixed n_0, we have that

    2^{2n} ≤ c · 2^n = 2^{log_2(c)} · 2^n = 2^{log_2(c) + n}.

However, since log_2(c) is finite, there is some N > log_2(c), for which it is immediate that 2^{2N} > 2^{log_2(c) + N}, contradicting the above. Therefore the statement is false. □
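As an informal illustration (not part of the graded solution), the short Python 3 check below prints the two ratios: the first stays at the constant 2, while the second grows without bound, so no single constant c can work for 2^{2n}.

    # 2^(n+1)/2^n is always 2, while 2^(2n)/2^n = 2^n is unbounded, so no constant c
    # satisfies 2^(2n) <= c * 2^n for all large n.
    for n in (1, 10, 20, 40):
        print(n, 2 ** (n + 1) / 2 ** n, 2 ** (2 * n) / 2 ** n)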

3.2-2 (p.60) Prove equation (3.16), which states that a^{log_b(c)} = c^{log_b(a)}.

Use equation (3.15), which states that log_x(y) = ln(y)/ln(x). Applying reversible operations to the equation above, we reach a true statement, and so the original statement is true. Indeed,

    a^{log_b(c)} = c^{log_b(a)}                         (given)
    ln(a^{log_b(c)}) = ln(c^{log_b(a)})                 (applying ln to both sides)
    log_b(c) · ln(a) = log_b(a) · ln(c)                 (laws of logarithms)
    (ln(c)/ln(b)) · ln(a) = (ln(a)/ln(b)) · ln(c),      (equation (3.15))

which is true. □
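As a numeric aside (not part of the graded solution), the Python 3 snippet below checks the identity on a few arbitrarily chosen triples.

    import math

    # Numeric check of a^(log_b(c)) = c^(log_b(a)) on a few arbitrarily chosen triples.
    for a, b, c in [(2.0, 3.0, 5.0), (7.0, 2.0, 1.5), (10.0, 10.0, 3.0)]:
        left = a ** math.log(c, b)
        right = c ** math.log(a, b)
        assert math.isclose(left, right, rel_tol=1e-9), (a, b, c)
    print("identity holds at all sampled triples")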

3.2-8 (p.60) Show that k ln(k) = Θ(n) implies k = Θ(n/ ln(n)).

We are given that k ln(k) = Θ(n), or that there exist positive constants c_1, c_2 such that

    0 ≤ c_1 n ≤ k ln(k) ≤ c_2 n.    (3)

Dividing this inequality by ln(n), we get

    0 ≤ c_1 · n/ln(n) ≤ k · ln(k)/ln(n) ≤ c_2 · n/ln(n).    (4)

Since k ln(k) ≤ c_2 n forces k ≤ n for large enough n, we have ln(k)/ln(n) ≤ 1, and hence k ≥ k · ln(k)/ln(n) ≥ c_1 · n/ln(n). So we may keep c_1 as the lower bound constant in showing that k = Θ(n/ln(n)) (that is, replacing ln(k)/ln(n) with the constant 1 does not break the lower bound in (4)).

For the upper bound, use the left side of (3) to note that

    c_1 n ≤ k ln(k) < k²  ⟹  ln(c_1) + ln(n) < 2 ln(k)  ⟹  ln(n)/ln(k) < 2 − ln(c_1)/ln(k) ≤ 3

for k large enough, since ln(c_1)/ln(k) → 0 as k → ∞. Finally, rewrite k and use the right side of (4) to get

    k = k · (ln(k)/ln(n)) · (ln(n)/ln(k)) < (c_2 · n/ln(n)) · 3.

Now we have that

    0 ≤ c_1 · n/ln(n) ≤ k ≤ (3 c_2) · n/ln(n)

for n large enough, or in other words, that k is Θ(n/ln(n)). □
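As a rough numerical illustration (not part of the graded solution), the Python 3 sketch below solves k ln(k) = n by bisection (the helper name k_for is an illustrative choice) and prints k divided by n/ln(n); the ratio stays within constant bounds, as the argument predicts.

    import math

    # Solve k*ln(k) = n by bisection and compare k to n/ln(n); the ratio should stay
    # between fixed positive constants, consistent with k = Theta(n/ln(n)).
    def k_for(n):
        lo, hi = 2.0, float(n)               # k*ln(k) is increasing on [2, n]
        for _ in range(200):
            mid = (lo + hi) / 2
            if mid * math.log(mid) < n:
                lo = mid
            else:
                hi = mid
        return lo

    for n in (10 ** 3, 10 ** 6, 10 ** 9, 10 ** 12):
        k = k_for(n)
        print(n, k / (n / math.log(n)))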

3-2 (p.61) Indicate, for each pair of expressions (A, B) in the table below, whether A is O, o, Ω, ω, or Θ of B.
Assume that k ≥ 1, ε > 0, and c > 1 are constants. Your answer should be in the form of the table with “yes” or
“no” written in each box.

The table is given below. Some justification is expected.

    A                B                O     o     Ω     ω     Θ
    log_2^k(n)       n^ε              yes   yes   no    no    no
    n^k              c^n              yes   yes   no    no    no
    √n               n^{sin(n)}       no    no    no    no    no
    2^n              2^{n/2}          no    no    yes   yes   no
    n^{log_2(c)}     c^{log_2(n)}     yes   no    yes   no    yes
    log_2(n!)        log_2(n^n)       yes   no    yes   no    yes

Here are some short justification arguments:

a. Any positive power of n grows strictly faster than any fixed power of a logarithm.
b. Any exponential with base c > 1 grows strictly faster than any polynomial.
c. The exponent sin(n) oscillates between −1 and 1, so n^{sin(n)} oscillates between roughly 1/n and n. Since √n is monotonically increasing past 1, neither function eventually dominates the other, so no asymptotic relation holds between them.
d. Decreasing the base of an exponential function (from 2 to √2 = 2^{1/2}, since 2^{n/2} = (√2)^n) makes it grow strictly slower.
e. Question 3.2-2 above shows both functions are the same.
f. Showing O and Ω involves bounding the sum log_2(n!) = Σ_{i=1}^{n} log_2(i), above by n log_2(n) and below by (n/2) log_2(n/2). For Θ, use Stirling's approximation on page 57. □
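For intuition (not part of the graded solution), the Python 3 spot check below evaluates a few of the ratios behind the table at increasing n; the sample constants k = 3, ε = 0.5, c = 2 are arbitrary.

    import math

    # Spot checks for rows a, e, f of the table: ratio a should tend to 0 (so lg^k(n) = o(n^eps)),
    # while ratios e and f should tend to 1 (so those pairs are Theta of each other).
    k, eps, c = 3, 0.5, 2
    for n in (2 ** 10, 2 ** 20, 2 ** 40):
        row_a = math.log2(n) ** k / n ** eps                 # lg^k(n) / n^eps
        row_e = n ** math.log2(c) / c ** math.log2(n)        # n^(lg c) / c^(lg n)
        row_f = math.lgamma(n + 1) / (n * math.log(n))       # lg(n!) / lg(n^n)
        print(n, round(row_a, 4), round(row_e, 4), round(row_f, 4))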

4.3-1 (p.87) Show that the solution of T(n) = T(n − 1) + n is O(n²).

We find the answer by expressing T(n) in terms of elements of the sequence that come before it. We find that

    T(n) = T(n − 1) + n
         = T(n − 2) + (n − 1) + n
         = T(n − 3) + (n − 2) + (n − 1) + n
           ⋮
         = T(0) + Σ_{k=1}^{n} k
         = T(0) + n(n + 1)/2
         = O(n²),

as desired. □
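As a direct check (not part of the graded solution), the Python 3 snippet below sums the series for a few values of n and divides by n²; the ratio approaches 1/2, consistent with the O(n²) bound.

    # Evaluate T(n) = T(n-1) + n iteratively with T(0) = 0 and compare to n^2.
    def T(n):
        total = 0
        for m in range(1, n + 1):
            total += m
        return total

    for n in (10, 100, 1000, 10000):
        print(n, T(n) / n ** 2)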

4.3-2 (p.87) Show that the solution of T(n) = T(⌈n/2⌉) + 1 is O(log_2(n)).

Similarly to above, we simplify T(n) to find

    T(n) = T(⌈n/2⌉) + 1 = T(⌈n/4⌉) + 2 = T(⌈n/8⌉) + 3 = ⋯ = T(1) + c.

The argument reaches 1 after a certain number of steps, say k; that is, once n/2^k ≤ 1 (because then 1 = ⌈n/2^k⌉ = ⌈n/2^{k+1}⌉ = ⋯), or equivalently once n ≤ 2^k, or once k ≥ log_2(n). Since c is the number of steps we have taken, it follows that c = ⌈log_2(n)⌉. Hence the solution of T(n) is O(log_2(n)). □
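As a quick check (not part of the graded solution), the Python 3 snippet below counts the +1 steps taken by the recurrence and compares the count to ⌈log_2(n)⌉; the helper name steps is an illustrative choice.

    import math

    # Count the +1 steps in T(n) = T(ceil(n/2)) + 1 down to T(1) and compare to ceil(log2(n)).
    def steps(n):
        count = 0
        while n > 1:
            n = (n + 1) // 2        # integer ceiling of n/2
            count += 1
        return count

    for n in (2, 3, 17, 1000, 10 ** 6):
        print(n, steps(n), math.ceil(math.log2(n)))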

∗ 4.3-6 (p.87) Show that the solution to T(n) = 2T(⌊n/2⌋ + 17) + n is O(n log_2(n)).

4.4-1 (p.92) Use a recursion tree to determine a good asymptotic upper bound on the recurrence T(n) = 3T(⌊n/2⌋) + n. Use the substitution method to verify your answer.

We construct the tree as described in the book, summing all the elements of each row. We make the assumption
that n is a power of 2 and we ignore the floor function.

Tree (one level per row, with the sum of the elements in that row on the right):

    Level 0:          n                                      row sum: n
    Level 1:          n/2   n/2   n/2                        row sum: (3/2) n
    Level 2:          n/4   n/4   ⋯   n/4   (nine nodes)     row sum: (3/2)² n
       ⋮                                                       ⋮
    Level log_2(n):   1   1   1   ⋯   1                      row sum: Θ(n^{log_2(3)})

The last row has (3/2)^{log_2(n)} · n elements, each equal to 1, and

    (3/2)^{log_2(n)} · n = n^{log_2(3/2)} · n = n^{log_2(3) − log_2(2) + 1} = n^{log_2(3)},

so we get the row sum stated above.

3
Taking the sum of the elements of each row, we get

    T(n) = n + (3/2) n + (3/2)² n + ⋯ + (3/2)^{log_2(n) − 1} n + Θ(n^{log_2(3)})
         = n · Σ_{i=0}^{log_2(n) − 1} (3/2)^i + Θ(n^{log_2(3)})
         = n · ((3/2)^{log_2(n)} − 1)/(3/2 − 1) + Θ(n^{log_2(3)})
         = 2n · (3^{log_2(n)}/2^{log_2(n)} − 1) + Θ(n^{log_2(3)})
         = (2n · n^{log_2(3)})/n − 2n + Θ(n^{log_2(3)})
         = 2 n^{log_2(3)} − 2n + Θ(n^{log_2(3)})
         = O(n^{log_2(3)}),

where in the third line we used the formula for a finite geometric series (equation (A.5) on page 1147). This is our asymptotic upper bound on the recurrence T(n). □
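As a numerical aside (not part of the graded solution), the Python 3 sketch below evaluates the recurrence exactly (taking T(1) = 1 and keeping the floor) and divides by n^{log_2(3)}; the ratio stays bounded, consistent with the O(n^{log_2(3)}) bound.

    import math
    from functools import lru_cache

    # Evaluate T(n) = 3*T(floor(n/2)) + n exactly (with T(1) = 1) and compare to n^(log2 3).
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        return 3 * T(n // 2) + n

    for k in (5, 10, 20, 30):
        n = 2 ** k
        print(n, T(n) / n ** math.log2(3))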

4.4-2 (p.92) Use a recursion tree to determine a good asymptotic upper bound on the recurrence T(n) = T(n/2) + n².
Use the substitution method to verify your answer.

This is the tree corresponding to the recurrence (one level per row, with the sum of the elements in that row on the right):

    Level 0:          n²            row sum: n²
    Level 1:          n²/4          row sum: n²/4
    Level 2:          n²/16         row sum: n²/16
       ⋮                               ⋮
    Level log_2(n):   Θ(1)          row sum: Θ(1)

The last line, by the pattern of the sums of the elements in each row, should be n²/2^{2 log_2(n)} = n²/n² = 1, a single element of constant size.
Taking the sum of the elements of each row, we get

    T(n) = n² + (n/2)² + (n/4)² + ⋯ + (n/2^{log_2(n) − 1})² + Θ(1)
         = Σ_{i=0}^{log_2(n) − 1} n²/2^{2i} + Θ(1)
         = n² · ((1/4)^{log_2(n)} − 1)/(1/4 − 1) + Θ(1)
         = (4/3) n² (1 − 1/n²) + Θ(1)
         = (4/3) n² − 4/3 + Θ(1)
         = O(n²).

Note that the constant −4/3 is absorbed into the Θ(1) term, giving us an asymptotic upper bound of O(n²). Note that this is actually an asymptotically tight bound, so we could have written Θ(n²) as well. □
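As a quick numeric check (not part of the graded solution), the Python 3 snippet below sums the row values for n a power of 2 and divides by n²; the ratio approaches 4/3, matching the computation above.

    # Evaluate T(n) = T(n/2) + n^2 for n a power of 2 (with T(1) = 1) and compare to n^2.
    def T(n):
        total = 1                  # the T(1) = Theta(1) term at the bottom
        while n > 1:
            total += n * n
            n //= 2
        return total

    for k in (4, 10, 20, 30):
        n = 2 ** k
        print(n, T(n) / n ** 2)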

∗ 4.4-4 (p.93) Use a recursion tree to determine a good asymptotic upper bound on the recurrence T (n) =
2T (n − 1) + 1.

∗ 4.4-5 (p.93) Use a recursion tree to determine a good asymptotic upper bound on the recurrence T (n) =
T (n − 1) + T (n/2) + n.

4.4-8 (p.93) Use a recursion tree to give an asymptotically tight solution to the recurrence T (n) = T (n − a) +
T (a) + cn, where a > 1 and c > 0 are constants.

This is the tree corresponding to the recurrence (one level per row, with the sum of the elements in that row on the right); there are n/a levels:

    Level 0:      cn                                                       row sum: cn
    Level 1:      c(n − a)   ca                                            row sum: cn
    Level 2:      c(n − 2a)  ca   T(0)   ca                                row sum: cn + Θ(1)
    Level 3:      c(n − 3a)  ca   T(0)   ca   T(0)   ca                    row sum: cn + Θ(1)
    Level 4:      c(n − 4a)  ca   T(0)   ca   T(0)   ca   T(0)   ca        row sum: cn + Θ(1)
       ⋮                                                                      ⋮
    Level n/a:    T(0)   T(0)   T(0)   ⋯   T(0)                            row sum: cn + Θ(1)

Taking the sum of the elements of each row, we get

    T(n) = cn + cn + (cn + Θ(1)) + ⋯ + (cn + Θ(1))
         = (n/a) · cn + Θ(n)
         = (c/a) n² + Θ(n)
         = Θ(n²),

which is asymptotically tight because we did not introduce any sloppiness into the calculations. □
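As a rough numeric check (not part of the graded solution), the Python 3 sketch below evaluates the recurrence with sample constants a = 3 and c = 2 (and base case T(m) = 1 for m ≤ a, all illustrative choices) and divides by n²; the ratio settles at a constant, consistent with Θ(n²).

    # Evaluate T(n) = T(n - a) + T(a) + c*n iteratively, with T(m) = 1 for m <= a.
    # Sample constants a = 3, c = 2 are arbitrary illustrative choices.
    def T(n, a=3, c=2):
        total = 0
        while n > a:
            total += c * n + 1        # the c*n work plus the T(a) = 1 leaf at this level
            n -= a
        return total + 1              # the final base case

    for n in (100, 1000, 10000, 100000):
        print(n, T(n) / n ** 2)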

∗ 4.4-9 (p.93) Use a recursion tree to give an asymptotically tight solution to the recurrence T (n) = T (αn) +
T ((1 − α)n) + cn, where α is a constant in the range 0 < α < 1 and c > 0 is also a constant.

4.5-1 (p.96) Use the master method to give tight asymptotic bounds for the following recurrences.

For all of these, a = 2 and b = 4, so log_4(2) = 1/2.


a. T(n) = 2T(n/4) + 1.
Since 1 is a constant, we can only say it is O(n^{1/2 − ε}), so case 1 applies, and T(n) = Θ(n^{1/2}).

b. T(n) = 2T(n/4) + √n.
Since √n = Θ(n^{1/2}), case 2 applies, and T(n) = Θ(n^{1/2} log_2(n)).

c. T(n) = 2T(n/4) + n.
Since n = Ω(n^{1/2 + ε}) and 2 · (n/4) = n/2 ≤ (3/4) n, where 3/4 < 1 is certainly a constant, case 3 applies, and T(n) = Θ(n).

d. T(n) = 2T(n/4) + n².
Since n² = Ω(n^{1/2 + ε}) and 2 · (n/4)² = n²/8 ≤ (1/8) n², where 1/8 < 1 is certainly a constant, case 3 applies, and T(n) = Θ(n²). □
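As an informal cross-check (not part of the graded solution), the Python 3 sketch below iterates each recurrence on powers of 4 (base case T(1) = 1) and divides by the bound the master method predicts; each ratio stays bounded. The helper name solve is an illustrative choice.

    import math

    # Evaluate T(n) = 2*T(n/4) + f(n) for each part and compare to the predicted bound.
    def solve(f, n):
        if n <= 1:
            return 1
        return 2 * solve(f, n // 4) + f(n)

    cases = {
        "a": (lambda n: 1,            lambda n: math.sqrt(n)),                   # Theta(sqrt(n))
        "b": (lambda n: math.sqrt(n), lambda n: math.sqrt(n) * math.log2(n)),    # Theta(sqrt(n) lg n)
        "c": (lambda n: n,            lambda n: n),                              # Theta(n)
        "d": (lambda n: n * n,        lambda n: n * n),                          # Theta(n^2)
    }
    for label, (f, bound) in cases.items():
        print(label, [round(solve(f, 4 ** k) / bound(4 ** k), 3) for k in (3, 6, 9)])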

4.5-3 (p.97) Use the master method to show that the solution to the binary-search recurrence T (n) = T (n/2) +Θ(1)
is T (n) = Θ(log2 (n)).

Here a = 1 and b = 2, so log_2(1) = 0. Case 2 applies, because Θ(n^{log_2(1)}) = Θ(n^0) = Θ(1), which is exactly the f(n) term. Hence by the master method, T(n) = Θ(n^{log_2(1)} log_2(n)) = Θ(log_2(n)), as desired. □
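For intuition (not part of the graded solution), the short Python 3 check below counts the Θ(1) contributions (modelled as exactly 1 unit each) for n a power of 2 and compares the total against log_2(n).

    import math

    # Count the unit-work steps of T(n) = T(n/2) + 1 for n a power of 2 (with T(1) = 1).
    def T(n):
        work = 1                   # the base case
        while n > 1:
            work += 1              # the Theta(1) work at this level, modelled as 1
            n //= 2
        return work

    for k in (4, 10, 20):
        n = 2 ** k
        print(n, T(n), math.log2(n))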

4.5-4 (p.97) Can the master method be applied to the recurrence T(n) = 4T(n/2) + n² log_2(n)? Why or why not? Give an asymptotic upper bound for this recurrence.

Here a = 4 and b = 2, so log_2(4) = 2. Case 1 does not apply, because n² log_2(n) is not bounded above by n^{2 − ε} (that is, it is not O(n^{2 − ε})). Case 2 does not apply, because n² log_2(n) is not bounded above and below by n² (that is, it is not Θ(n²)). Finally, case 3 also does not apply, because although n² log_2(n) = Ω(n²), it is not Ω(n^{2 + ε}), because any positive power of n eventually grows faster than log_2(n). Hence the master method cannot be applied.

An asymptotic upper bound of O(n² log_2²(n)) may be found via the substitution or recursion-tree method. □
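To support the bound stated above (not part of the graded solution), the Python 3 sketch below evaluates the recurrence for n a power of 2 (with T(1) = 1) and divides by n² log_2²(n); the ratio settles near 1/2, consistent with an O(n² log_2²(n)) upper bound.

    import math

    # Evaluate T(n) = 4*T(n/2) + n^2*log2(n) (n a power of 2, T(1) = 1) and compare
    # the result to n^2 * (log2 n)^2.
    def T(n):
        if n <= 1:
            return 1
        return 4 * T(n // 2) + n * n * math.log2(n)

    for k in (5, 10, 15, 20):
        n = 2 ** k
        print(n, T(n) / (n * n * math.log2(n) ** 2))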
