As noted in Chapter 1, when an algorithm contains a recursive call to itself, its running time can often be described by a recurrence. A **recurrence** is an equation or inequality that describes a function in terms of its value on smaller inputs. For example, we saw in Chapter 1 that the worst-case running time *T*(*n*) of the MERGE-SORT procedure could be described by the recurrence

T(n) = Θ(1)            if n = 1,
T(n) = 2T(n/2) + Θ(n)  if n > 1,

whose solution was claimed to be *T*(*n*) = Θ(*n* lg *n*).

T(n) = aT(n/b) + f(n),

T(n) = 2T(n/2) + Θ(n),

The substitution method for solving recurrences involves guessing the form of the solution and then using mathematical induction to find the constants and show that the solution works. The name comes from the substitution of the guessed answer for the function when the inductive hypothesis is applied to smaller values. This method is powerful, but it obviously can be applied only in cases when it is easy to guess the form of the answer.

T(n) = 2T(⌊n/2⌋) + n,

T(n) ≤ 2(c⌊n/2⌋ lg⌊n/2⌋) + n
     ≤ cn lg(n/2) + n
     = cn lg n - cn lg 2 + n
     = cn lg n - cn + n
     ≤ cn lg n,

where the last step holds as long as *c* ≥ 1.
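As a quick sanity check on the guess (a numerical check, not a proof; the induction above is the proof), the recurrence can be tabulated and compared against *cn* lg *n*. The constant *c* = 2 and the base case *T*(1) = 1 below are assumptions chosen for illustration:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2T(floor(n/2)) + n, with an assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# The guessed bound T(n) <= c * n * lg n holds with c = 2 for all n >= 2.
assert all(T(n) <= 2 * n * log2(n) for n in range(2, 2000))
```

Note that the comparison starts at *n* = 2: at *n* = 1 the bound *cn* lg *n* is 0, which is exactly the boundary-condition subtlety the substitution method must work around.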

T(n) = 2T(⌊n/2⌋ + 17) + n,

T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + 1.

T(n) ≤ c⌊n/2⌋ + c⌈n/2⌉ + 1
     = cn + 1,

T(n) ≤ (c⌊n/2⌋ - b) + (c⌈n/2⌉ - b) + 1
     = cn - 2b + 1
     ≤ cn - b,
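For this particular recurrence, the trick of subtracting a lower-order term can be checked numerically: with an assumed base case *T*(1) = 1 (an assumption for illustration), the recurrence solves exactly to *T*(*n*) = 2*n* - 1, which satisfies *T*(*n*) ≤ *cn* - *b* with *c* = 2 and *b* = 1:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(floor(n/2)) + T(ceil(n/2)) + 1, assuming T(1) = 1."""
    if n == 1:
        return 1
    return T(n // 2) + T((n + 1) // 2) + 1

# With this base case the recurrence solves exactly: T(n) = 2n - 1,
# so the strengthened hypothesis T(n) <= cn - b holds with c = 2, b = 1.
assert all(T(n) == 2 * n - 1 for n in range(1, 1000))
```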

T(n) ≤ 2(c⌊n/2⌋) + n
     ≤ cn + n
     = O(n),   ⇐ wrong!!

T(2^m) = 2T(2^{m/2}) + m.

We can now rename *S*(*m*) = *T*(2^m) to produce the new recurrence

S(m) = 2S(m/2) + m,
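This renamed recurrence has the same form as the merge-sort recurrence, so its solution can be carried back through both substitutions. A sketch, assuming the Θ(*m* lg *m*) bound for *S* from the earlier analysis:

```latex
S(m) = O(m \lg m)
\;\Longrightarrow\;
T(n) = T(2^{m}) = S(m) = O(m \lg m) = O(\lg n \,\lg \lg n).
```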

Show that the solution of *T*(*n*) = *T*(⌈*n*/2⌉) + 1 is *O*(lg *n*).

Show that the solution of *T*(*n*) = 2*T*(⌊*n*/2⌋) + *n* is Ω(*n* lg *n*). Conclude that the solution is Θ(*n* lg *n*).

Show that Θ(*n* lg *n*) is the solution to the "exact" recurrence (4.2) for merge sort.

Show that the solution to *T*(*n*) = 2*T*(⌊*n*/2⌋ + 17) + *n* is *O*(*n* lg *n*).

The method of iterating a recurrence doesn't require us to guess the answer, but it may require more algebra than the substitution method. The idea is to expand (iterate) the recurrence and express it as a summation of terms dependent only on *n* and the initial conditions. Techniques for evaluating summations can then be used to provide bounds on the solution.

As an example, consider the recurrence

T(n) = 3T(⌊n/4⌋) + n.

T(n) = n + 3T(⌊n/4⌋)
     = n + 3(⌊n/4⌋ + 3T(⌊n/16⌋))
     = n + 3(⌊n/4⌋ + 3(⌊n/16⌋ + 3T(⌊n/64⌋)))
     = n + 3⌊n/4⌋ + 9⌊n/16⌋ + 27T(⌊n/64⌋),

where ⌊⌊*n*/4⌋/4⌋ = ⌊*n*/16⌋ and ⌊⌊*n*/16⌋/4⌋ = ⌊*n*/64⌋ follow from the identity (2.4).
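Continuing the expansion to the bottom yields the geometric series *n* Σ (3/4)^*i* plus a lower-order term, for a total of *O*(*n*). That bound can be spot-checked numerically; the base case *T*(*n*) = 1 for *n* < 4 below is an assumption for illustration:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 3T(floor(n/4)) + n, with an assumed base case T(n) = 1 for n < 4."""
    if n < 4:
        return 1
    return 3 * T(n // 4) + n

# Summing the geometric series n * (1 + 3/4 + 9/16 + ...) <= 4n suggests
# T(n) <= 4n for all n >= 1, which the induction 3*(4*floor(n/4)) + n <= 4n confirms.
assert all(T(n) <= 4 * n for n in range(1, 5000))
```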

The master method provides a "cookbook" method for solving recurrences of the form

T(n) = aT(n/b) + f(n),

The master method depends on the following theorem.

T(n) = aT(n/b) + f(n),

1. If *f*(*n*) = *O*(*n*^{log_b a - ε}) for some constant ε > 0, then *T*(*n*) = Θ(*n*^{log_b a}).

2. If *f*(*n*) = Θ(*n*^{log_b a}), then *T*(*n*) = Θ(*n*^{log_b a} lg *n*).

3. If *f*(*n*) = Ω(*n*^{log_b a + ε}) for some constant ε > 0, and if *af*(*n*/*b*) ≤ *cf*(*n*) for some constant *c* < 1 and all sufficiently large *n*, then *T*(*n*) = Θ(*f*(*n*)).

T(n) = 9T(n/3) + n.

T(n) = T(2n/3) + 1,

T(n) = 3T(n/4) + n lg n,

The master method does not apply to the recurrence

T(n) = 2T(n/2) + n lg n,
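For driving functions of the pure polynomial form *f*(*n*) = *n*^*k*, the three-way comparison in the master theorem reduces to comparing *b*^*k* against *a*, which is exact for integer parameters. The helper below is a hypothetical illustration, not part of the text, and deliberately does not handle extra polylogarithmic factors such as *n* lg *n*:

```python
def master_case(a, b, k):
    """Which master-theorem case governs T(n) = a*T(n/b) + n**k?
    Compares n**k with n**(log_b a) by comparing b**k with a.
    Returns 1, 2, or 3; the regularity condition of case 3 always
    holds for a pure power n**k."""
    if b ** k < a:
        return 1  # T(n) = Theta(n^{log_b a})
    if b ** k == a:
        return 2  # T(n) = Theta(n^{log_b a} lg n)
    return 3      # T(n) = Theta(n^k)

assert master_case(9, 3, 1) == 1    # T(n) = 9T(n/3) + n    -> Theta(n^2)
assert master_case(1, 1.5, 0) == 2  # T(n) = T(2n/3) + 1    -> Theta(lg n)
assert master_case(2, 2, 1) == 2    # merge sort            -> Theta(n lg n)
assert master_case(2, 2, 2) == 3    # T(n) = 2T(n/2) + n^2  -> Theta(n^2)
```

Comparing *b*^*k* with *a* avoids floating-point trouble that a direct computation of log_*b* *a* would invite.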

Use the master method to give tight asymptotic bounds for the following recurrences.

This section contains a proof of the master theorem (Theorem 4.1) for more advanced readers. The proof need not be understood in order to apply the theorem.

The first part of the proof of the master theorem analyzes the master recurrence (4.5),

T(n) = aT(n/b) + f(n),

where *i* is a positive integer. Then

T(n) = Θ(n^{log_b a}) + Σ_{j=0}^{log_b n - 1} a^j f(n/b^j).

*Proof* Iterating the recurrence yields

T(n) = f(n) + aT(n/b)
     = f(n) + af(n/b) + a^2 T(n/b^2)
     = f(n) + af(n/b) + a^2 f(n/b^2) + ···
       + a^{log_b n - 1} f(n/b^{log_b n - 1}) + a^{log_b n} T(1).

Since *a*^{log_b n} = *n*^{log_b a}, the last term of this expression becomes

a^{log_b n} T(1) = Θ(n^{log_b a}),

using the boundary condition *T*(1) = Θ(1). The remaining terms can be expressed as the sum

g(n) = Σ_{j=0}^{log_b n - 1} a^j f(n/b^j).

In terms of the recursion tree, the three cases of the master theorem correspond to cases in which the total cost of the tree is (1) dominated by the costs in the leaves, (2) evenly distributed across the levels of the tree, or (3) dominated by the cost of the root.

The function

g(n) = Σ_{j=0}^{log_b n - 1} a^j f(n/b^j)

can then be bounded asymptotically for exact powers of *b* as follows.

1. If *f*(*n*) = *O*(*n*^{log_b a - ε}) for some constant ε > 0, then *g*(*n*) = *O*(*n*^{log_b a}).

2. If *f*(*n*) = Θ(*n*^{log_b a}), then *g*(*n*) = Θ(*n*^{log_b a} lg *n*).

3. If *af*(*n*/*b*) ≤ *cf*(*n*) for some constant *c* < 1 and all *n* ≥ *b*, then *g*(*n*) = Θ(*f*(*n*)).

g(n) = O(n^{log_b a}),
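The case-1 bound rests on a geometric series; a sketch of the evaluation, using *b*^{log_b a} = *a*:

```latex
g(n) = \sum_{j=0}^{\log_b n - 1} a^j \, O\!\left(\left(\frac{n}{b^j}\right)^{\log_b a - \epsilon}\right)
     = O\!\left(n^{\log_b a - \epsilon} \sum_{j=0}^{\log_b n - 1} \left(b^{\epsilon}\right)^j\right)
     = O\!\left(n^{\log_b a - \epsilon} \cdot \frac{b^{\epsilon \log_b n} - 1}{b^{\epsilon} - 1}\right)
     = O\!\left(n^{\log_b a - \epsilon} \cdot O(n^{\epsilon})\right)
     = O\!\left(n^{\log_b a}\right).
```

The ratio *b*^ε of the series is a constant greater than 1, so the sum is dominated by its last term.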

Substituting this expression for the summation in equation (4.9) yields

g(n) = Θ(n^{log_b a} log_b n)
     = Θ(n^{log_b a} lg n),

We can now prove a version of the master theorem for the case in which *n* is an exact power of *b*.

1. If *f*(*n*) = *O*(*n*^{log_b a - ε}) for some constant ε > 0, then *T*(*n*) = Θ(*n*^{log_b a}).

2. If *f*(*n*) = Θ(*n*^{log_b a}), then *T*(*n*) = Θ(*n*^{log_b a} lg *n*).

3. If *af*(*n*/*b*) ≤ *cf*(*n*) for some constant *c* < 1 and all *n* ≥ *b*, then *T*(*n*) = Θ(*f*(*n*)).

T(n) = Θ(n^{log_b a}) + O(n^{log_b a})
     = Θ(n^{log_b a}),

T(n) = Θ(n^{log_b a}) + Θ(n^{log_b a} lg n)
     = Θ(n^{log_b a} lg n).

T(n) = Θ(n^{log_b a}) + Θ(f(n))
     = Θ(f(n)).

T(n) = aT(⌈n/b⌉) + f(n)

T(n) = aT(⌊n/b⌋) + f(n)

n,
⌈n/b⌉,
⌈⌈n/b⌉/b⌉,
⌈⌈⌈n/b⌉/b⌉/b⌉,

Let us denote the *i*th element in the sequence by *n_i*, where

n_0 = n,
n_i = ⌈n_{i-1}/b⌉   for i > 0.

and thus, when *i* = ⌊log_b *n*⌋, we obtain *n_i* ≤ *b* + *b*/(*b* - 1) = *O*(1).
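The *O*(1) bound on the iterated arguments follows from unrolling *n_i* = ⌈*n*_{i-1}/*b*⌉ with ⌈*x*⌉ ≤ *x* + 1 (a sketch consistent with the bound above):

```latex
n_i \le \frac{n}{b^i} + \sum_{j=0}^{i-1} \frac{1}{b^j}
    < \frac{n}{b^i} + \frac{b}{b-1},
```

so for $i = \lfloor \log_b n \rfloor$ the first term satisfies $n / b^{\lfloor \log_b n \rfloor} \le n / b^{\log_b n - 1} = b$, giving $n_i \le b + b/(b-1) = O(1)$.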

We can now iterate recurrence (4.10), obtaining

We can now evaluate the summation

4-2 Finding the missing integer

Show that if we use only this operation, we can still determine the missing integer in *O*(*n*) time.
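One *O*(*n*)-time approach (a sketch of the standard bit-halving idea, not taken from the text) determines the missing integer one bit per round, discarding half the candidates each time, so only *n* + *n*/2 + *n*/4 + ··· = *O*(*n*) single-bit examinations are performed in total:

```python
def find_missing(a, n):
    """a holds the integers 0..n with exactly one value missing.
    Each round examines one bit of each surviving number, then keeps
    only the half of the candidates that can still contain the missing value."""
    ans, bit = 0, 0
    size = n + 1                      # size of the full range still in play
    while a:
        zeros = [x for x in a if not (x >> bit) & 1]
        ones = [x for x in a if (x >> bit) & 1]
        full_zeros = (size + 1) // 2  # zero-bit count in a complete range of this size
        if len(zeros) < full_zeros:   # a zero-bit value is missing
            a, size = zeros, full_zeros
        else:                         # a one-bit value is missing
            ans |= 1 << bit
            a, size = ones, size // 2
        bit += 1
    return ans

assert find_missing([0, 1, 3], 3) == 2
assert find_missing([1, 2], 2) == 0
```

Each round halves the candidate list while fixing one more low-order bit of the answer, so the total work obeys the recurrence *T*(*n*) = *T*(*n*/2) + *n* = *O*(*n*).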

Throughout this book, we assume that parameter passing during procedure calls takes constant time, even if an *N*-element array is being passed. This assumption is valid in most systems because a pointer to the array is passed, not the array itself. This problem examines the implications of three parameter-passing strategies:

1. An array is passed by pointer. Time = Θ(1).

2. An array is passed by copying. Time = Θ(*N*), where *N* is the size of the array.

3. An array is passed by copying only the subrange that might be accessed by the called procedure. Time = Θ(*q* - *p* + 1) if the subarray *A*[*p*..*q*] is passed.

**b.** Redo part (a) for the MERGE-SORT algorithm from Section 1.3.1.

**b.** *T*(*n*) = 3*T*(*n*/3 + 5) + *n*/2.

Often, we are able to bound a recurrence *T*(*n*) at exact powers of an integral constant *b*. This problem gives sufficient conditions for us to extend the bound to all real *n* > 0.

This problem develops properties of the Fibonacci numbers, which are defined by recurrence (2.13). We shall use the technique of generating functions to solve the Fibonacci recurrence. Define the **generating function** (or *formal power series*) as

F(z) = Σ_{i=0}^{∞} F_i z^i = 0 + z + z^2 + 2z^3 + 3z^4 + 5z^5 + ···

**d.** Prove that *F_i* = φ^i/√5, rounded to the nearest integer.
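Part (d)'s claim, that *F_i* equals φ^i/√5 rounded to the nearest integer, is easy to spot-check numerically (a check, not a proof; double-precision floats are adequate for small *i*):

```python
from math import sqrt

phi = (1 + sqrt(5)) / 2  # the golden ratio

def fib(i):
    """Iterative Fibonacci: F_0 = 0, F_1 = 1."""
    a, b = 0, 1
    for _ in range(i):
        a, b = b, a + b
    return a

# F_i equals phi**i / sqrt(5) rounded to the nearest integer.
assert all(fib(i) == round(phi ** i / sqrt(5)) for i in range(1, 40))
```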

**e.** Prove that *F*_{i+2} ≥ φ^i for *i* ≥ 0.

Chip *A* says    Chip *B* says    Conclusion
------------------------------------------------------------
*B* is good      *A* is good      both are good, or both are bad
*B* is good      *A* is bad       at least one is bad
*B* is bad       *A* is good      at least one is bad
*B* is bad       *A* is bad       at least one is bad

^{1} VLSI stands for "very-large-scale integration," which is the integrated-circuit chip technology used to fabricate most microprocessors today.