Divide-and-conquer algorithms often follow a generic pattern: they tackle a problem of size $n$ by recursively solving, say, $a$ subproblems of size $n/b$ and then combining these answers in $O(n^d)$ time, for some $a, b, d > 0$. Their running time can therefore be captured by the equation $T(n) = aT(\lceil n/b \rceil) + O(n^d)$. The master theorem gives a closed-form solution to this general recurrence, so that we no longer have to solve it explicitly in each new instance.
Master theorem. If $T(n) = aT(\lceil n/b \rceil) + O(n^d)$ for some constants $a > 0$, $b > 1$, and $d \geq 0$, then
$$T(n) = \begin{cases} O(n^d) & \text{if } d > \log_b a, \\ O(n^d \log n) & \text{if } d = \log_b a, \\ O(n^{\log_b a}) & \text{if } d < \log_b a. \end{cases}$$
Proof.
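The three-way classification in the theorem is easy to mechanize. Here is a small helper (the function name and output format are our own illustrative choices, not part of the theorem) that reports which case applies to given constants $a$, $b$, $d$:

```python
import math

def master_bound(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) using the master theorem.

    Returns the asymptotic bound as a string; the string format is
    an illustrative choice, not part of the theorem itself.
    """
    c = math.log(a, b)  # the critical exponent log_b a
    if d > c:
        return f"O(n^{d})"
    elif abs(d - c) < 1e-12:  # d == log_b a, up to float error
        return f"O(n^{d} log n)"
    else:
        return f"O(n^{c:.3f})"

# Mergesort: a = 2, b = 2, d = 1 falls in the middle case.
print(master_bound(2, 2, 1))  # O(n^1 log n)
```

For instance, matrix-multiplication-style recurrences with $a = 4$, $b = 2$, $d = 1$ land in the third case, $O(n^2)$.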
To prove the claim, let’s start by assuming for the sake of convenience that $n$ is a power of $b$. This will not influence the final bound in any important way—after all, $n$ is at most a multiplicative factor of $b$ away from some power of $b$—and it will allow us to ignore the rounding effect in $\lceil n/b \rceil$.
Next, notice that the size of the subproblems decreases by a factor of $b$ with each level of recursion, and therefore reaches the base case after $\log_b n$ levels. This is the height of the recursion tree. Its branching factor is $a$, so the $k$th level of the tree is made up of $a^k$ subproblems, each of size $n/b^k$.
The total work done at this level is
$$a^k \times O\!\left(\left(\frac{n}{b^k}\right)^{d}\right) = O(n^d) \times \left(\frac{a}{b^d}\right)^{k}.$$
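To make this concrete, take mergesort’s parameters $a = 2$, $b = 2$, $d = 1$ (our example, not the text’s): the ratio $a/b^d$ is 1, so every level of the tree performs the same amount of work. A quick sketch:

```python
import math

# Work at level k of the recursion tree for T(n) = a*T(n/b) + n^d:
# a^k subproblems, each of size n/b^k, each costing (n/b^k)^d.
def level_work(n, a, b, d, k):
    return a ** k * (n / b ** k) ** d

# Mergesort: a = 2, b = 2, d = 1, so a/b^d = 1 and every one of the
# log_b n + 1 levels performs exactly n work.
n = 1024
levels = int(math.log(n, 2))
works = [level_work(n, 2, 2, 1, k) for k in range(levels + 1)]
print(works[0], works[-1])  # 1024.0 1024.0
```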
As $k$ goes from 0 (the root) to $\log_b n$ (the leaves), these numbers form a geometric series with ratio $a/b^d$. Finding the sum of such a series in big-$O$ notation is easy, and comes down to three cases.
The ratio is less than 1.
Then the series is decreasing, and its sum is just given by its first term, $O(n^d)$.
The ratio is greater than 1.
The series is increasing and its sum is given by its last term, $O(n^{\log_b a})$:
$$n^d\left(\frac{a}{b^d}\right)^{\log_b n} = n^d\left(\frac{a^{\log_b n}}{(b^{\log_b n})^{d}}\right) = a^{\log_b n} = a^{(\log_a n)(\log_b a)} = n^{\log_b a}.$$
The ratio is exactly 1.
In this case all $O(\log n)$ terms of the series are equal to $O(n^d)$, so the sum is $O(n^d \log n)$.
These cases translate directly into the three contingencies in the theorem statement.
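As a sanity check (the particular constants and slack factors below are our own toy choices), we can unroll the recurrence $T(n) = aT(n/b) + n^d$ exactly for $n$ a power of $b$ and confirm that a representative of each case stays within a small constant factor of its predicted bound:

```python
import math

# Unroll T(n) = a*T(n/b) + n^d exactly, with T(1) = 1 and n a power of b.
def T(n, a, b, d):
    if n <= 1:
        return 1
    return a * T(n // b, a, b, d) + n ** d

n = 2 ** 16

# d > log_b a (ratio < 1): a=2, b=2, d=2 gives T(n) = Theta(n^2).
assert T(n, 2, 2, 2) < 3 * n ** 2

# d = log_b a (ratio = 1): a=2, b=2, d=1 gives T(n) = Theta(n log n).
assert T(n, 2, 2, 1) < 3 * n * math.log2(n)

# d < log_b a (ratio > 1): a=3, b=2, d=1 gives T(n) = Theta(n^{log_2 3}).
assert T(n, 3, 2, 1) < 5 * n ** math.log2(3)
```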