Master's Theorem for Solving Recurrence Relations



Recurrence relations are widely used in discrete mathematics to describe the time complexity of algorithms, mostly recursive algorithms. However, as recurrences become more complex, solving them by substitution or iteration can get tedious. For such recurrences, we can use the Master's Theorem.

Master's Theorem works on a specific type of recurrence relation. It is a powerful tool for determining the asymptotic behavior of such recurrences: it lets us estimate the time complexity without going through a lengthy, step-by-step expansion.

In this chapter, we will explain the fundamentals of the Master's Theorem, its three cases, and how to apply it in practice.

Understanding the Master's Theorem

Master's Theorem applies to recurrence relations of the form −

$$\mathrm{T(n) = aT\left(\frac{n}{b}\right) + f(n)}$$

where −

  • T(n) is the time complexity we want to find.
  • a ≥ 1 and b > 1 are constants.
  • f(n) is the cost of the non-recursive work done at each step, such as dividing the problem or combining the results.

Conditions for Using the Master's Theorem

The theorem applies specifically to recurrence relations where the following conditions are met:

  • The problem is divided into a sub-problems of equal size.
  • Each sub-problem has a size of n / b.
  • There is an additional term f(n) representing non-recursive work, such as linear or logarithmic computations outside the recursive calls.

The Master's Theorem offers three cases, based on how f(n) compares to $\mathrm{n^{\log_b a}}$, a term derived from a and b in the recurrence relation. Each case provides a formula for the asymptotic complexity of T(n).
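
As a quick illustration, the exponent of this comparison term can be computed directly with a logarithm; the values a = 2 and b = 4 below are simply the ones reused in Example 1 later in this chapter.

```python
import math

# Exponent of the comparison term n**(log_b a).
# With a = 2 and b = 4 (the recurrence used in Example 1), log_4 2 = 0.5,
# so the comparison term is n**0.5 = sqrt(n).
a, b = 2, 4
print(math.log(a, b))   # 0.5
```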

The Three Cases of the Master's Theorem

Let us now understand the three cases of Master's Theorem.

Case 1: $\mathrm{f(n) \:=\: O\left(n^{\log_b a \:-\: \epsilon}\right)}$

In this case, f(n) grows polynomially slower than $\mathrm{n^{\log_b a}}$; specifically, $\mathrm{f(n) \:=\: O\left(n^{\log_b a \:-\: \epsilon}\right)}$ for some $\mathrm{\epsilon \:>\: 0}$. Then −

$$\mathrm{T(n) \:=\: \Theta\left(n^{\log_b a}\right)}$$

This case often applies when f(n) is a polynomial whose degree is lower than $\mathrm{\log_b a}$.

Case 2: $\mathrm{f(n) \:=\: \Theta\left(n^{\log_b a}\right)}$

In this case, if f(n) grows at the same rate as $\mathrm{n^{\log_b a}}$, then −

$$\mathrm{T(n) \:=\: \Theta\left(n^{\log_b a} \:\cdot\: \log n\right)}$$

This case introduces an extra logarithmic factor because f(n) matches the growth rate of $\mathrm{n^{\log_b a}}$.

Case 3: $\mathrm{f(n) \:=\: \Omega\left(n^{\log_b a \:+\: \epsilon}\right)}$

In the third case, f(n) grows faster than $\mathrm{n^{\log_b a}}$ (by at least a factor of $\mathrm{n^{\epsilon}}$ for some $\mathrm{\epsilon \:>\: 0}$), which means it dominates the recurrence. Here, we need an additional condition to ensure that the faster-growing term truly takes over the recurrence −

Regularity Condition

$$\mathrm{a f\left(\frac{n}{b}\right) \:\leq\: c f(n), \: \text{ for some constant } \:c \:<\: 1 \:\text{ and all sufficiently large } n}$$

When it holds, we have −

$$\mathrm{T(n) \:=\: \Theta(f(n))}$$
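
To make the three cases concrete, the comparison can be scripted. The sketch below is a minimal, hypothetical helper (the name master_theorem and the restriction to driving functions of the form $\mathrm{f(n) \:=\: \Theta(n^d)}$ are assumptions made for illustration); for such a polynomial f(n) with d > log_b a, the regularity condition holds automatically.

```python
import math

def master_theorem(a, b, d):
    """Asymptotic solution of T(n) = a*T(n/b) + Theta(n**d).

    Hypothetical helper for illustration only: it assumes f(n) is a plain
    polynomial n**d, so Case 3's regularity condition is satisfied automatically.
    """
    if a < 1 or b <= 1:
        raise ValueError("The Master's Theorem requires a >= 1 and b > 1")
    crit = math.log(a, b)                     # the critical exponent log_b(a)
    if math.isclose(d, crit):                 # Case 2: f(n) matches n**crit
        return f"Theta(n^{crit:g} * log n)"
    if d < crit:                              # Case 1: f(n) grows slower
        return f"Theta(n^{crit:g})"
    return f"Theta(n^{d:g})"                  # Case 3: f(n) dominates

# For instance, a = 2, b = 4, f(n) = sqrt(n) = n^(1/2) (Example 1 below):
print(master_theorem(2, 4, 0.5))              # Theta(n^0.5 * log n)
```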

Example 1: Using the Master's Theorem for a Simple Recurrence

Let us understand the Master's Theorem with the following example −

$$\mathrm{T(n) \:=\: 2T\left(\frac{n}{4}\right) \:+\: \sqrt{n}}$$

Step 1: Identify a, b, and f(n)

Here, a = 2 and b = 4

$$\mathrm{f(n) \:=\: \sqrt{n} \:=\: n^{\frac{1}{2}}}$$

Step 2: Find $\mathrm{n^{\log_b a}}$

Calculate $\mathrm{n^{\log_b a}}$. Here,

$$\mathrm{\log_b a \:=\: \log_4 2}$$

We can rewrite 4 as $\mathrm{2^2}$, so −

$$\mathrm{\log_4 2 \:=\: \frac{1}{2}}$$

Hence,

$$\mathrm{n^{\log_b a}\: =\: n^{\frac{1}{2}}}$$

Step 3: Compare f(n) with $\mathrm{n^{\log_b a}}$

Since $\mathrm{f(n) \:=\: n^{0.5}}$ exactly matches $\mathrm{n^{\log_b a} \:=\: n^{0.5}}$, we fall into Case 2.

Step 4: Apply the Master's Theorem Formula for Case 2

According to Case 2, if $\mathrm{f(n) \:=\: \Theta\left(n^{\log_b a}\right)}$, then −

$$\mathrm{T(n) \:=\: \Theta\left(n^{\log_b a} \:\cdot\: \log n\right)}$$

Thus, the solution is −

$$\mathrm{T(n) \:=\: \Theta\left(\sqrt{n} \:\cdot\: \log n\right)}$$

This is the asymptotic time complexity of the recurrence relation.
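
As an informal numerical check (not a proof), we can evaluate the recurrence directly and compare it with $\mathrm{\sqrt{n} \:\cdot\: \log n}$. The base case T(1) = 1 below is an assumption added only to make the evaluation concrete; the ratio slowly approaches a constant as n grows, which is consistent with the Θ bound.

```python
import math

def T(n):
    # Direct evaluation of T(n) = 2*T(n/4) + sqrt(n), with an assumed base case T(1) = 1.
    if n <= 1:
        return 1.0
    return 2 * T(n / 4) + math.sqrt(n)

for k in (5, 10, 15, 20):
    n = 4 ** k
    ratio = T(n) / (math.sqrt(n) * math.log(n, 4))
    print(f"n = 4^{k}: T(n) / (sqrt(n) * log_4 n) = {ratio:.3f}")
```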

Example 2: A More Complex Recurrence Relation

Consider a recurrence relation where −

$$\mathrm{T(n) \:=\: 2T(\sqrt{n}) \:+\: \log n}$$

Step 1: Rewrite the Recurrence in Terms of the Master's Theorem

The Master's Theorem requires the problem size to be divided by a constant factor b in each recursive call, so we first need to adjust n. Let $\mathrm{m \:=\: \log_2 n}$. Then $\mathrm{n \:=\: 2^m}$, and we can rewrite the recurrence in terms of m instead of n.

Step 2: Transform the Equation

Substitute $\mathrm{S(m) \:=\: T(2^m)}$. Since $\mathrm{\sqrt{n} \:=\: \sqrt{2^m} \:=\: 2^{\frac{m}{2}}}$ and $\mathrm{\log_2 n \:=\: m}$, the recurrence becomes −

$$\mathrm{S(m) \:=\: 2S\left(\frac{m}{2}\right) \:+\: m}$$

Now we have a recurrence relation in the form $\mathrm{S(m) = aS\left(\frac{m}{b}\right) + f(m)}$ with a = 2, b = 2, and f(m) = m.

Step 3: Find $\mathrm{m^{\log_b a}}$

Calculate $\mathrm{m^{\log_b a}}$. Since $\mathrm{\log_b a \:=\: \log_2 2 \:=\: 1}$,

$$\mathrm{m^{\log_b a} \:=\: m^{1} \:=\: m}$$

So $\mathrm{f(m) \:=\: m \:=\: m^{\log_b a}}$, which means we are in Case 2.

Step 4: Apply the Formula for Case 2

Using Case 2 of the Master's Theorem, we get −

$$\mathrm{S(m) \:=\: \Theta(m \:\cdot\: \log m)}$$

Since $\mathrm{m \:=\: \log_2 n}$ (and the base of the logarithm does not affect the Θ bound), substitute back −

$$\mathrm{T(n) \:=\: \Theta(\log n \:\cdot\: \log (\log n))}$$

So, the time complexity for the original recurrence relation is −

$$\mathrm{T(n) \:=\: \Theta(\log n \:\cdot\: \log (\log n))}$$
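
The same kind of informal numerical check works for this recurrence as well. The base case T(2) = 1 and the sampled values of n (of the form $\mathrm{2^{2^k}}$, so that repeated square roots stay exact powers of two) are assumptions for illustration; the ratio against $\mathrm{\log n \:\cdot\: \log(\log n)}$ settles toward a constant.

```python
import math

def T(n):
    # Direct evaluation of T(n) = 2*T(sqrt(n)) + log2(n), with an assumed base case T(2) = 1.
    if n <= 2:
        return 1.0
    return 2 * T(math.isqrt(n)) + math.log2(n)

for k in (3, 4, 5, 6):
    n = 2 ** (2 ** k)       # n = 2^(2^k), so sqrt(n) stays a power of two
    ratio = T(n) / (math.log2(n) * math.log2(math.log2(n)))
    print(f"n = 2^{2 ** k}: T(n) / (log n * log log n) = {ratio:.3f}")
```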

Using the Master's Theorem for Algorithm Analysis

The Master's Theorem is particularly useful for analyzing divide-and-conquer algorithms. Consider algorithms like Merge Sort or Binary Search −

  • In Merge Sort, the recurrence is $\mathrm{T(n) \:=\: 2T\left(\frac{n}{2}\right) \:+\: O(n)}$. Here $\mathrm{n^{\log_2 2} \:=\: n}$ matches f(n), so the recurrence falls into Case 2, with a time complexity of Θ(n log n).
  • In Binary Search, the recurrence $\mathrm{T(n) \:=\: T\left(\frac{n}{2}\right) \:+\: O(1)}$ has $\mathrm{n^{\log_2 1} \:=\: n^{0} \:=\: 1}$, which matches f(n), so it also falls under Case 2, with a complexity of Θ(log n). A quick check of both appears after this list.
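
The same three-way comparison from the cases above, written out inline, reproduces both answers (treating f(n) as $\mathrm{\Theta(n^d)}$ and remembering that $\mathrm{n^{0} \:\cdot\: \log n}$ is just log n); this is a hypothetical spot-check, not part of the algorithms themselves.

```python
import math

# Both classic recurrences, with f(n) treated as Theta(n**d).
for name, a, b, d in [("Merge Sort", 2, 2, 1.0), ("Binary Search", 1, 2, 0.0)]:
    crit = math.log(a, b)       # log_b a: 1 for Merge Sort, 0 for Binary Search
    if math.isclose(d, crit):
        print(f"{name}: Case 2 -> Theta(n^{crit:g} * log n)")
    elif d < crit:
        print(f"{name}: Case 1 -> Theta(n^{crit:g})")
    else:
        print(f"{name}: Case 3 -> Theta(n^{d:g})")
```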

How to Apply the Master's Theorem

Here are a few rules of thumb −

  • Identify a, b, and f(n) Carefully − These values determine which case applies.
  • Calculate $\mathrm{n^{\log_b a}}$ Accurately − Use logarithmic properties to simplify.
  • Match f(n) with $\mathrm{n^{\log_b a}}$ − Use the three cases to find where f(n) fits relative to $\mathrm{n^{\log_b a}}$.
  • Check Conditions for Case 3 − Ensure the regularity condition holds before applying Case 3 (a small spot-check sketch follows this list).
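
The regularity condition in the last rule can also be spot-checked numerically. The sketch below is only a heuristic sampler, not a proof, and the recurrence $\mathrm{T(n) \:=\: 2T\left(\frac{n}{2}\right) \:+\: n^2}$, the name regularity_holds, and the constant c = 0.6 are assumptions added for illustration (here $\mathrm{a \:\cdot\: f\left(\frac{n}{b}\right) \:=\: \frac{n^2}{2}}$, so any constant c ≥ 1/2 works).

```python
def regularity_holds(a, b, f, c, samples=range(2, 10_000, 97)):
    # Heuristic spot-check of a*f(n/b) <= c*f(n) over sampled n; not a proof.
    return all(a * f(n / b) <= c * f(n) for n in samples)

# Illustrative Case 3 recurrence: T(n) = 2T(n/2) + n^2, where n^2 grows
# faster than n^(log_2 2) = n. With c = 0.6 the condition holds, so T(n) = Theta(n^2).
print(regularity_holds(2, 2, lambda n: n * n, c=0.6))   # True
```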

Conclusion

The Master's Theorem helps in solving complex recurrence relations without tedious iteration. We reviewed its three cases, learning how to match the function f(n) with the derived term $\mathrm{n^{\log_b a}}$.

By applying the theorem to practical examples, we saw how it helps in finding time complexities. With the Master's Theorem in our toolkit, we can handle many recursive problems in discrete mathematics and computer science.
