
Master’s Theorem

Master’s Theorem is an important method used to analyze the time complexity of recursive algorithms that follow the Divide and Conquer approach.

When a problem of size n is divided into smaller subproblems that are solved recursively, the running time satisfies a recurrence of the following form.

Standard Recurrence Form

T(n) = aT(n / b) + f(n)

Where:

  • a = number of subproblems (a ≥ 1)
  • b = factor by which the problem size shrinks (b > 1)
  • f(n) = work done outside the recursion (divide, combine, merge, etc.)

Master’s Theorem helps us directly determine the asymptotic time complexity of such recurrences.

Key Comparison

To apply Master’s Theorem:

  1. Compute k = log_b a
  2. Compare f(n) with nᵏ

Based on this comparison, there are three cases.

Case 1: f(n) is Polynomially Smaller than nᵏ

If:

f(n) = O(n^(k − ε)),  where ε > 0

Then:

T(n) = Θ(nᵏ) = Θ(n^(log_b a))

Interpretation:
The total time is dominated by recursive calls; external work is insignificant.

Case 2: f(n) Matches nᵏ (Same Order)

If:

f(n) = Θ(nᵏ log^p n),  where p ≥ 0

Then:

T(n) = Θ(nᵏ log^(p+1) n)

Interpretation:
Both recursive work and external work contribute equally.

Case 3: f(n) is Polynomially Larger than nᵏ

If:

f(n) = Ω(n^(k + ε)),  where ε > 0

and the regularity condition holds:

a f(n / b) ≤ c f(n),  for some constant c < 1 and all sufficiently large n

Then:

T(n) = Θ(f(n))

Interpretation:
The external (non-recursive) work dominates the total running time.
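
All three cases can be checked mechanically when the non-recursive work is a plain polynomial. The sketch below is a minimal illustration, assuming f(n) = Θ(n^d); the function name master_theorem and its output format are chosen for illustration only.

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d), with a >= 1, b > 1, d >= 0."""
    k = math.log(a, b)                     # k = log_b a
    if math.isclose(d, k):                 # Case 2 (p = 0): f(n) matches n^k
        return f"Theta(n^{k:.2f} * log n)"
    if d < k:                              # Case 1: f(n) polynomially smaller than n^k
        return f"Theta(n^{k:.2f})"
    # Case 3: for f(n) = n^d with d > k the regularity condition
    # a*f(n/b) <= c*f(n) holds automatically (c = a / b^d < 1).
    return f"Theta(n^{d:.2f})"

print(master_theorem(2, 2, 1))   # Merge sort, T(n) = 2T(n/2) + n:    Theta(n^1.00 * log n)
print(master_theorem(7, 2, 2))   # Strassen,   T(n) = 7T(n/2) + n^2:  Theta(n^2.81)
print(master_theorem(2, 2, 2))   # T(n) = 2T(n/2) + n^2:              Theta(n^2.00)
```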

Master’s Theorem – Examples

Example 1: Merge Sort

Recurrence:

T(n) = 2T(n/2) + n

Values:

  • a = 2
  • b = 2
  • f(n) = n
  • k = log₂ 2 = 1

Comparison:

f(n) = n = nᵏ → Case 2

Time Complexity:

T(n) = Θ(n log n)
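
To connect the recurrence with actual code, here is a minimal merge sort sketch in Python: the two recursive calls on halves give a = 2 and b = 2, and the linear-time merge loop is the f(n) = n term.

```python
def merge_sort(arr):
    """Sort a list; running time satisfies T(n) = 2T(n/2) + Theta(n)."""
    if len(arr) <= 1:                      # base case: constant work
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])           # first subproblem of size n/2
    right = merge_sort(arr[mid:])          # second subproblem of size n/2
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):    # Theta(n) merge outside the recursion
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))   # [1, 2, 5, 7, 9]
```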

Example 2: Binary Search

Recurrence:

T(n) = T(n/2) + 1

Values:

  • a = 1
  • b = 2
  • f(n) = 1
  • k = log₂ 1 = 0

Comparison:

f(n) = n⁰ → Case 2 (p = 0)

Time Complexity:

T(n) = Θ(log n)
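
A recursive binary search sketch makes this concrete: each call does constant work and recurses into one half of the range, so a = 1, b = 2, and f(n) = 1.

```python
def binary_search(arr, target, lo=0, hi=None):
    """Search a sorted list; running time satisfies T(n) = T(n/2) + Theta(1)."""
    if hi is None:
        hi = len(arr)
    if lo >= hi:                                        # empty range: not found
        return -1
    mid = (lo + hi) // 2                                # constant work per call
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)  # one subproblem of size n/2
    return binary_search(arr, target, lo, mid)

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
```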

Example 3: Tree Traversal

Recurrence:

T(n) = 2T(n/2) + 1

Values:

  • a = 2
  • b = 2
  • k = log₂ 2 = 1
  • f(n) = 1 = n⁰

Comparison:

f(n) = 1 = n⁰ is polynomially smaller than nᵏ = n → Case 1

Time Complexity:

T(n) = Θ(n)
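
For a balanced binary tree with n nodes, a traversal makes two recursive calls on subtrees of roughly n/2 nodes and does constant work per node, which is exactly the recurrence above. A minimal sketch (the Node class here is illustrative):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def traverse(node, out=None):
    """In-order traversal; on a balanced tree T(n) = 2T(n/2) + Theta(1)."""
    if out is None:
        out = []
    if node is not None:
        traverse(node.left, out)     # left subtree (~n/2 nodes)
        out.append(node.value)       # Theta(1) work at this node
        traverse(node.right, out)    # right subtree (~n/2 nodes)
    return out

root = Node(2, Node(1), Node(3))     # small balanced tree: 1 <- 2 -> 3
print(traverse(root))                # [1, 2, 3]
```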

Example 4: Four Subproblems of Half the Size

Recurrence:

T(n) = 4T(n/2) + n²

Values:

  • a = 4
  • b = 2
  • k = log₂ 4 = 2
  • f(n) = n²

Comparison:

f(n) = nᵏ → Case 2

Time Complexity:

T(n) = Θ(n² log n)
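
No single textbook algorithm is tied to this recurrence, so the snippet below is only a numeric sanity check (assuming T(1) = 1): it evaluates T(n) exactly for powers of two and shows that T(n) / (n² log₂ n) settles near a constant, as Case 2 predicts.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Evaluate T(n) = 4*T(n/2) + n^2 exactly for powers of two, with T(1) = 1."""
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n * n

for n in (2**6, 2**10, 2**14):
    ratio = T(n) / (n * n * math.log2(n))
    print(f"n = {n:6d}   T(n) / (n^2 log2 n) = {ratio:.3f}")   # approaches 1 as n grows
```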

Applications of Master’s Theorem

  • Analysis of Divide and Conquer algorithms
    • Merge Sort
    • Quick Sort
    • Binary Search
    • Strassen’s Matrix Multiplication
    • Tree-based algorithms
  • Quick determination of time complexity
  • Important for:
    • Competitive Programming
    • GATE
    • BCA / MCA / B.Tech

Easy Summary Table

Case   Condition (compare f(n) with nᵏ, k = log_b a)            Time Complexity
1      f(n) polynomially smaller than nᵏ                        Θ(nᵏ) = Θ(n^(log_b a))
2      f(n) = Θ(nᵏ log^p n), same order as nᵏ                   Θ(nᵏ log^(p+1) n)
3      f(n) polynomially larger than nᵏ (regularity holds)      Θ(f(n))