Learn how to analyze recursive algorithms using recurrence relations, substitution, iteration, and Master’s Theorem with examples like factorial, binary search, and merge sort.
Analysis of Recursive Algorithms:
Recursion is a technique in which a function calls itself repeatedly until a stopping condition, called the base case, is reached. When analyzing recursive algorithms, the main concern is how the time complexity grows with input size.
To analyze recursive algorithms, we use a mathematical expression known as a Recurrence Relation.
What is a Recursive Algorithm?
A recursive algorithm is one in which a function calls itself to solve smaller instances of the same problem.
Examples of Recursive Algorithms
- Factorial (n!)
- Fibonacci Series
- Binary Search
- Merge Sort
- Quick Sort
Components of a Recursive Algorithm
(i) Base Case
- The condition at which recursion stops.
- Without a base case, recursion continues indefinitely, leading to stack overflow or program crash.
(ii) Recursive Case
- The part where the function calls itself on smaller subproblems.
- Each recursive call reduces the problem size.
Recurrence Relation
A recurrence relation expresses the running time T(n) of a recursive algorithm in terms of the running time of smaller inputs.
General Examples
- T(n) = T(n − 1) + c (e.g., Factorial)
- T(n) = 2T(n/2) + n (e.g., Merge Sort)
The recurrence relation varies depending on how the problem is divided.
Recurrence relations are used to determine the time complexity of recursive algorithms.
Methods for Analyzing Recursive Algorithms
There are three standard methods for solving recurrence relations:
(1) Substitution Method
In this method:
- We guess the form of the solution for T(n).
- Then prove it using mathematical induction.
Example:
T(n) = T(n − 1) + 1
Guess T(n) = O(n), then verify by induction: assuming T(n − 1) ≤ c(n − 1) for some constant c ≥ 1, we get T(n) ≤ c(n − 1) + 1 ≤ cn.
Solution:
T(n) = O(n)
(2) Iteration / Expansion Method
In this method:
- The recurrence relation is expanded step by step.
- A pattern is observed to determine complexity.
Example:
T(n) = 2T(n/2) + n
After expansion:
T(n) = O(n log n)
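The expansion behind that result can be written out step by step:

```latex
\begin{aligned}
T(n) &= 2T(n/2) + n \\
     &= 4T(n/4) + 2n \\
     &= 8T(n/8) + 3n \\
     &\;\;\vdots \\
     &= 2^k\,T(n/2^k) + kn
\end{aligned}
% The expansion stops when n/2^k = 1, i.e. k = \log_2 n, giving
% T(n) = n\,T(1) + n\log_2 n = O(n \log n).
```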
(3) Master’s Theorem
Master’s Theorem is widely used for Divide and Conquer algorithms.
Standard Form
T(n) = aT(n/b) + f(n)
Where:
- a = number of subproblems
- b = factor by which problem size is reduced
- f(n) = work done outside recursion
This theorem directly gives the time complexity by comparing f(n) against n^(log_b a):
- If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))
- If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n)
- If f(n) = Ω(n^(log_b a + ε)) for some ε > 0 (with a·f(n/b) ≤ c·f(n) for some c < 1), then T(n) = Θ(f(n))
Used for algorithms such as:
- Merge Sort
- Quick Sort (best/average case, when the pivot splits the array evenly)
- Binary Search
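As a worked example, applying the theorem to binary search:

```latex
% Binary search: T(n) = T(n/2) + 1, so a = 1, b = 2, f(n) = \Theta(1).
% Here n^{\log_b a} = n^{\log_2 1} = n^0 = 1, which matches f(n),
% so the "equal" (middle) case applies:
T(n) = \Theta\!\left(n^{\log_b a}\,\log n\right) = \Theta(\log n)
```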
Examples of Recursive Algorithm Analysis
(A) Factorial
Algorithm:
```c
int factorial(int n) {
    if (n == 0)
        return 1;                 /* base case */
    return n * factorial(n - 1);  /* recursive case */
}
```
Recurrence Relation:
T(n) = T(n − 1) + 1
Time Complexity:
T(n) = O(n)
(B) Binary Search
Binary search reduces the problem size by half in each step.
Recurrence Relation:
T(n) = T(n/2) + 1
Time Complexity:
T(n) = O(log n)
(C) Merge Sort
Merge sort divides the array into two halves.
Recurrence Relation:
T(n) = 2T(n/2) + n
Using Master’s Theorem:
T(n) = O(n log n)
(D) Fibonacci (Naive Recursive)
Recurrence Relation:
T(n) = T(n − 1) + T(n − 2) + 1
Time Complexity:
T(n) = O(2ⁿ)
(Exponential Time)
Why Is Analysis of Recursive Algorithms Important?
- Helps understand when recursion is efficient
- Identifies time and space requirements
- Determines when iteration is better than recursion
- Helps optimize algorithms
Summary
To analyze a recursive algorithm:
- Identify the base case and recursive case
- Write the recurrence relation
- Solve it using Substitution, Iteration, or Master’s Theorem
- Determine the final time complexity