Recurrence Relations
2024-12-07 21:07:38 +01:00

In algorithmics, a recurrence relation is often used to describe the time complexity of recursive algorithms. It expresses how the running time of an algorithm depends on the size of the input and the cost of recursive calls.

  • Example: Consider the merge sort algorithm, which divides the input of size n into two halves, recursively sorts each half, and then merges the two sorted halves. Its recurrence relation is:
    
    T(n) = 2T\left(\frac{n}{2}\right) + O(n),
    
    where:
    • 2T\left(\frac{n}{2}\right) represents the two recursive calls on halves of the input.
    • O(n) represents the time to merge the two halves.
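
A minimal sketch of the algorithm (in Python, chosen here for illustration) makes the two recursive calls and the linear-time merge explicit:

```python
def merge_sort(a):
    # Base case: lists of length <= 1 are already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # first T(n/2)
    right = merge_sort(a[mid:])   # second T(n/2)
    # Merge step: O(n) work to combine the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```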

Solving Recurrence Relations for Time Complexity

To determine the time complexity of an algorithm, we solve the recurrence relation to find an explicit formula for T(n).

1. Unrolling (Repeated Substitution)

This involves repeatedly substituting the recurrence relation into itself until a pattern emerges.

  • Example: For the recurrence T(n) = 2T\left(\frac{n}{2}\right) + O(n):
    • First substitution:
      
      T(n) = 2\left[2T\left(\frac{n}{4}\right) + O\left(\frac{n}{2}\right)\right] + O(n)
      = 4T\left(\frac{n}{4}\right) + 2O\left(\frac{n}{2}\right) + O(n).
      
    • Second substitution:
      
      T(n) = 8T\left(\frac{n}{8}\right) + 4O\left(\frac{n}{4}\right) + 2O\left(\frac{n}{2}\right) + O(n).
      
    • General pattern:
      
      T(n) = 2^k T\left(\frac{n}{2^k}\right) + k \cdot O(n),
      
      since each of the k levels contributes 2^i \cdot O\left(\frac{n}{2^i}\right) = O(n) work.
      
    • When k = \log_2 n, the base case T(1) is reached, so:
      
      T(n) = n \cdot T(1) + \log n \cdot O(n) = O(n \log n).
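
This closed form can be sanity-checked numerically. The sketch below (Python, assuming for illustration that the merge cost is exactly n and T(1) = 1) compares the recurrence against the unrolled formula for powers of two:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Model the recurrence directly: T(1) = 1, T(n) = 2*T(n/2) + n.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^k, unrolling predicts T(n) = n*T(1) + k*n = n + n*log2(n).
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n + n * k
```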
      

2. Master Theorem

The Master Theorem provides a direct way to analyze recurrence relations of the form:


T(n) = aT\left(\frac{n}{b}\right) + O(n^d),

where:

  • a is the number of recursive calls,
  • b is the factor by which the input size is divided,
  • d is the exponent of the non-recursive work. 1

The solution depends on the comparison of a with b^d:

  1. Case 1 (a < b^d): The work outside recursion dominates.

    
    T(n) = O(n^d).
    
  2. Case 2 (a = b^d): The work is evenly distributed.

    
    T(n) = O(n^d \log n).
    
  3. Case 3 (a > b^d): The recursion dominates.

    
    T(n) = O(n^{\log_b a}).
    
  • Example: For T(n) = 2T\left(\frac{n}{2}\right) + O(n):
    • a = 2, b = 2, d = 1.
    • Compare a with b^d: 2 = 2^1, so it falls under Case 2.
    • Solution:
      
      T(n) = O(n \log n).
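
The three-way case analysis above can be sketched in a few lines (Python; the function name `master_case` and its string output are illustrative choices, not a standard API):

```python
def master_case(a, b, d):
    # Classify T(n) = a*T(n/b) + O(n^d) by comparing a with b^d.
    if a < b ** d:
        return f"O(n^{d})"          # Case 1: non-recursive work dominates
    if a == b ** d:
        return f"O(n^{d} log n)"    # Case 2: work balanced across levels
    return f"O(n^log_{b}({a}))"     # Case 3: recursive calls dominate
```

For the merge sort recurrence, `master_case(2, 2, 1)` hits Case 2 and reports O(n^1 log n), i.e. O(n log n).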
      

3. Substitution Method

This involves guessing a solution and proving it correct using mathematical induction.

  • Example: For T(n) = 2T\left(\frac{n}{2}\right) + cn (writing the O(n) term as cn for a constant c > 0), guess T(n) = O(n \log n):
    1. Inductive hypothesis: assume T(k) \le C k \log k for all k < n and some constant C.
    2. Substitute into the recurrence:
      
      T(n) \le 2 \cdot C \frac{n}{2} \log \frac{n}{2} + cn = C n (\log n - 1) + cn = C n \log n - (C - c) n.
      
      For any C \ge c, the last term is non-positive, so T(n) \le C n \log n.
    3. Conclusion: T(n) = O(n \log n), so the guess is correct. (Carrying the explicit constant C is what makes the induction sound; substituting O(\cdot) expressions directly into an inductive argument is a common pitfall.)
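
The algebra in the inductive step can be spot-checked numerically (Python sketch; the explicit constant c = 2 is an assumed choice, and \log means \log_2 here):

```python
import math

# Inductive step: if T(n/2) <= c*(n/2)*log2(n/2), then
#   T(n) <= 2*c*(n/2)*log2(n/2) + n = c*n*log2(n) - (c - 1)*n,
# which stays below c*n*log2(n) for any c >= 1.
c = 2
for n in (2 ** k for k in range(2, 20)):
    step_bound = 2 * c * (n / 2) * math.log2(n / 2) + n
    assert step_bound <= c * n * math.log2(n)
```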

Applications in Algorithm Design

  1. Divide and Conquer: Many divide-and-conquer algorithms, like merge sort, quicksort, and binary search, have their time complexity described by recurrence relations.

  2. Dynamic Programming: Recurrences are also used to describe subproblem dependencies in dynamic programming algorithms.

  3. Graph Algorithms: Recurrences appear in graph traversal techniques and optimization algorithms (e.g., shortest paths).
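
As a quick worked example, binary search (mentioned above) halves the input and does constant extra work per step, so the Master Theorem applies with a = 1, b = 2, d = 0:

  T(n) = T\left(\frac{n}{2}\right) + O(1), \quad a = b^d = 1 \;\Rightarrow\; \text{Case 2:} \quad T(n) = O(n^0 \log n) = O(\log n).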


Tips for Mastering Recurrence Relations

  1. Understand the nature of recursion in the algorithm (e.g., how input is divided, base cases, etc.).
  2. Identify the dominant term in the recurrence to estimate growth.
  3. Use tools like unrolling, the Master Theorem, or the substitution method to solve recurrences efficiently.
  4. Practice interpreting recurrences in terms of algorithm behavior.


  1. The Master Theorem simplifies solving divide-and-conquer recurrences by comparing the relative growth of recursion and non-recursive work. ↩︎