Big-O and Recursion
We have some practice identifying the Big-O of iterative algorithms. However, now that we're (again) experts on recursion, it's important that we be able to identify the Big-O of recursive algorithms. We're not going to spend a huge amount of time on this; that's Algorithms' job (note for IT people: consider taking Algorithms. It's an Important Class). However, it is important that you can identify some of the most common patterns.
Let's start with an example: linked-list traversal.
void traverse(Node n) {
    if (n == null)
        return;
    // Do some stuff to this node
    traverse(n.getNext());
}
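This code only assumes some Node class with a getNext() method. A minimal sketch of such a class (the field names here are just for illustration, not part of the original code) might look like:
class Node {
    private int data;   // this node's payload
    private Node next;  // the next node in the list, or null at the end

    Node(int data, Node next) {
        this.data = data;
        this.next = next;
    }

    Node getNext() {
        return next;
    }
}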
Obviously, we have two cases: a base case and a recursive case. We're going to write a function \(T\) that describes the amount of work that needs to be done in each case, as a function of the length of the list, \(n\). The base case is easy:
\(T(0)=O(1)\)
This is because when \(n=0\), there is a constant amount of work to do (namely, the comparison in the if-statement and the return statement).
What if \(n\neq 0\)?
\(T(n) = O(1) + T(n-1)\)
This is because when \(n\neq 0\), we do some constant amount of work (the if comparison, the recursive call to traverse, etc.), then run it all again on the remaining \(n-1\) nodes.
These two equations together are known as a recurrence relation. So the recurrence relation of this algorithm is:
\(T(0)=O(1)\)
\(T(n)=O(1)+T(n-1)\)
Solving Recurrence Relations
Our five-step process for solving a recurrence relation is:
1. Write down the recurrence relation.
2. Expand the recurrence relation for several lines.
3. Figure out what you would write for the i-th line.
4. Figure out what value of i would result in the base case.
5. In your equation from step 3, replace i with the solution from step 4.
OK, so step 1 is already done; we wrote the recurrence relation down above. Let's do step 2.
\(T(n) = O(1) + T(n-1)\)
\(T(n) = O(1) + \underbrace{O(1) + T(n-2)}_{T(n-1)}\)
\(T(n) = O(1) + O(1) + \underbrace{O(1) + T(n-3)}_{T(n-2)}\)
And so on. But this is probably enough to recognize the pattern. Let's do step 3. Hopefully you agree that on the i-th line, we'd have:
\(T(n) = i*O(1) + T(n-i)\)
So when does this stop? Well, that's where the other part of our recurrence relation comes in: our base case equation tells us that when \(n=0\), it's just \(O(1)\). So, we figure out what value of i makes the \(T(n-i)\) term equal to our base case of \(T(0)\). In this case, that happens when \(i=n\), because when \(i=n\), \(n-i=0\).
Finally, we do step 5: take our equation \(T(n) = i*O(1) + T(n-i)\) and replace i with our solution from step 4 (\(i=n\)), to get \(T(n) = n*O(1) + T(n-n)\). Because \(T(0)=O(1)\), the runtime works out to \(O(n)\).
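Spelling out that last simplification line by line:
\(T(n) = n*O(1) + T(n-n)\)
\(T(n) = n*O(1) + T(0)\)
\(T(n) = n*O(1) + O(1)\)
\(T(n) = O(n)\)
So recursive linked-list traversal is linear time, just like its iterative counterpart.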
Another Example!
A couple days ago we saw the iterative version of binary search. Here's the recursive version:
boolean binarySearch(int[] a, int beginning, int end, int daInt) {
    if (a.length == 0)              // empty array: can't contain daInt
        return false;
    if (end - beginning == 1)       // one element left: check it directly
        return a[beginning] == daInt;
    int middle = beginning + (end - beginning) / 2;
    if (a[middle] > daInt)          // daInt, if present, is in the left half
        return binarySearch(a, beginning, middle, daInt);
    return binarySearch(a, middle, end, daInt);  // otherwise, the right half
}
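Note that beginning is inclusive and end is exclusive, so the call that searches an entire (sorted) array is binarySearch(a, 0, a.length, daInt). A convenience wrapper (our own addition, not part of the original code) might look like:
boolean binarySearch(int[] a, int daInt) {
    // search the whole sorted array; the range [0, a.length) is half-open
    return binarySearch(a, 0, a.length, daInt);
}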
The recurrence relation for this has two base cases and one recursive case (each call does a constant amount of work, then recurses on half of the remaining range):
\(T(0) = O(1)\)
\(T(1) = O(1)\)
\(T(n) = O(1) + T(n/2)\)
So let's figure out the Big-O:
\(T(n) = O(1) + T(n/2)\)
\(T(n) = O(1) + \underbrace{O(1) + T(n/4)}_{T(n/2)}\)
\(T(n) = O(1) + O(1) + \underbrace{O(1) + T(n/8)}_{T(n/4)}\)
\(T(n) = i * O(1) + T(n/(2^i))\)
The recursion hits the base case \(T(1)\) when \(n/2^i = 1\), which happens when \(i=\log_2(n)\).
\(T(n) = \log_2(n) * O(1) + T(1)\)
\(T(n) = \log_2(n) * O(1) + O(1)\)
\(T(n) = O(\log(n))\)
Hopefully, even before the last line, it was clear at a minimum that this was sublinear; that is, faster than linear time.
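To put a number on it: for an array of about a million elements, \(\log_2(1000000) \approx 20\), so recursive binary search makes roughly 20 calls, whereas the linked-list traversal from earlier would make about a million.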