Reading

These notes; also closely review Unit 2, Sections 2 and 3.

Homework

Another merge improvement

Midn X suggested an improvement to mergesort that he hypothesized would make its best case $\Theta(\lg n)$. The improvement is to add
if A[m] ≤ A[m+1] return
to the top of "merge". When the original input is already sorted, this makes every merge take constant time, so the best-case recurrence becomes $T(n) = T\left( \lceil n/2 \rceil \right) + T\left( \lfloor n/2 \rfloor \right) + c$, which solves to $\Theta(n)$. This gave us a great opportunity to analyze a recursive algorithm. We also discussed why the hypothesized $\Theta(\lg n)$ best case is impossible: any correct sorting algorithm must at least examine every element of the array, so any sorting algorithm's best-case running time is $\Omega(n)$. That's pretty cool, since we learned something about any algorithm anyone could ever devise.
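To make the idea concrete, here is a sketch of the improved algorithm in Python (the names merge_sort and merge and the in-place style are my choices, not pseudocode from class):

    # Mergesort with the early-exit check (a sketch; details assumed).
    def merge_sort(A, lo=0, hi=None):
        """Sort A[lo..hi] in place."""
        if hi is None:
            hi = len(A) - 1
        if lo >= hi:
            return
        m = (lo + hi) // 2
        merge_sort(A, lo, m)
        merge_sort(A, m + 1, hi)
        if A[m] <= A[m + 1]:   # the improvement: halves already in order
            return
        merge(A, lo, m, hi)

    def merge(A, lo, m, hi):
        """Standard merge of the sorted runs A[lo..m] and A[m+1..hi]."""
        left, right = A[lo:m + 1], A[m + 1:hi + 1]
        i = j = 0
        for k in range(lo, hi + 1):
            if j >= len(right) or (i < len(left) and left[i] <= right[j]):
                A[k] = left[i]
                i += 1
            else:
                A[k] = right[j]
                j += 1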

Analyzing MergeSort — a review

Recall that in the previous class we decided that the worst-case time for mergesort is given by the following recurrence relation:

$ T(n) \leq c n + T\left( \lceil n/2 \rceil \right) + T\left( \lfloor n/2 \rfloor \right) $

We also showed how (with the help of Mathematica) to determine that this recurrence is $O(n \lg n)$.
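In lieu of Mathematica, a quick numeric sanity check (not a proof) is to evaluate the recurrence directly and watch the ratio $T(n)/(n \lg n)$ stay bounded. Here I assume $c = 1$, take $T(1) = 1$, and treat the inequality as an equality:

    # Evaluate T(n) = n + T(ceil(n/2)) + T(floor(n/2)), with T(1) = 1.
    from functools import lru_cache
    from math import log2

    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        # (n + 1) // 2 and n // 2 are ceil(n/2) and floor(n/2)
        return n + T((n + 1) // 2) + T(n // 2)

    for k in range(1, 21):
        n = 2 ** k
        print(n, round(T(n) / (n * log2(n)), 3))  # ratio stays bounded near 1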

Normally, our next step would be to determine a lower bound on the worst-case running time of mergesort itself. However, I decided we'd do something more ambitious, that is ...

Prove that any comparison-based sorting algorithm has $T_w(n) \in \Omega(n \lg n)$

By "prove that any comparison-based sorting algorithm has $T_w(n) \in \Omega(n \lg n)$", I mean that any algorithm, including any super-clever advanced algorithm produced by future generations. That's pretty ambitious, no? How can I prove what future generations can't do?

One of the things that makes analyzing algorithms difficult is that they are dynamic: an algorithm's state changes over the course of a run. With these sorting algorithms, however, it is possible to talk about the algorithm as a static object, separate from any particular run. We saw how with an example: I showed in class how to construct the tree $T_3$ of all possible ways insertion sort can run on a three-element array.

One very interesting feature of this tree is that the worst-case running time for insertion sort on three elements is given by the height of the tree. This will be important next class! Another important thing to note is the leaves: these represent the final order resulting from a run of insertion sort. You'll notice that all $3! = 6$ permutations of x, y, z appear as leaves. This is in fact necessary: since any ordering of the input is possible, there must be at least one path leading to each permutation.
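Here is a sketch (my own construction, with assumed details) that recovers both facts about $T_3$ by tracing insertion sort's comparisons on all six possible inputs: the distinct traces are the root-to-leaf paths, the longest trace is the height, and the leaves are exactly the $3! = 6$ permutations.

    # Trace insertion sort's comparisons on every ordering of three keys.
    # Labels x, y, z name the input positions A[0], A[1], A[2].
    from itertools import permutations

    def insertion_sort_trace(pairs):
        """Sort (label, key) pairs; return the comparison trace and
        the final left-to-right order of the labels."""
        A = list(pairs)
        trace = []
        for i in range(1, len(A)):
            j = i
            while j > 0:
                in_order = A[j - 1][1] <= A[j][1]
                trace.append((A[j - 1][0], '<=' if in_order else '>', A[j][0]))
                if in_order:
                    break
                A[j - 1], A[j] = A[j], A[j - 1]
                j -= 1
        return tuple(trace), tuple(label for label, _ in A)

    paths = {}
    for keys in permutations([1, 2, 3]):
        trace, leaf = insertion_sort_trace(list(zip('xyz', keys)))
        paths[trace] = leaf

    print(sorted(paths.values()))      # all 3! = 6 permutations of x, y, z
    print(max(len(t) for t in paths))  # height of T_3: 3 comparisons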

The key observation is that a sorting algorithm like insertion sort can be characterized by an infinite sequence of such trees: one for each input size.

Tree for Gnomesort

We looked at the tree for Gnomesort with $n=3$, and saw the indications that Gnomesort isn't a very good sorting algorithm: the tree's height was large (meaning a bad worst-case number of comparisons), and it had many comparison nodes with only one outgoing branch (meaning redundant comparisons whose outcomes were already determined).
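A quick experiment backs this up. The sketch below (with my own assumed implementations of both sorts) counts comparisons over all inputs of size 3; on this version of Gnomesort the worst case comes out at 6 comparisons, versus insertion sort's 3.

    # Count comparisons made by Gnomesort and insertion sort on all
    # 3! inputs of size 3 (implementations assumed).
    from itertools import permutations

    def gnome_comparisons(A):
        A, comps, i = list(A), 0, 0
        while i < len(A):
            if i == 0:
                i += 1
                continue
            comps += 1
            if A[i - 1] <= A[i]:
                i += 1
            else:
                A[i - 1], A[i] = A[i], A[i - 1]
                i -= 1                 # back up and re-compare
        return comps

    def insertion_comparisons(A):
        A, comps = list(A), 0
        for i in range(1, len(A)):
            j = i
            while j > 0:
                comps += 1
                if A[j - 1] <= A[j]:
                    break
                A[j - 1], A[j] = A[j], A[j - 1]
                j -= 1
        return comps

    inputs = list(permutations([1, 2, 3]))
    print(max(gnome_comparisons(p) for p in inputs))      # 6
    print(max(insertion_comparisons(p) for p in inputs))  # 3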