P1

Consider linearSearch and binarySearch from the Unit 1 notes. The analysis in Section 8 shows that the worst case running time, in terms of number of operations, for linearSearch on an array of size $n$ (let's call this $T_{ls,\text{worst}}(n)$) is \[ T_{ls,\text{worst}}(n) = 2n + 1 \text{ operations} \] and the worst case running time for binarySearch on an array of size $n$ (let's call this $T_{bs,\text{worst}}(n)$) is \[ T_{bs,\text{worst}}(n) = 5 \lg n + 4 \text{ operations} \] whereas the best cases are \[ T_{ls,\text{best}}(n) = 3 \text{ operations} \] for linearSearch and \[ T_{bs,\text{best}}(n) = 5 \lg n + 4 \text{ operations} \] for binarySearch. Given that, here are some T/F questions to see whether you understand the meanings of $O$, $\Theta$ and $\Omega$ and how they're used to describe running time functions:
  1. $T_{ls,\text{worst}}(n) \in \Theta(n)$
  2. $T_{ls,\text{worst}}(n) \in \Omega(1)$
  3. $T_{bs,\text{worst}}(n) \in O(n)$
  4. $T_{bs,\text{worst}}(n) \in \Omega(\lg n)$
  5. $T_{ls,\text{best}}(n) \in \Omega(n)$
  6. $T_{ls,\text{best}}(n) \in \Theta(1)$
  7. $T_{ls,\text{best}}(n) \in O(n)$
  8. $T_{bs,\text{best}}(n) \in \Omega(n)$
  9. $T_{bs,\text{best}}(n) \in O(\lg n)$
  10. $T_{bs,\text{best}}(n) \in O(n)$
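
As a quick sanity check on the first statement, using the standard definition of $\Theta$: $T_{ls,\text{worst}}(n) \in \Theta(n)$ means there exist constants $c_1, c_2 > 0$ and a threshold $n_0$ such that $c_1 n \le 2n + 1 \le c_2 n$ for all $n \ge n_0$. Taking $c_1 = 2$, $c_2 = 3$ and $n_0 = 1$ works, since $2n \le 2n + 1$ always and $2n + 1 \le 3n$ whenever $n \ge 1$. The remaining statements can be checked the same way against the definitions of $O$ and $\Omega$.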

P2

Algorithm: findFirstMissing
Input: $A$ and $B$, both sorted arrays of $n$ ints
Output: $j$, the least index such that $B[j]$ is not an element of $A$, or 'NOT FOUND' if no such element of $B$ exists
j = -1, k = 0
while k ≠ NOTFOUND and j < n-1 do
  j = j + 1
  k = search(A,0,n-1,B[j])
if k = NOTFOUND
  return j
else
  return NOTFOUND

Here we assume "search" is an algorithm for the search problem, with the tweak that search(A,left,right,x) searches A[left],...,A[right], i.e. we don't always start at index 0. We could equally easily use linearSearch, binarySearch or gallopSearch.
	    
I recommend tracing through this algorithm on sample arrays like:
A:  3  5  6 11 15 16 17 20     where n = 8
B:  3  5 11 15 17 18 20 21
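
If you want to check your trace mechanically, here is one possible Java sketch of findFirstMissing (not taken from the notes); it uses java.util.Arrays.binarySearch in the role of the generic search and returns -1 in place of NOT FOUND:

import java.util.Arrays;

public class FindFirstMissing {
    // Returns the least index j such that B[j] is not an element of A,
    // or -1 (standing in for NOT FOUND) if every element of B appears in A.
    static int findFirstMissing(int[] A, int[] B, int n) {
        int j = -1;
        boolean found = true;                        // plays the role of "k != NOTFOUND"
        while (found && j < n - 1) {
            j = j + 1;
            // binarySearch returns a negative value when B[j] does not occur in A[0..n-1]
            found = Arrays.binarySearch(A, 0, n, B[j]) >= 0;
        }
        return found ? -1 : j;
    }

    public static void main(String[] args) {
        int[] A = {3, 5, 6, 11, 15, 16, 17, 20};
        int[] B = {3, 5, 11, 15, 17, 18, 20, 21};
        System.out.println(findFirstMissing(A, B, 8));  // prints 5, since B[5] = 18 is not in A
    }
}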
	    
Here's your problem: give a loop invariant for the while loop that would allow you to prove that this algorithm is correct.

P3

A (directed) forest is a graph that consists of one or more disjoint (directed) trees. A node $s$ in a forest is said to have a unique destination if $s$ and each of its descendants has a single, unique child, until you reach a node with no children, which is the "destination".

Consider the problem Unique Destination that takes a forest on $n$ nodes, $\{0,\ldots,n-1\}$, and a starting node $s$ and returns TRUE if $s$ has a unique destination, and FALSE otherwise. The forest is represented by an array $P$ of length $n$ where $P[i] = j$ means that node $i$'s parent is node $j$, when $j \neq -1$, and $i$ is a root node if $P[i] = -1$. See the image to the right for an example drawn out.
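
For example (this small forest is just an illustration, not the one drawn in the image): $P = [-1, 0, 0, 2, -1]$ says that nodes 0 and 4 are roots, nodes 1 and 2 are children of node 0, and node 3 is a child of node 2. Starting from $s = 2$ the unique destination is node 3, while starting from $s = 0$ there is no unique destination, since node 0 has two children.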

The following algorithm solves the Unique Destination problem. Analyze its best and worst case running time. Note: The outer while loop does indeed terminate, as long as the input really is a forest (i.e. no cycles). If it were to go on too long, you would eventually hit a node you'd already visited, which would mean there was a cycle.

Algorithm: uniqueDest(P,n,s)
Inputs: P an array of n integers such that P[i] = j means that the
          parent of node i is node j if j ≠ -1, and i has no parent otherwise
        s the starting node
Output: TRUE/FALSE a solution to the Unique Destination problem

next = s, count = 1

while count == 1 do
  curr = next
  count = 0, i = 0
  while i < n do     ← in this loop we count the number of children of curr and set next to the most recently seen child
    if P[i] == curr
      count = count + 1
      next = i
    i = i + 1

if count == 0
   return TRUE
else
   return FALSE
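
If you'd like to experiment while working out the running times, here is one possible Java translation of the pseudocode (just a sketch, using the array representation described above and the small example forest from earlier):

public class UniqueDest {
    // P[i] = j means node i's parent is node j; P[i] = -1 means node i is a root.
    static boolean uniqueDest(int[] P, int n, int s) {
        int next = s;
        int count = 1;
        while (count == 1) {
            int curr = next;
            count = 0;
            // count the children of curr and remember the most recently seen one
            for (int i = 0; i < n; i++) {
                if (P[i] == curr) {
                    count = count + 1;
                    next = i;
                }
            }
        }
        return count == 0;  // TRUE exactly when the last node examined has no children
    }

    public static void main(String[] args) {
        int[] P = {-1, 0, 0, 2, -1};
        System.out.println(uniqueDest(P, 5, 2));  // true: 2's only child is 3, which has no children
        System.out.println(uniqueDest(P, 5, 0));  // false: node 0 has two children
    }
}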

P4

One might note from P3 that in the inner loop we don't really need the exact number of children of node "curr". We just need to know whether there are 0, 1, or more than one. So we could break out of that inner while loop early by changing it to:
  while i < n and count < 2 do
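
In the Java sketch from P3, that tweak would look something like this (again, only a sketch):

    static boolean uniqueDestEarlyExit(int[] P, int n, int s) {
        int next = s;
        int count = 1;
        while (count == 1) {
            int curr = next;
            count = 0;
            // stop scanning as soon as we know curr has more than one child
            for (int i = 0; i < n && count < 2; i++) {
                if (P[i] == curr) {
                    count = count + 1;
                    next = i;
                }
            }
        }
        return count == 0;
    }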
      
How would that change the best and worst case running time functions?