Relying on what you learned about runtime analysis in
data structures, what is the worst-case running time of
algorithm linearSearch given below? What is the best-case
running time? Give both in terms of $n$, the number of
elements in the range being searched.
Algorithm: linearSearch
Input: $(A,i,j,x)$, an instance of the Sorted Array Search problem
Output: $k$, an index such that $A[k] = x$, if one exists in
$A[i \ldots j]$, and 'NOT FOUND' otherwise

    k = i
    while k < j and A[k] < x
        k = k + 1
    if k ≤ j and A[k] == x
        return k
    else
        return 'NOT FOUND'
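To experiment with the algorithm before analyzing it, the pseudocode can be translated directly into Python. This is a sketch, not part of the exercise: the function name, inclusive bounds $i$ and $j$, and the 'NOT FOUND' sentinel follow the pseudocode above, but the list-based representation of $A$ is an assumption.

```python
def linear_search(A, i, j, x):
    """Search the sorted slice A[i..j] (inclusive bounds) for x.

    Returns an index k with A[k] == x if one exists, and the
    string 'NOT FOUND' otherwise, mirroring the pseudocode.
    """
    k = i
    # Because A[i..j] is sorted, we can stop as soon as A[k] >= x:
    # no later element can equal x if A[k] already exceeds it.
    while k < j and A[k] < x:
        k = k + 1
    if k <= j and A[k] == x:
        return k
    return 'NOT FOUND'
```

For example, `linear_search([1, 3, 5, 7], 0, 3, 5)` returns `2`, while searching for `4` in the same range returns `'NOT FOUND'`. Counting how many times the while-loop body runs for inputs like these is a good way to check your best- and worst-case answers.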