Data Structures and Algorithms
CS 503

Quiz 2, 23 March 2011


  1. Order the following complexities from best (fastest, least resource use) to worst (slowest, most resource use): O(n^2), O(3^n), O(n^2 log n), O(1), O(n log n), O(n^3), O(n!), O(log n), O(n). Justify your ordering.


    O(1) doesn't change as n increases, which means it will be dominated by all other estimates, which do change (get larger) as n increases.

    Of the remaining upper bounds, O(log n) is the slowest growing, followed by O(n), and then O(n log n).

    O(n^2) grows more slowly than does O(n^3), and O(n^2 log n) has intermediate growth because log n grows faster than does 1 but slower than does n.

    Finally, O(n!) grows faster than does O(3^n) because n! = 1 · 2 · 3 · 4 · ... · n while 3^n = 3 · 3 · 3 · ... · 3; after the first three factors, each factor in n! is strictly larger than the corresponding factor in 3^n.
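
    A small Python sketch (illustrative only, not part of the quiz) makes the ordering concrete: tabulating each bound at a few values of n shows the later entries overtaking the earlier ones, keeping in mind that the ordering is asymptotic, so some pairs only separate once n is large enough.

        import math

        # Growth functions listed in best-to-worst order.
        bounds = [
            ("1",         lambda n: 1),
            ("log n",     lambda n: math.log2(n)),
            ("n",         lambda n: n),
            ("n log n",   lambda n: n * math.log2(n)),
            ("n^2",       lambda n: n ** 2),
            ("n^2 log n", lambda n: n ** 2 * math.log2(n)),
            ("n^3",       lambda n: n ** 3),
            ("3^n",       lambda n: 3 ** n),
            ("n!",        lambda n: math.factorial(n)),
        ]

        for n in (4, 16, 64):
            print(f"n = {n:2d}: " + "  ".join(f"{f(n):.3g}" for _, f in bounds))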

  2. What are the tight, worst-case asymptotic estimates for the functions f1(n) = 3n log n + log n, f2(n) = 2^n + n^3 + 25, and f3(k, n) = k + n, where k ≤ n? Order the resulting estimates from best to worst. Justify your answers.


    Given f1(n) = 3n log n + log n, n log n grows faster than does log n, which means 3n log n + n log n = 4n log n grows faster than does f1, and so f1 = O(n log n).

    Given f2(n) = 2^n + n^3 + 25, the 25 can be ignored because it's constant. n^3 always has three factors, each equal to n, while 2^n has n factors, each equal to 2. However, 2^n can also be thought of as having three factors, each equal to 2^(n/3). Once n > 3 log n (that is, once n/3 > log n), each factor 2^(n/3) is larger than n, and so 2^n = (2^(n/3))^3 > n^3. For example, n = 30 qualifies because log 30 < 5, and so 3 log 30 < 15 < 30. From this, f2(n) = O(2^n).
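
    The crossover between n^3 and 2^n can be checked directly with a minimal sketch in exact integer arithmetic: 2^n wins at n = 1, n^3 pulls ahead for n = 2 through 9, and from n = 10 on (2^10 = 1024 > 1000 = 10^3) the exponential dominates for good.

        for n in range(1, 16):
            print(n, 2 ** n, n ** 3, 2 ** n > n ** 3)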

    Given f3(k, n) = k + n, the point to remember is that growth is determined as the inputs grow large (approach infinity, more technically). In this case there are two inputs, and both grow, which means f3 = O(k + n). However, because k ≤ n, the worst case is k growing as fast as n, in which case k + n ≤ n + n = 2n, and so f3 = O(n) too.

    The best-to-worst ordering for the functions is f3 (O(n), or O(k + n)), f1 (O(n log n)), and f2 (O(2^n)).

  3. Procedure p adds two square matrices of size n in time proportional to O(n^2). Procedure q uses procedure p to add m square matrices, each of size n. What is q’s worst-case asymptotic complexity? Justify your answer.


    The number of matrix additions performed by procedure q is proportional to O(m). Each addition performs work proportional to O(n^2). The total work performed by q is then O(m) · O(n^2) = O(mn^2). Unlike the previous question, it would be dangerous to assume that m grows at the same rate as n because the problem assigns the parameters different interpretations (the number of matrices vs the matrix size).
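
    The quiz gives no code for p or q, but a minimal Python sketch of one plausible shape (the names and the list-of-lists matrix representation are assumptions) shows where the O(m) and O(n^2) factors come from.

        def p(a, b):
            """Add two n-by-n matrices: n^2 element additions, hence O(n^2)."""
            n = len(a)
            return [[a[i][j] + b[i][j] for j in range(n)] for i in range(n)]

        def q(matrices):
            """Fold p over m matrices: m - 1 calls to p, hence O(m) * O(n^2) = O(mn^2)."""
            total = matrices[0]
            for mat in matrices[1:]:
                total = p(total, mat)
            return total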

  4. Algorithms A1 and A2 solve the same problem. A1 runs in time proportional to f1(n) = 400n; A2 runs in time proportional to f2(n) = n^2. What are the worst-case asymptotic estimates for the time of these two algorithms? For what values of n might it make sense to use the algorithm with the higher complexity? Justify your answers.


    f1(n) is linear in n, while f2(n) is quadratic in n. Assuming an interest in speed, it makes sense to use the faster algorithm. From the equations, A1 is faster than A2 when 400n ≤ n^2 or, canceling an n from both sides, when 400 ≤ n. Conversely, when n < 400 the quadratic algorithm A2 is the faster of the two, so the higher-complexity algorithm makes sense for small inputs.
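
    Evaluating both running times around the crossover confirms the arithmetic; a small sketch:

        # Below n = 400 the quadratic A2 is faster; from n = 400 on, the
        # linear A1 is at least as fast.
        for n in (100, 399, 400, 401, 1000):
            winner = "A2" if n * n < 400 * n else "A1"
            print(f"n = {n:4d}: A1 = {400 * n:7d}, A2 = {n * n:7d} -> {winner}")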

  5. True or false: an algorithm without loops always has worst-case constant resource use, no matter what resource is counted. Justify your answer.


    False, because straight-line code could call a subroutine, which could contain loops.
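
    For instance (an assumed example, not from the quiz), the following Python function contains no loops of its own, yet it uses O(n) time on a list of length n because the built-in sum iterates over every element.

        def no_loops(xs):
            # No loop appears here, but sum walks all len(xs) elements.
            return sum(xs)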


This page last modified on 2011 March 28.