Lecture Notes for CS 512, Algorithm Design

Asymptotic Estimation, 1 March 2000


  1. how to characterize an algorithm's behavior?

  2. how to compare two algorithms?

  3. measure them

    1. good - definitive

    2. bad - extra work, non-portable, detailed, requires an implementation, comparisons hard
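A minimal sketch of direct measurement, illustrating the drawbacks above: the timings below are specific to one machine, one input, and one implementation, so they don't generalize. (The bubble sort here is a hypothetical stand-in for "an algorithm you implemented yourself".)

```python
import random
import time

def bubble_sort(a):
    # Deliberately slow quadratic sort, as a stand-in implementation.
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def measure(sort, data):
    # Wall-clock time of one run -- "definitive" only for this
    # machine, this input, and this implementation.
    start = time.perf_counter()
    sort(data)
    return time.perf_counter() - start

data = [random.random() for _ in range(2000)]
t_bubble = measure(bubble_sort, data)
t_builtin = measure(sorted, data)
```

Comparing `t_bubble` and `t_builtin` tells you something real, but only about this run; change the machine or the input size and the constants change with it, which is exactly the motivation for a more abstract estimator.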

  4. need a more abstract estimator

    1. asymptotic or big-oh notation

    2. describes approximate behavior, hiding implementation details behind constants

    3. behaviors as functions of input size, other details abstracted away.

    4. examine what happens as n grows large, that is, approaches infinity asymptotically

    5. best, average, and worst case inputs

      1. generally, smaller estimates are better; worst-case inputs give the largest estimates

      2. best case is usually too optimistic; average case usually too difficult to characterize; worst case is the usual choice
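The three cases can be seen on one concrete algorithm. A sketch using linear search, counting comparisons rather than time so the result is machine-independent:

```python
def linear_search(a, target):
    # Returns (index, number of comparisons); index -1 if absent.
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

a = list(range(100))                # n = 100
_, best = linear_search(a, 0)       # best case: target first, 1 comparison
_, typical = linear_search(a, 49)   # a middling input: about n/2
_, worst = linear_search(a, -1)     # worst case: target absent, n comparisons
```

Here best = 1, typical = 50, worst = 100: the same algorithm, three different costs depending only on the input chosen.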

  5. order notation

    1. big omicron

      1. characterizes an upper bound

      2. g is O(f) if there exist constants c > 0 and N > 0 such that for all n > N, g(n) <= cf(n)
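The definition can be checked numerically for a particular pair of functions. A sketch with g(n) = 3n + 5 and f(n) = n, using the hand-picked witnesses c = 4 and N = 5 (any n > 5 gives 3n + 5 <= 4n):

```python
def g(n): return 3 * n + 5   # the function being bounded
def f(n): return n           # the bounding function

c, N = 4, 5                  # witnesses chosen by hand
# Check g(n) <= c*f(n) for all n > N (up to a finite horizon).
assert all(g(n) <= c * f(n) for n in range(N + 1, 10_000))
```

A finite check is not a proof, but it makes the roles of c and N concrete: c absorbs the constant factors, N skips the small inputs where the bound fails.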

      3. because constants don't matter, use simplest functions; O(n), not O(2n)

      4. monotonic functions; exponentials grow faster than polynomials; n grows faster than log n
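The growth-rate orderings above only hold "eventually", which small inputs can hide. A short numeric illustration:

```python
import math

# Exponentials overtake polynomials only eventually:
assert 2 ** 10 < 10 ** 10      # at n = 10 the polynomial is still ahead
assert 2 ** 100 > 100 ** 10    # by n = 100 the exponential dominates

# n dominates log n from the start:
assert all(n > math.log(n) for n in range(1, 1000))
```

This is why the definitions include the threshold N: the ordering is about what happens past some point, not everywhere.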

      5. addition and multiplication of functions
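The sum and product rules correspond directly to code shape: sequential loops add their costs, nested loops multiply them. A sketch counting steps:

```python
def sequential(n, m):
    # Two loops in sequence: n + m steps, i.e. O(n + m),
    # which simplifies to O(max(n, m)).
    steps = 0
    for _ in range(n):
        steps += 1
    for _ in range(m):
        steps += 1
    return steps

def nested(n, m):
    # One loop inside the other: n * m steps, i.e. O(n * m).
    steps = 0
    for _ in range(n):
        for _ in range(m):
            steps += 1
    return steps

assert sequential(100, 50) == 150
assert nested(100, 50) == 5000
```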

      6. tight and loose bounds

    2. big omega

      1. characterizes a lower bound

      2. g is Omega(f) if there exist constants c > 0 and N > 0 such that for all n > N, g(n) >= cf(n)
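The same numeric-check sketch works for the lower bound, with the inequality reversed. Reusing g(n) = 3n + 5 and f(n) = n, the witnesses c = 3 and N = 1 suffice (3n + 5 >= 3n always):

```python
def g(n): return 3 * n + 5   # the function being bounded
def f(n): return n           # the bounding function

c, N = 3, 1                  # witnesses chosen by hand
# Check g(n) >= c*f(n) for all n > N (up to a finite horizon).
assert all(g(n) >= c * f(n) for n in range(N + 1, 10_000))
```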

    3. big theta

      1. tight bounds

      2. g(n) bounded above and below by constant multiples of f(n); g is both O(f) and Omega(f)
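Combining the two previous checks gives the Theta sketch: one pair of constants squeezes g between two multiples of f. With g(n) = 3n + 5 and f(n) = n, take c1 = 3, c2 = 4, N = 5:

```python
def g(n): return 3 * n + 5
def f(n): return n

c1, c2, N = 3, 4, 5          # lower and upper witnesses, chosen by hand
# Check c1*f(n) <= g(n) <= c2*f(n) for all n > N (finite horizon).
assert all(c1 * f(n) <= g(n) <= c2 * f(n) for n in range(N + 1, 10_000))
```

So 3n + 5 is Theta(n): it grows like n, up to constant factors.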

    4. little omicron and omega

      1. strict, non-tight bounds: g is o(f) when g grows strictly slower than f, and omega(f) when strictly faster
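One way to see the strictness numerically: when g is o(f), the ratio g(n)/f(n) shrinks toward zero, so no constant multiple of f is ever tight. A sketch with log n = o(n):

```python
import math

# The ratio log(n)/n keeps shrinking as n grows:
ratios = [math.log(n) / n for n in (10, 100, 1000, 10 ** 6)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))
assert ratios[-1] < 0.001
```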

    5. keep your eye on the constants

  6. time and space complexity - or any other resource of interest

  7. don't confuse

    1. best, average, and worst case with the asymptotic estimators

    2. an Omega() lower bound on a particular algorithm with a lower bound on the problem itself


This page last modified on 21 March 2000.