Lecture Notes for Advanced Programming II

16 October 2001 - Asymptotic Estimates


  1. algorithms and programs

    1. programs come from algorithms

    2. good programs come from good algorithms; good programs never come from bad algorithms

  2. algorithm analysis

    1. there are many algorithms for any given problem, all different

    2. which algorithm should be judged suitable - it depends on your criteria

    3. start with performance

  3. performance analysis

    1. execution-time measurements

      1. good - definitive

      2. bad - extra work, non-portable, detailed, requires implementation, comparisons hard

  4. estimated performance analysis

    1. ignore the details, go for the big picture

      1. algorithm behavior as functions of input size n, other details abstracted away

      2. examine what happens as n grows large, that is, approaches infinity

      3. what kind of output - behavior in time or space; messages; errors; dollars

      4. what kind of input

    2. asymptotic estimates

      1. an upper bound on program behavior - no more than

      2. O(f) - an estimate no worse (larger) than a constant multiple of f

    3. a function hierarchy

      1. constant - O(1), hash table searching

      2. log - O(log n), binary search

      3. linear - O(n), linear search

      4. log-linear - O(n log n), heapsort

      5. quadratic, square - O(n^2), bubble sort

      6. cubic - O(n^3), matrix multiplication

      7. exponential - O(2^n), optimal route finding

      8. polynomial functions - quadratic, cubic, and so on

      9. the boundary between polynomial and exponential functions
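A quick way to see the hierarchy separate is to evaluate each growth function at a few sizes. This is a sketch; the function and column names are mine, not from the notes:

```python
import math

# Evaluate each growth function in the hierarchy at the given sizes.
def growth_table(ns):
    rows = []
    for n in ns:
        rows.append({
            "n": n,
            "log n": math.log2(n),
            "n log n": n * math.log2(n),
            "n^2": n ** 2,
            "n^3": n ** 3,
            "2^n": 2 ** n,
        })
    return rows

# Already at n = 32 the columns separate dramatically:
# log n = 5, n log n = 160, n^2 = 1024, n^3 = 32768, 2^n = 4294967296.
table = growth_table([8, 16, 32])
```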

    4. manipulating asymptotic estimates

      1. the larger estimate swallows up the smaller one - O(f) + O(g) = O(max(f, g))
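The swallowing rule can be checked directly against the definition of big-oh. A sketch (the witness constants c and n0 are mine): f(n) = n^2 + n is O(n^2) because n^2 dominates for large n.

```python
# f(n) = n^2 + n: the n^2 term swallows the n term.
def f(n):
    return n * n + n

# Witness constants for the big-oh definition: for all n >= n0,
# f(n) <= c * n^2, so f is O(n^2).
c, n0 = 2, 1
assert all(f(n) <= c * n * n for n in range(n0, 1000))
```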

  5. structural analysis

    1. programs have a regular structure - sequence, loop, choice

    2. use the structure to derive asymptotic estimates

      1. assign behavior to smaller units, then combine units following algorithm structure to find behavior of larger units.

      2. statements

        1. usually O(1), but know your data structures - eg a[i] vs. put_queue(q, i) vs. put_priority_queue(q, i)
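The cost of a single statement depends on the data structure behind it. A sketch using Python's standard-library heapq as the priority queue (the variable names are mine): array indexing is constant time, while a binary-heap insert or removal is logarithmic.

```python
import heapq

a = [10, 20, 30]
x = a[1]                     # array indexing: O(1)

q = []
heapq.heappush(q, 3)         # binary-heap insert: O(log n)
heapq.heappush(q, 1)
heapq.heappush(q, 2)
smallest = heapq.heappop(q)  # removes the minimum: O(log n)
```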

      3. statement sequence - s1 ; s2

        1. s1 is O(f), s2 is O(g), so s1; s2 is O(f) + O(g)

        2. but what is O(f) + O(g)

          1. treat O(f) as a set of functions, pick a representative function and apply the definition of big-oh.

          2. plus, max, constant

          3. example: i <- 1 ; heapify(h, i)
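The example above can be sketched in Python; sift_down is my own illustrative version of the single-node heapify the notes refer to (heapq has no public equivalent):

```python
def sift_down(h, i):
    """Restore the min-heap property at index i; O(log n), since the
    node moves down at most one tree level per iteration."""
    n = len(h)
    while True:
        left, right, smallest = 2 * i + 1, 2 * i + 2, i
        if left < n and h[left] < h[smallest]:
            smallest = left
        if right < n and h[right] < h[smallest]:
            smallest = right
        if smallest == i:
            return
        h[i], h[smallest] = h[smallest], h[i]
        i = smallest

# The notes' sequence: i <- 1 ; heapify(h, i)
h = [0, 9, 2, 3, 4, 5, 6]
i = 1            # O(1) assignment
sift_down(h, i)  # O(log n); the sequence is O(1) + O(log n) = O(log n)
```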

      4. loops - do b s od

        1. s gets executed some number of times - asymptotically bound the number of times

        2. O(f)*O(g) - what is that

        3. implicit, and occasionally troublesome, assumption - the loop itself does constant work per iteration.
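The loop rule O(f)*O(g) can be made concrete by counting steps: two nested linear loops do an O(1) body O(n)*O(n) times, i.e. O(n^2). A sketch (the function name is mine):

```python
# Count the O(1) body executions of two nested loops over range(n).
def count_pairs(n):
    steps = 0
    for i in range(n):       # outer loop: executes n times
        for j in range(n):   # inner loop: O(n) body per outer pass
            steps += 1       # O(1) work
    return steps             # exactly n * n steps, so O(n^2)
```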

      5. if statement - if b1 s1 or b2 s2 fi

        1. execute either s1 or s2 - which one

        2. don't forget b1 and b2
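Since only one branch runs, the if statement is bounded by the costlier branch plus the cost of evaluating the guards: O(b1 + b2 + max(f1, f2)). A sketch under my own names:

```python
# One branch does an O(n) linear scan, the other an O(1) check; the
# statement as a whole is O(1) + max(O(n), O(1)) = O(n).
def lookup(data, key, use_scan):
    if use_scan:                 # guard: O(1)
        return key in data       # linear scan of a list: O(n)
    else:
        return data[0] == key    # O(1) branch
```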

      6. what about subroutine calls and recursion

        1. non-recursive - treat like a statement; guess or know the estimate

        2. recursive - later for that; recurrence relations
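A non-recursive call is treated like any other statement whose estimate you know or guess. A sketch (names are mine): a loop that calls an O(log n) routine n times is O(n) * O(log n) = O(n log n).

```python
import heapq

# Build a heap by n O(log n) inserts: the loop is O(n log n).
def heap_from(items):
    h = []
    for x in items:          # executes n times
        heapq.heappush(h, x) # treat the call as an O(log n) statement
    return h

h = heap_from([4, 1, 3, 2])
```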

  6. things to keep in mind

    1. keep an eye on the constants

      1. asymptotically identical algorithms

      2. for small n, constants are more important - linear search is usually good enough
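Counting comparisons rather than wall-clock time makes the small-n point concrete. A sketch (both counting functions are mine): binary search is asymptotically better, yet on a four-element array a lucky linear search wins.

```python
# Comparisons a linear search makes before finding x (or giving up).
def linear_search_comparisons(a, x):
    for count, v in enumerate(a, start=1):
        if v == x:
            return count
    return len(a)

# Probes a binary search makes before finding x (or giving up).
def binary_search_comparisons(a, x):
    lo, hi, count = 0, len(a), 0
    while lo < hi:
        mid = (lo + hi) // 2
        count += 1
        if a[mid] < x:
            lo = mid + 1
        elif a[mid] > x:
            hi = mid
        else:
            return count
    return count

# For n = 4 and x at the front, linear search needs 1 comparison
# while binary search still probes 3 times.
```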

    2. estimate the right things - what's important: the cost of an expensive algorithm, or the number of times it's called

    3. look at the right inputs - what's driving the algorithm; worrying about execution time on communication-bound computations; the intel-investors' fallacy


This page last modified on 6 December 2001.