- algorithms and programs
- programs come from algorithms
- good programs come from good algorithms, good programs never come from
bad algorithms
- algorithm analysis
- there are many algorithms, all different
- which algorithm can be judged suitable - it depends on your criteria
- start with performance
- performance analysis
- execution-time measurements
- good - definitive
- bad - extra work, non-portable, detailed, requires implementation,
comparisons hard
- estimated performance analysis
- ignore the details, go for the big picture
- algorithm behavior as functions of input size n, other details
abstracted away
- examine what happens as n grows large, that is, approaches
infinity
- what kind of output - behavior in time or space; messages; errors;
dollars
- what kind of input
- asymptotic estimates
- an upper bound on program behavior - no more than
- O(f) - an estimate no worse (larger) than f
- a function hierarchy
- constant - O(1), hash table searching
- log - O(log n), binary search
- linear - O(n), linear search
- log-linear - O(n log n), heapsort
- quadratic, square - O(n^2), bubble sort
- cubic - O(n^3), matrix multiplication
- exponential - O(2^n), optimal route finding
- polynomial functions - quadratic, cubic, and so on
- the boundary between polynomial and exponential functions
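The hierarchy above can be made concrete by tabulating each function for a few input sizes; the numbers show why the polynomial/exponential boundary matters long before n gets large. A minimal sketch (the function name `growth_table` is illustrative, not from the notes):

```python
import math

def growth_table(sizes):
    """Evaluate each function in the hierarchy for the given sizes."""
    rows = []
    for n in sizes:
        rows.append({
            "n": n,
            "log n": round(math.log2(n), 1),
            "n log n": round(n * math.log2(n)),
            "n^2": n * n,
            "n^3": n ** 3,
            "2^n": 2 ** n,   # exponential: dwarfs every polynomial column
        })
    return rows

for row in growth_table([10, 20, 40]):
    print(row)
```

At n = 40 the quadratic column is 1600 while the exponential column already exceeds a trillion.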
- manipulating asymptotic estimates
- the larger estimate swallows up the smaller estimate
- structural analysis
- programs have a regular structure - sequence, loop, choice
- use the structure to derive asymptotic estimates
- assign behavior to smaller units, then combine units following
algorithm structure to find behavior of larger units.
- statements
- usually O(1), but know your data structures - eg a[i]
vs. put_queue(q, i) vs. put_priority_queue(q, i)
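The "know your data structures" point can be sketched in Python: indexing an array is O(1), but pushing onto a binary-heap priority queue costs O(log n) per operation, even though both look like single statements (the variable names here are illustrative):

```python
import heapq

a = [5, 3, 8, 1]
x = a[2]                      # O(1): direct array indexing

pq = []
for v in a:
    heapq.heappush(pq, v)     # O(log n): new item may bubble up the heap
smallest = heapq.heappop(pq)  # O(log n): root removed, heap restructured
```

A statement's syntactic simplicity says nothing about its cost; the data structure behind it does.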
- statement sequence - s1; s2
  - s1 is O(f), s2 is O(g), so s1; s2 is O(f) + O(g)
- but what is O(f) + O(g)
- treat O(f) as a set of functions, pick a representative
function and apply the definition of big-oh.
- plus, max, constant
- example: i <- 1 ; heapify(h, i)
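The example can be sketched as code: the assignment is O(1), and a single sift-down heapify walks one root-to-leaf path, so it is O(log n); the max rule collapses O(1) + O(log n) to O(log n). A minimal sketch, assuming an array-based max-heap and 0-based indexing:

```python
def heapify(h, i):
    """Sift h[i] down until the max-heap property holds below it."""
    n = len(h)
    while True:
        largest = i
        left, right = 2 * i + 1, 2 * i + 2
        if left < n and h[left] > h[largest]:
            largest = left
        if right < n and h[right] > h[largest]:
            largest = right
        if largest == i:
            return
        h[i], h[largest] = h[largest], h[i]
        i = largest            # follow one path toward a leaf: O(log n) steps

h = [1, 9, 8, 4, 7, 2]
i = 0          # O(1)
heapify(h, i)  # O(log n); sequence total is O(1) + O(log n) = O(log n)
```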
- loops - do b s od
- s gets executed some number of times - asymptotically bound the
number of times
- O(f)*O(g) - what is that
- implicit, and occasionally troublesome, assumption - the loop itself
does constant work per iteration.
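The product rule above can be sketched directly: an O(n) loop bound around an O(n) body gives O(n) * O(n) = O(n^2) total work. The step counter makes the quadratic count visible (the function name is illustrative):

```python
def pair_count(n):
    """Count the basic steps of two nested O(n) loops."""
    steps = 0
    for i in range(n):        # outer loop executes O(n) times
        for j in range(n):    # inner loop does O(n) work per iteration
            steps += 1        # O(1) body; loop bookkeeping is also O(1)
    return steps

print(pair_count(10))  # 100 = n^2
```

Note the constant-work assumption from the notes: the range bookkeeping itself is O(1) per iteration, so it disappears into the estimate.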
- if statement - if b1 s1 or b2 s2 fi
  - execute either s1 or s2 - which one
- don't forget b1 and b2
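The guard-cost warning can be sketched as follows: the cost of an if statement is the guard evaluation plus the worse of the branches. Here the guard is an O(n) membership test on a list, so it dominates the O(1) branches: O(n) + max(O(1), O(1)) = O(n). The names are illustrative:

```python
def classify(x, seen):
    """Return whether x has been seen before; seen is an unsorted list."""
    if x in seen:        # guard b1: O(n) linear scan of the list
        return "old"     # s1: O(1)
    else:
        return "new"     # s2: O(1)

print(classify(3, [1, 2, 3]))  # old
```

Forgetting the guard would misreport this as O(1).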
- what about subroutine calls and recursion
- non-recursive - treat like a statement; guess or know the estimate
- recursive - later for that; recurrence relations
- things to keep in mind
- keep an eye on the constants
- asymptotically identical algorithms
- for small n, constants are more important - linear search is
usually good enough
- estimate the right things - what's important; an expensive algorithm,
or the number of times it's called
- look at the right inputs - what's driving the algorithm; worrying about
execution time on communication-bound computations; the
intel-investors' fallacy
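The earlier point about constants for small n can be seen by putting the two searches side by side: linear search is O(n) and binary search O(log n), but on a four-element array the constant factors dominate and the simpler loop is usually good enough. A sketch of both:

```python
import bisect

def linear_search(a, x):
    """O(n), but trivial constants: one comparison per element."""
    for i, v in enumerate(a):
        if v == x:
            return i
    return -1

def binary_search(a, x):
    """O(log n), but a must be sorted and each step does more work."""
    i = bisect.bisect_left(a, x)
    return i if i < len(a) and a[i] == x else -1

a = [2, 5, 7, 11]
print(linear_search(a, 7), binary_search(a, 7))  # 2 2
```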
This page last modified on 6 December 2001.