Lecture Notes for CS 512, Algorithm Design
Asymptotic Estimation, 1 March 2000
- how to characterize an algorithm's behavior?
- how to compare two algorithms?
- measure them
- good - definitive
- bad - extra work, non-portable, overly detailed, requires an
implementation, comparisons are hard (see the timing sketch below)
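- a minimal timing sketch (an illustration, not from the original
notes; Python's built-in sorted stands in for whatever implementation
is under test):

    import random
    import time

    def measure(sort_fn, n, trials=5):
        """Best wall-clock time over several trials, sorting a fresh
        random list of n elements each time."""
        best = float("inf")
        for _ in range(trials):
            data = [random.random() for _ in range(n)]
            start = time.perf_counter()
            sort_fn(data)
            best = min(best, time.perf_counter() - start)
        return best

    # definitive for THIS machine and input distribution, but the
    # numbers do not transfer to other hardware or workloads
    for n in (1_000, 10_000, 100_000):
        print(n, measure(sorted, n))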
- need a more abstract estimator
- asymptotic or big-oh notation
- presents approximate behavior, ignoring details; constant factors
stand in for the details
- expresses behavior as a function of the input size, with other
details abstracted away
- examine what happens as n grows large, that is, approaches infinity
asymptotically
- best, average, and worst case inputs
- generally, smaller estimates are better, but worst-case inputs give
the largest estimates
- best case is usually unrealistically optimistic; average case is
usually too difficult to characterize; worst case is the usual choice
(see the linear-search sketch below)
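- a standard illustration (not from the notes) - linear search, where
the input, not the algorithm, determines the case:

    def linear_search(xs, target):
        """Return the index of target in xs, or -1, scanning left
        to right."""
        for i, x in enumerate(xs):
            if x == target:
                return i
        return -1

    xs = list(range(100))
    # best case: target in position 0 -- one comparison
    # worst case: target absent -- n comparisons
    # average case (target uniform over xs): about n/2 comparisons
    print(linear_search(xs, 0), linear_search(xs, -1))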
- order notation
- big omicron
- characterizes an upper bound
- g is O(f) if there exist constants c > 0 and N > 0 such that
for all n > N, g(n) <= cf(n) (checked numerically in the sketch
after this list)
- because constants don't matter, use the simplest function: O(n),
not O(2n)
- monotonic functions; exponentials grow faster than polynomials;
n grows faster than log n
- addition and multiplication of functions: O(f) + O(g) = O(f + g) =
O(max(f, g)); O(f) * O(g) = O(f * g)
- tight and loose bounds
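- a numeric spot-check (illustrative only) of the definition:
g(n) = 3n + 10 is O(n) with witnesses c = 4 and N = 10:

    def g(n):
        return 3 * n + 10

    c, N = 4, 10  # witnesses for the definition
    # the definition demands g(n) <= c*n for ALL n > N;
    # spot-check a finite range
    assert all(g(n) <= c * n for n in range(N + 1, 10_000))
    print("3n + 10 is O(n): c = 4, N = 10 (checked to n = 9999)")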
- big omega
- characterizes a lower bound
- g is Omega(f) if there exist constants c > 0 and N > 0 such that
for all n > N, g(n) >= cf(n) (see the check below)
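- the same style of spot-check (illustrative) for a lower bound:
g(n) = n^2/2 - 3n is Omega(n^2) with witnesses c = 1/4 and N = 12:

    def g(n):
        return n * n / 2 - 3 * n

    c, N = 0.25, 12  # witnesses for the definition
    assert all(g(n) >= c * n * n for n in range(N + 1, 10_000))
    print("n^2/2 - 3n is Omega(n^2): c = 1/4, N = 12")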
- big theta
- tight bounds
- g is Theta(f) if g(n) is bounded above and below by constant
multiples of f(n): there exist c1, c2 > 0 and N > 0 such that for
all n > N, c1 f(n) <= g(n) <= c2 f(n) (see the check below)
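- both constants at once (illustrative): g(n) = 2n^2 + n is
Theta(n^2) with witnesses c1 = 2, c2 = 3, N = 1:

    def g(n):
        return 2 * n * n + n

    c1, c2, N = 2, 3, 1  # lower and upper witnesses together
    assert all(c1 * n * n <= g(n) <= c2 * n * n
               for n in range(N + 1, 10_000))
    print("2n^2 + n is Theta(n^2): c1 = 2, c2 = 3, N = 1")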
- little omicron and omega
- strict, non-tight bounds - g is o(f) if g grows strictly slower
than f, that is, g(n)/f(n) goes to 0 as n grows (see the sketch
below)
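- a numeric look (illustrative) at why log n is o(n) - the ratio
log n / n eventually drops below any fixed c > 0:

    import math

    # little-o demands g(n) < c*f(n) eventually for EVERY c > 0,
    # not just one convenient c; the vanishing ratio is the sign
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(n, math.log(n) / n)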
- keep your eye on the constants
- time and space complexity - or any other resource of interest
- don't confuse
- best, average, and worst case with the asymptotic estimators
- an Omega() lower bound on one algorithm's cost with the lower
bound on the problem itself (the best any algorithm can do)