Lecture Notes for CS 325
Implementation Validation and Testing, 19 March 2001
- verification
- objectives - the code works correctly, and the code does the correct thing
- static and dynamic code verification
- static techniques don't require execution
- pluses - simpler, can be automated, handles incomplete code
- minuses - results are less conclusive than observing actual execution
- static and dynamic techniques find different errors
- code reading
- re-abstract the code and compare to the design spec
- a bottom-up procedure
- desk checking to find errors
- code reviews
- more formal, multi-person
- a follow-up technique after other verification and before testing
- designers, implementors, testers
- searching for inconsistencies between design and code
- errors in logic, control, data, and operations
- also non-functional issues such as performance
- static analysis
- tool-based approach to static verification
- cheap because of automation, but effective for the errors it detects
- much static analysis is compiler-based, built on data-flow analysis
- variables assigned to but never used, or used before being assigned to
- other oddities - unused variables, dead code
- simple, fast analysis of limited help
- deeper, more useful analyses can be expensive and are limited by hard problems such as alias analysis
- re-write code to enable static analysis - smaller and simpler
- other static analysis tools
- inter-module consistency checkers - like C or C++ function prototypes
- cross-reference generators or automatic diagrammers
- feature use frequencies
- enforce or evaluate coding styles
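The data-flow check above ("assigned to but not used") can be sketched in a few lines. This is a toy, not a real analyzer - it walks a Python syntax tree and assumes any name read anywhere counts as "used":

```python
import ast

def assigned_but_unused(source):
    """Report names assigned to but never read -- a classic
    data-flow check performed by static analysis tools."""
    tree = ast.parse(source)
    assigned, used = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)   # name appears on the left of =
            elif isinstance(node.ctx, ast.Load):
                used.add(node.id)       # name is read somewhere
    return sorted(assigned - used)

code = "x = 1\ny = 2\nprint(x)"
print(assigned_but_unused(code))   # → ['y']
```

Note that the check runs on incomplete code and needs no test data - the plus side of static techniques listed above.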
- symbolic execution
- simulates real execution using symbolic values in place of real data
- expensive even for small, simple code
- can establish strong and important characteristics of the code - e.g.,
optimizing away subtype polymorphism in OO languages
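A toy illustration of the idea, assuming straight-line assignments only: each variable holds a symbolic expression (a string) rather than a value, and executing an assignment substitutes the current symbolic values into the right-hand side. Real symbolic executors work on proper expression trees and track branch path conditions; naive string substitution is only for illustration:

```python
def sym_exec(stmts, env):
    """Interpret a list of (target, expression) assignments symbolically:
    each right-hand side is rewritten with the current symbolic value
    of every known variable, so inputs never need concrete values."""
    for target, expr in stmts:
        for var, val in env.items():
            expr = expr.replace(var, f"({val})")
        env[target] = expr
    return env

# y = x + 1 ; z = y * y   executed with x left symbolic
env = sym_exec([("y", "x + 1"), ("z", "y * y")], {"x": "x"})
print(env["z"])   # → ((x) + 1) * ((x) + 1)
```

Even this toy shows why the technique is expensive: the symbolic result for z already doubled in size after two statements.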
- proving correctness
- more of an avoidance technique than a detection technique
- most useful as an input while designing the code, rather than as a check afterwards
- Eiffel builds a variant of program correctness into the language as design by contract
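The contract style can be imitated in other languages with assertions - far weaker than Eiffel's require/ensure clauses, but the same spirit. A minimal sketch:

```python
def sqrt_floor(n: int) -> int:
    # precondition -- plays the role of Eiffel's "require"
    assert n >= 0, "precondition: n must be non-negative"
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    # postcondition -- plays the role of Eiffel's "ensure"
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

print(sqrt_floor(10))   # → 3
```

A caller violating the precondition (e.g. sqrt_floor(-1)) fails immediately at the contract, not later with a wrong answer - errors are avoided at the interface rather than detected downstream.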
- dynamic techniques - unit testing
- metrics
- most prior metrics are related to estimating code characteristics
- automated measurements
- size metrics
- measuring size is not simple - loc (lines of code)
- dloc (delivered loc) per working month; kloc (thousands of loc) for totals
- total lines; lines of code; libraries, macros, program generators
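Counting the same file several ways shows why "loc" is not a single well-defined number. A minimal sketch (comment detection here assumes Python-style # comments):

```python
def size_metrics(source):
    """Count total lines, blank lines, comment lines, and code lines --
    four defensible but different answers to 'how big is this?'"""
    lines = source.splitlines()
    total = len(lines)
    blank = sum(1 for line in lines if not line.strip())
    comment = sum(1 for line in lines if line.strip().startswith("#"))
    return {"total": total, "blank": blank,
            "comment": comment, "code": total - blank - comment}

sample = "# add two numbers\n\ndef add(a, b):\n    return a + b\n"
print(size_metrics(sample))   # → {'total': 4, 'blank': 1, 'comment': 1, 'code': 2}
```

Macros and program generators make this worse: is generated code counted at the source size or the expanded size?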
- Halstead metrics
- nopr - number of different operators used
- nopn - number of different operands used
- vocabulary n = nopr + nopn
- Nopr - total number of operators used
- Nopn - total number of operands used
- length N = Nopr + Nopn
- volume V = N log2 n
- high correlation with loc, errors
- some questions as to what operators and operands are
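Given a tokenization (and, as the note above says, deciding what counts as an operator or operand is itself debatable), the metrics are simple arithmetic:

```python
import math

def halstead(operators, operands):
    """Compute Halstead vocabulary n, length N, and volume V from
    full token streams (repeats included)."""
    n = len(set(operators)) + len(set(operands))   # nopr + nopn
    N = len(operators) + len(operands)             # Nopr + Nopn
    V = N * math.log2(n)                           # volume
    return n, N, V

# tokens of:  z = x + y * x
operators = ["=", "+", "*"]
operands  = ["z", "x", "y", "x"]
n, N, V = halstead(operators, operands)
print(n, N, round(V, 2))   # → 6 7 18.09
```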
- complexity metrics
- size is a well correlated measure of complexity
- cyclomatic complexity based on number of decisions
- cyclomatic complexity adjusted for decision complexity
- Halstead measure = average operator frequency * average operand frequency
- live variables
- measure the fan-in to a statement
- higher fan-in means more complexity
- live spans - larger spans mean more complexity
- knot count - count un-structured programming control flows
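Cyclomatic complexity, the first of the measures above, can be approximated by counting decision points in the syntax tree. A sketch for Python source, using the common McCabe approximation of 1 + decisions (each extra operand of a boolean operator counts as a decision, in the spirit of "adjusted for decision complexity"):

```python
import ast

def cyclomatic_complexity(source):
    """Approximate McCabe complexity as 1 + number of decision points."""
    decisions = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.If, ast.While, ast.For, ast.IfExp)):
            decisions += 1                      # one branch point
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1   # and/or add short-circuit branches
    return decisions + 1

code = """
def classify(n):
    if n < 0:
        return "negative"
    if n == 0 or n == 1:
        return "small"
    return "big"
"""
print(cyclomatic_complexity(code))   # → 4
```

Two ifs plus one "or" give three decisions, hence complexity 4 - and, consistent with the notes, the number tends to grow with size.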
- style metrics
- size of modules, procedures, statements, variable names
- counts of variables, constants, comment lines, gotos
- comparison against historical data
- ranges of acceptable metric values
This page last modified on 19 March 2001.