Lecture Notes for CS 325

Design Validation and Metrics, 26 February 2001


  1. metrics

    1. objectives - design (quality) and management (quantity)

      1. catch design errors - omission or commission; relative to the requirements

      2. improve design structure - weak or dangerous structure; promote re-use

      3. management information - measuring resource use

    2. metrics correlate with other important measures, such as error rates or project resource use

    3. coupling metrics

      1. measure the independence of the modules and simplicity of the design

      2. network metrics

        1. module interconnectedness

        2. the more tree-like, the better

        3. for a pure tree: number of boxes - number of arrows = 1

        4. low correlation with error rates or other measures of interest
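A minimal sketch of the tree-likeness check, assuming the design is given as a module-call graph; the module and call names below are invented for illustration.

```python
# Network metric sketch: boxes (modules) minus arrows (calls).
# For a connected design, the result is 1 exactly when the
# structure chart is a tree; extra fan-in pushes it below 1.

def tree_likeness(modules, calls):
    """Return number of boxes minus number of arrows."""
    return len(modules) - len(calls)

modules = ["main", "read", "process", "write"]
calls = [("main", "read"), ("main", "process"), ("main", "write")]
print(tree_likeness(modules, calls))  # 1: a pure tree

# a second caller of "read" adds an arrow without adding a box
calls.append(("process", "read"))
print(tree_likeness(modules, calls))  # 0: no longer a tree
```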

      3. information-flow metrics

        1. augment network metrics with information flowing over the arrows

        2. information in + information out

        3. weighting information flow by connectivity - fan in and fan out

        4. low correlation with error rates or other measures of interest
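One well-known connectivity weighting (due to Henry and Kafura) scores a module as (fan-in x fan-out) squared; the module names and flow counts below are made up.

```python
# Information-flow metric sketch: weight each module by its
# connectivity, (fan_in * fan_out) ** 2, so modules that both
# receive and distribute many flows score highest.

def info_flow_score(fan_in, fan_out):
    return (fan_in * fan_out) ** 2

# hypothetical modules: (fan_in, fan_out)
flows = {"sort": (3, 1), "format": (1, 1), "dispatch": (2, 4)}
scores = {m: info_flow_score(fi, fo) for m, (fi, fo) in flows.items()}
print(scores)  # "dispatch" dominates: (2*4)**2 = 64, vs 9 and 1
```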

      4. stability or simplicity metrics

        1. measure the complexity of the information flowing

        2. simple information exerts minimal influence over other modules
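A toy version of the simplicity idea, assuming each datum crossing a module boundary can be classified as a scalar (simple) or a structure (complex); the classification and weights are invented.

```python
# Stability/simplicity sketch: weight structured data more heavily
# than scalars, since complex information exerts more influence
# on the modules that receive it.

def interface_complexity(params):
    """Sum weights over the kinds of data crossing the interface."""
    weights = {"scalar": 1, "structure": 3}
    return sum(weights[kind] for kind in params)

# passing two integers is simpler than passing one record
print(interface_complexity(["scalar", "scalar"]))  # 2
print(interface_complexity(["structure"]))         # 3
```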

    4. cohesion metrics

      1. difficult to quantify

      2. ideal: all procedures (or methods) use all of the module's variables

      3. information-flow metrics within a module

      4. difficult to apply to functional designs because module internals aren't yet specified
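The "all procedures use all variables" ideal can be sketched as a score: the average fraction of module variables each procedure touches. The module and variable names are hypothetical.

```python
# Cohesion sketch: 1.0 when every procedure uses every module
# variable (the ideal), lower when usage is scattered.

def cohesion(uses, variables):
    """Average fraction of module variables used per procedure."""
    return sum(len(used & variables) for used in uses.values()) \
           / (len(uses) * len(variables))

variables = {"buf", "count"}
uses = {"push": {"buf", "count"}, "pop": {"buf", "count"}}
print(cohesion(uses, variables))  # 1.0: fully cohesive

uses["reset_count"] = {"count"}   # a procedure ignoring "buf"
print(cohesion(uses, variables))  # drops below 1.0
```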

    5. ood metrics

      1. many metrics exist, most of them relatively new

      2. class complexity

        1. evaluate method complexities, sum the evaluations to get class complexity

        2. good correlation with error occurrences

      3. inheritance tree depth

        1. the number of ancestor classes

        2. good correlation with error occurrences

      4. child class count

        1. poor correlation with error occurrences

        2. a re-use measure

      5. include aggregation in the metric

        1. count classes used by a class - variables and parameters

          1. high correlation with error occurrences

        2. adjust by methods actually called
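Two of the metrics above, inheritance-tree depth and child-class count, can be computed directly on a class hierarchy; the shape classes below are an invented example.

```python
# OO metric sketch: inheritance depth (ancestor count) and
# child count (a crude re-use measure) on a toy hierarchy.

class Shape: pass
class Polygon(Shape): pass
class Triangle(Polygon): pass
class Square(Polygon): pass

def inheritance_depth(cls):
    """Number of ancestor classes, excluding the class itself and object."""
    return len(cls.__mro__) - 2

def child_count(cls):
    """Number of immediate subclasses."""
    return len(cls.__subclasses__())

print(inheritance_depth(Triangle))  # 2: Polygon, Shape
print(child_count(Polygon))         # 2: Triangle, Square
```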

    6. automating measurements with case tools and other formal techniques

  2. validation

    1. objectives

      1. catch design errors - omission or commission

      2. improve design structure - weak modularity or re-use

    2. metrics also serve as design validation

    3. design reviews

      1. large reviewing groups - designers, implementors, requirements analysts, testers

      2. varying degrees of formality and tools

      3. find problems, don't fix them during the review; keep criticism constructive

      4. effective at finding problems

      5. logical design reviews can be more informal and need fewer people than architectural design reviews

  3. how to use statistics

    1. comparison against historical data

    2. outliers

    3. standard deviations from the mean
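The last two points can be combined into one rule: flag a module whose metric lies more than a chosen number of standard deviations from the historical mean. The historical values and the 2-sigma threshold below are invented.

```python
# Outlier sketch: compare a new metric value against historical
# data using standard deviations from the mean (z-score).

from statistics import mean, stdev

historical = [12, 15, 11, 14, 13, 16, 12]  # past projects' values
mu, sigma = mean(historical), stdev(historical)

def is_outlier(value, threshold=2.0):
    """True when value lies more than threshold sigmas from the mean."""
    return abs(value - mu) / sigma > threshold

print(is_outlier(14))  # False: close to the historical mean
print(is_outlier(40))  # True: far outside past experience
```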


This page last modified on 26 February 2001.