                         sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======             April  1997             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the worldwide software testing
community.

(c) Copyright 1997 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition
provided that the entire document/file is kept intact and this copyright
notice appears with it.

========================================================================

INSIDE THIS ISSUE:

   o  "SOFTVERF" Mailing List Opportunity

   o  Quality Week 1997 -- Technical Program -- Schedule Order

   o  Test Planning -- Enterprise Wide Year 2000, by Larry W. Cooke

   o  A Purely French Solution to the Year 2000 Problem

   o  Software Quality HotList Updated

   o  Setting Up a Software Inspection Program, by Don O'Neill

   o  TestWorks International Distributors

   o  Evaluating TTN-Online

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

                   SOFTVERF MAILING LIST OPPORTUNITY

The SOFTVERF mailing list is an electronic discourse community for
people interested in formal verification of software.  "Formal" means
having a formal or mathematical basis.  "Verification" means proving
that an implementation corresponds to or satisfies a specification.  It
is often referred to, somewhat inaccurately, as proving software to be
correct.  Related work, such as testing, software engineering, design
methods, etc., is not covered so we can focus on this one aspect.  The
ultimate aim is to make software verification a standard part of
software engineering.

To subscribe, send email to:

        majordomo@lal.cs.byu.edu

The body of the message must contain:

        subscribe softverf

You can unsubscribe at any time.  You should get a confirmation message
quickly, usually within a few minutes.

Messages sent to SOFTVERF are archived at the URL:

        http://lal.cs.byu.edu/mlists/softverf/softverf.html

========================================================================

       QUALITY WEEK 1997 -- TECHNICAL PROGRAM  -- SCHEDULE ORDER

Here is the complete technical program for QW'97 (as of 15 April 1997).
Complete details on QW'97, including on-line registration for this, our
10th Anniversary Event, are available at the QW'97 homepage:

                      http://www.soft.com/QualWeek

    o       o       o       o       o       o       o       o

                           T U T O R I A L S

      Tuesday, 27 May 1997, 8:30 - 12:00 -- AM Half-Day Tutorials

Dr. Boris Beizer (Independent Consultant) "An Overview of Testing (TA)"

Mr. Robert V. Binder (RBSC Corporation) "Test Automation for Object
Oriented Systems (TB)"

Mr. William J. Deibler & Mr. Bob Bamford (Software Systems Quality
Consulting) "Software Engineering Models for Quality: Comparing the SEI
Capability Maturity (CMM) to ISO 9001 (TC)"

Mr. Cem Kaner (Falk & Nguyen) "Software Related Law (TD)"

Mr. Tom Gilb (Independent Consultant) "Optimizing Software Inspections
(TE)"

       Tuesday, 27 May 1997, 1:30 - 5:00 -- PM Half-Day Tutorials

Mr. John Musa (Consultant) "Applying Operational Profiles in Testing
(TF)"

Mr. Bob Poston (Aonix) "10X Testing: Automating Specification-Based
Testing (TG)"

Mr. Matthew Cooke (European Software Institute) "Introduction to SPICE
(ISO 15504 Software Process Assessment) (TH)"

Mr. Nicholas Zvegintzov (Software Management Network) "Testing for Year
2000 (TI)"

Mr. Michael Deck (Cleanroom Software Engineering, Inc.)  "Cleanroom
Development: Formal Methods, Judiciously Applied (TJ)"

    o       o       o       o       o       o       o       o

                   T E C H N I C A L   P R O G R A M

       Wednesday, 28 May 1997, 8:30 - 10:00 -- KEYNOTE SESSION #1

Mr. John Seddon (Consultant) "In Pursuit of Quality: The Case Against
ISO 9000 (1P1)"

Prof. Dorothy Denning "Cyberspace Attacks and Countermeasures: How
Secure IS the Internet? (1P2)"

    o       o       o       o       o       o       o       o

                 Wednesday, 28 May 1997, 10:30 - 5:00
                       Parallel Technical Tracks

TECHNOLOGY TRACK...

Prof. Antonia Bertolino & Martina Marre (IEI-CNR) "A General Path
Generation Algorithm for Coverage Testing (2T1)"

Mr. John Musa (Consultant) "Applying Operational Profiles in Software-
Reliability-Engineered Testing (2T2)"

Dr. James R. Lyle & Dolores R. Wallace (National Institute of Standards
and Technology) "Using the Unravel Program Slicing Tool to Evaluate High
Integrity Software (3T1)"

Hugh McGuire, Prof. Laura K. Dillon & Y. S. Ramakrishan (University of
California, Santa Barbara) "Generating Trace Checkers for Test Oracles
(3T2)"

Mr. Victor Braberman & Ms. Martina Marre & Mr. Miguel Felder
(Universidad de Buenos Aires) "Testing Timing Behaviors of Real Time
Software (4T1)"

Mr. Joachim Wegener & Mr. Matthias Grochtmann (Daimler-Benz AG) "Testing
Temporal Correctness of Real-Time Systems by Means of Genetic Algorithms
(4T2)"

APPLICATIONS TRACK...

Ms. Jennifer Davis & Dr. Daniel Ziskin & Dr. Bryan Zhou (NASA Goddard
Space Flight Center) "The Challenge of Testing Innovative Science
Software on a Rapidly Evolving Production Platform (2A1)"

Mr. Plinio Vilela & Jose C. Maldonado & Mario Jino (DCA-FEEC-UNICAMP)
"Data Flow Based Testing of Programs with Pointers -- A Strategy Based
on Potential Use (2A2)"

Mr. Michael E. Peters (Digital Equipment Corporation) "Managing Test
Automation: Reining in the Chaos of a Multiple Platform Test
Environment (3A1)"

Mr. Danny R. Faught (Hewlett Packard Company) "Experiences with OS
Reliability Testing on the Exemplar System (3A2)"

Mr. Taghi Khoshgoftaar (Florida Atlantic University) "Identifying
Fault-Prone Modules: A Case Study (4A1)"

Mr. Ram Chillarege (IBM) "Orthogonal Defect Classification (4A2)"

MANAGEMENT TRACK...

Mr. Kerry Zallar (Pacific Bell) "An Approach to Functional Test
Automation (2M1)"

Mr. John Hedstrom & Mr. Dennis J. Frailey (Texas Instruments) "Mastering
the Hidden Cost of Software Rework (2M2)"

Mr. Lech Krzanik & Dr. Jouni Simila (CCC Software Professionals Oy)
"Incremental Software Process Improvement Under "Smallest Useful
Deliverable" (3M1)"

Mr. Don O'Neill "National Software Quality Experiment, A Lesson in
Measurement (3M2)"

Mr. Paul Taylor (Fujitsu Software Corporation) "Workware & ISO 9000
(4M1)"

Ms. Ana Andres (European Software Institute) "ISO-9000 Certification as
a Business Driver: The SPICE (4M2)"

QUICK-START TRACK MINI-TUTORIALS...

Mr. Tom Drake (NSA Software Engineering Center) "Managing Software
Quality -- How to Avoid Disaster and Achieve Success (2Q)"

Mr. Tom Gilb (Independent Consultant) "Making Contracts Testable (3Q)"

Mr. James Bach (STL) "How to be an Expert Tester (4Q)"

SPECIAL PANEL SESSION (5:00 to 6:00)...

Mr. Brian Lawrence (Chair), Cem Kaner, Tom Arnold, Doug Hoffman (Falk &
Nguyen) "Panel Session: Improving the Maintenance of Automated Test
Suites (4P)"

    o       o       o       o       o       o       o       o

       Thursday, 29 May 1997, 8:30 - 12:00 -- KEYNOTE SESSION #2

Mr. Lawrence Bernstein (National Software Council) "Software Dynamics:
Planning for the Next Century (5P1)"

Prof. Dick Hamlet (Portland State University) "Keeping The "Engineering"
In Software Engineering (5P2)"

    o       o       o       o       o       o       o       o

                  Thursday, 29 May 1997, 10:00 - 5:00
                       Parallel Technical Tracks

TECHNOLOGY TRACK...

Prof. Lee J. White & Khalil Abdullah (Case Western Reserve University)
"A Firewall Approach for the Regression Testing of Object-Oriented
Software (6T1)"

Dr. Daniel Jackson (Carnegie Mellon University) "Automatic Analysis of
Object Models (6T2)"

Ms. Dolores R. Wallace & Mr. Herbert Hecht (National Institute Of
Standards & Technology) "Error Fault and Failure Data Collection and
Analysis (7T1)"

Prof. William Howden (University of San Diego) "Partial Statistical Test
Coverage and Abstract Testing (7T2)"

Mr. Huey-Der Chu & Prof. John E. Dobson (University of Newcastle upon
Tyne) "An Integrated Test Environment for Distributed Applications
(8T1)"

Mr. Fraser Macdonald & James Miller (University of Strathclyde
Department of Computer Science) "Automated Generic Support for Software
Inspection (8T2)"

APPLICATIONS TRACK...

Mr. Otto Vinter (Bruel & Kjaer) "How to Apply Static and Dynamic
Analysis in Practice (6A1)"

Mr. Joe Maybee (Tektronix, Inc.)  "Mother2 and MOSS: Automated Test
Generation from Real-Time Requirements (6A2)"

Mr. Peter Middleton & Mr. Colm Dougan (Queen's University Belfast) "Grey
Box Testing C++ via the Internet (7A1)"

Mr. Sankar L. Chakrabarti & Mr. Harry Robinson (Hewlett Packard
Company) "Catching Bugs in the Web: Using the World Wide Web to Detect
Software Localization Defects (7A2)"

Mr. Rob Oshana (Texas Instruments) "Improving a System Regression Test
With a Statistical Approach Implementing Usage Models Developed Using
Field Collected Data (8A1)"

Mr. Larry Apfelbaum (Teradyne Software & Systems Test) "Model Based
Testing (8A2)"

MANAGEMENT TRACK...

Mr. Andrew A. Gerb (Space Telescope Science Institute) "Delivering
Quality Software in Twenty-Four Hours (6M1)"

Mr. Brian G. Hermann & Mr. Amritt Goel (U.S. Air Force) "Software
Maturity Evaluation: Can We Predict When Software Will be Ready for
Fielding? (6M2)"

Ms. Johanna Rothman (Rothman Consulting Group) "Is your Investment in
Quality and Process Improvement Paying Off?? (7M1)"

Ms. Marilyn Bush (Xerox Corporation) "What is an SEPG's Role in the CMM?
(7M2)"

Mr. G Thomas (Vienna University of Technology) "On Quality Improvement
in "Heroic" Projects (8M1)"

Mr. Ian Wells (Hewlett Packard) "Hewlett Packard Fortran 90 compiler
project case study (8M2)"

QUICK-START TRACK MINI-TUTORIALS...

Mr. Brian Marick (Testing Foundations) "The Test Manager at The Project
Status Meeting (6Q)"

Mr. Robert M. Schultz (Motorola) "Test Coverage Analysis (7Q)"

SPECIAL PANEL SESSION

"How Does Java Address Software Quality: A Special Panel Session (8P)"

    o       o       o       o       o       o       o       o

                   Friday, 30 May 1997, 8:30 - 10:00
                       Parallel Technical Tracks

TECHNOLOGY TRACK...

Mr. Marc Roper (Strathclyde University) "Computer Aided Software Testing
Using Genetic Algorithms (9T1)"

Mr. Istvan Forgacs & Eva Takacs (Computer and Automation Institute)
"Mutation-based Regression Testing (9T2)"

APPLICATIONS TRACK...

Ms. Deborah MacCallum (Bay Networks) "Test Automation Solutions for
Complex Internetworking Products (9A1)"

Mr. Albrecht Zeh & Paul Bininda (Sekas GmbH) "Quality Improvement by
Automatic Generation and Management of Test Cases (9A2)"

MANAGEMENT TRACK...

Mr. Nick Borelli (Microsoft) "Tuning your Test Group: Some Tips & Tricks
(9M1)"

Mr. Dave Duchesneau (The Boeing Company) "Guerrilla SQA (9M2)"

QUICK-START TRACK MINI-TUTORIALS...

Mr. Craig Kaplan (I.Q. Company) "Secrets of Software Quality (9Q)"

    o       o       o       o       o       o       o       o

   Friday, 30 May 1997, 10:30 - 12:30 -- CLOSING KEYNOTE SESSION #3

Prof. Lori A. Clarke (University of Massachusetts) "Gaining Confidence
in Distributed Systems (10P1)"

Dr. Boris Beizer (Independent Consultant) "Y2K Testing: The Truth of the
Matter (10P2)"

    o       o       o       o       o       o       o       o

                  CONFERENCE REGISTRATION INFORMATION

Complete registration with full information about the conference is
available on the WWW at

                     <http://www.soft.com/QualWeek>

where you can register on-line.

We will be pleased to send you a QW'97 registration package by E-mail,
postal mail or FAX on request.  Send your E-mail requests to:

                              qw@soft.com

or FAX or phone your request to SR/Institute at the numbers below.

         QW'97: 27-30 May 1997, San Francisco, California  USA

+-----------------------------------+----------------------------------+
| Quality Week '97 Registration     | Phone:       [+1] (415) 550-3020 |
| SR/Institute, Inc.                | Toll Free:        1-800-942-SOFT |
| 901 Minnesota Street              | FAX:         [+1] (415) 550-3030 |
| San Francisco, CA 94107 USA       | E-Mail:              qw@soft.com |
|                                   | WWW:         http://www.soft.com |
+-----------------------------------+----------------------------------+

========================================================================

                            TEST PLANNING -
                       ENTERPRISE WIDE YEAR 2000

                           by Larry W. Cooke

Prologue: I feel somewhat compelled to deliver a message I have been
stressing for the last 15 years - testing is iterative. Develop plans
and processes top to bottom that enable on-going reuse.  Embrace Year
2000 as an opportunity to build equity in your organization's test
methodology and infrastructure.

Project Planning:

The code modifications made to support date compliance are not complex,
yet the management of the various application-specific modifications and
interface validations to certify enterprise wide compliance requires
significant planning and orchestration.  The "chicken or egg" principle
exacerbates the complexity of creating an effective planning document.
Central planners need input from the various application groups, but the
individual application groups want enterprise direction before they make
commitments.

No, I don't have a solution to the infamous chicken or egg conundrum,
but my experience suggests not worrying about how the chicken got there,
because it's there.  In this case the chicken is the year 2000, and for
organizations planning to do business in the next millennium, their
systems must be Y2K compliant.  And unless an organization has managed
to maintain autonomy between all applications, a top down strategy is
essential to establishing the cooperation and synergy necessary for
enterprise wide success.

Your enterprise Y2K plans and objectives should provide the impetus for
your testing objectives and corresponding strategies.  These enterprise
renovation strategies vary widely, reflecting differences in
organizational structure, risk of change, mix of internal versus
purchased applications, timeliness of project startup, age of systems,
size of the corporation, technical maturity of information systems,
imminent plans for major changes to legacy applications, and resource
availability.

These strategies should dictate responsibilities for renovating and
testing "home grown" application code; internal date standards; vendor
code change tracking and acceptance; acquisition and use of tool suites
and utilities; system interface development and compliance; change
management and change control processes; etc.

Test Planning Considerations

Enterprise wide testing requirements associated with Y2K projects or
other large legacy system renovations require effective management and
control of interface test data to ensure cohesive and meaningful test
results. This article discusses the planning considerations and controls
required for successful integrated systems testing.

So, where do I want to go?

Year 2000 testing has been completed.  Our results:

 * All interfacing applications have demonstrated successful processing
   of date sensitive information, with no loss of internal functionality
   or interface integrity.

 * Date critical functionality has been explicitly defined and evidence
   of Y2K compliance is traceable to these definitions.

 * Test development expense has been minimized by reusing test data
   across the system and integrated systems test phases.

 * Year 2000 (Y2K) compliance has been demonstrated for current century
   processing, century cross-over processing, and post 2000 processing.

 * All testing has been completed on time and within budget.

 * Test processes, test data and disciplines used during the Y2K project
   have been documented and retained for continued application
   verification into the next millennium.

I submit that these results cannot be achieved without:

 * A strategic plan providing a top down picture that communicates to
   the enterprise how this is to occur

 * Test Tools to support management, communication and execution
   processes

 * Synchronized test calendar to facilitate contextual data uniformity

 * Enterprise commitment to building equity from Y2K

Master Test Plan

The Master Test Plan (MTP) provides the overall guidelines for the
testing effort to be performed.  The MTP is a strategic document
illuminating the enterprise wide testing objectives and the approaches
to be employed to ensure timely, quality implementation of Y2K
modifications.  The MTP contributes critical information necessary to
achieving synchronization of effort across the entire organization.
These contributions include:

 * High level strategic definitions of staged testing processes to
   verify date related system modifications; test environment
   requirements to support interface validation; structured processes
   and controls that enable defect detection, reduce iterative
   development, and direct the storage and retention of incremental
   test data; the communication infrastructure necessary to ensure
   timely communications; and the strategic support information and
   decision processes necessary to the formulation of detail test plans.

 * Documentation of assumptions pertinent to conversion processes and
   schedules; resource requirements and availability; identification of
   known tools to be used during the project and a definition of their
   impact - test management, test drivers, extract routines, system date
   manipulation tools, database manipulation tools and comparison
   utilities; descriptions of any unique requirements or
   interrelationships that impose constraints requiring creative
   solutions; and descriptions of any hardware or system software
   implementations upon which the project's success depends.

 * Strategic extrapolations including conversion and database
   configuration, partnerships and integration within and outside the
   project; definition of system and personnel resources; test tool and
   test utility requirements;  test data generation aids, and
   identification of external resources.

 * Test phase definitions, test condition development processes, script
   (detail test transaction) development processes; definition of
   quality assurance processes; and any other testing efforts to be
   applied - operations, performance, security, procedure, etc.
   (generally not applicable to most Y2K projects).

 * An overall schedule and time-line of testing activities, so that all
   participants know when their support is required.

Test Calendar

In most larger corporations there are numerous applications with varying
date related requirements. Some applications are on-line with daily
batch processing functions for creating bills, merging database record
updates from related systems, producing reports, etc. Certain dates such
as the day of week, mid-month, month end, quarter end, year end, billing
cycle dates, first business day after a holiday, mid-week holiday,
holidays following a weekend, etc. have a significant impact on the
functionality of an application.

To execute the code or process an interface file, it may be necessary to
schedule tests wherein the processing date is set for these unique
dates.  The test permutations are driven by these date related events.
The test calendar reflects the required dates for the system, interface,
or systems integration tests to be performed.
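
By way of illustration, the date-driven permutations described above can
be enumerated in a few lines of Python.  This is only a minimal sketch;
the boundary dates listed are typical Y2K cases chosen as assumptions,
not a prescribed calendar.

    # Sketch: enumerate date-driven permutations for a Y2K test calendar.
    # The specific dates below are illustrative assumptions.
    from datetime import date, timedelta

    critical_dates = [
        date(1999, 12, 31),  # last business day of the old century
        date(2000, 1, 1),    # century cross-over
        date(2000, 1, 3),    # first business day after the cross-over
        date(2000, 2, 29),   # leap day (2000 is a leap year)
        date(2000, 3, 31),   # first quarter end after the cross-over
        date(2000, 12, 31),  # first year end in the new century
    ]

    for d in critical_dates:
        # Pair each processing date with its follow-on day so interface
        # feeds and after-batch validations can be checked together.
        print(d.isoformat(), "->", (d + timedelta(days=1)).isoformat())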

The test calendar is composed of the logical processing days to be
included in the test.  A clear definition of this processing event time
segment is essential to creating clear test data requirements,
establishing effective controls, and defining meaningful verification
checkpoints.  A logical test day need not be defined the same way in
test as a production processing day.  A processing day might be defined
in test as:

        Online Input Processing
        Batch Interface Feeds
        Batch Processing
        After Batch On-line validations
        Report output and interface file validations

Although in production the on-line file date might be the subsequent
business day, in test we would want to complete the verifications
associated with the previous on-line input before entering additional
update transactions.  This avoids the need to backtrack if verification
errors are identified.  Checkpoints established throughout the logical
test day help ensure that all data required by each participating
system is properly included in the test.  There are test management
tools, such as Testmasters' Test Control System, that aid in developing
the test data and in orchestrating test day execution and validation.
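
The stage-and-checkpoint idea can be pictured with a short sketch; the
stage names mirror the list above, and the verification step is a
hypothetical placeholder rather than a real tool interface.

    # Sketch: drive one logical test day as a sequence of checkpoints.
    # The verify() callable is a placeholder assumption.
    STAGES = [
        "Online Input Processing",
        "Batch Interface Feeds",
        "Batch Processing",
        "After Batch On-line validations",
        "Report output and interface file validations",
    ]

    def run_logical_test_day(verify):
        # Run each stage in order and stop at the first failed checkpoint,
        # so errors are caught before further update transactions are
        # entered.
        for stage in STAGES:
            if not verify(stage):
                raise RuntimeError("Checkpoint failed at stage: " + stage)
            print("Checkpoint passed:", stage)

    # Example usage with a trivial verifier that always passes:
    run_logical_test_day(lambda stage: True)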

Another critical aspect of logical test day planning and execution is
establishing common relational data elements across systems.  To invoke
the required program logic, interfacing systems must synchronize the
data relationships used in the test.  This becomes increasingly
important as the number of interfaces grows.

If you check the URL "http://www.testtools.com" you can download a
Master Test Plan template that will provide additional strategic
considerations for Year 2000.  The MTP provides the road map for all
other detail test plans, facilitating the uniformity that will enable
your testing efforts to be iterative.

Detail system, integrated systems and user acceptance test plans are
also essential to effective testing.  If the MTP has provided the
essential strategic direction, these staged test plans will foster
compatibility, expediting your testing efforts.

Managing large system integration tests is a major undertaking.  The
test manager's role is akin to that of a symphony conductor; to achieve
melodious harmony, each instrument must come in on cue.  Create your
vision through effective planning and I am confident you will make
beautiful music.

========================================================================

           A PURELY FRENCH SOLUTION TO THE YEAR 2000 PROBLEM

                   Paris, Tuesday, 1st of April 1997

The French Ministry of Informatics (MOI) today announced that they have
determined that French computer systems will not be affected by the year
2000 problem.  An extensive series of tests have been run on a wide
range of applications within the country and on no system has a Y2K
problem been apparent.

A spokesman put this good fortune down to a side-effect of the French
number system.  In this system the number eighty is represented by the
composite "quatre vingts" -- literally "four twenties."  French computer
systems represent the "quatre" as a single digit and will harmlessly
roll over to "cinq vingts" or "five twenties" while the rest of the
world collapses.  Thus, "quatre vingts dix neuf" will increment to "cinq
vingts."

French speaking areas of Belgium and Switzerland are bemused by these
developments, because they still use the older "septante, octante, nonante"
system for 70, 80, and 90.  The Belgian government is thought to be
considering an urgent change in the language.  This would provide a
major boost for the less prosperous French speaking part of the country
when computer systems are relocated to French speaking communes.

Microsoft has announced that it will use similar techniques to guarantee
that PCs will not suffer from such problems, by launching a new version
of their operating system.  "Windows ninety ten" is expected to be
available in the year 2002.

========================================================================

                  SOFTWARE QUALITY HOTLIST(tm) UPDATED

Workers in the software quality area are invited to take advantage of
the Software Quality HotList(tm), with more than 550 links to the
world's main technical resources in software quality and software
testing.  Feedback has been extremely positive and many are getting
good use of this compendium.

The URL for the Software Quality HotList is:

        http://www.soft.com/Institute/HotList

Many new links have been added in the past three weeks, largely as the
result of readers' suggestions.  Word of additional hotlinks, plus any
corrections or modifications to hotlinks already on the lists, is
welcome at "info@soft.com".

========================================================================

                SETTING UP A SOFTWARE INSPECTION PROGRAM

                              Don O'Neill
                         Independent Consultant

ABSTRACT:  Software inspections deliver an attractive return on
investment and benefits in cost, schedule, quality, and customer
satisfaction.  As a reasoning activity among peers, software inspections
conduct strict and close examinations of specifications, designs, code,
and test procedures, and serve as exit criteria for the activities of
the lifecycle.  This article describes the benefits of software
inspections and the costs and methods of rolling them out.


Software inspections are a disciplined best engineering practice to
detect software defects and prevent their leakage into field operations.
They were introduced at IBM in the 1970s by Michael Fagan, who pioneered
their early adoption and later evolution [1,2].  Software inspections
provide value by improving reliability, availability, and
maintainability.

The adoption of software inspections practice is competency enhancing
and meets little resistance among practitioners.  The adopting
organization benefits from improved predictability in cost and schedule
performance, reduced defects in the field, increased customer
satisfaction, and improved morale among practitioners.

The return on investment for software inspections is defined as net
savings divided by detection cost [3], where net savings is cost
avoidance minus the cost to repair now.  Detection cost is the cost of
preparation effort and the cost of conduct effort (see Costs and
Limitations below).

Savings result from early detection and correction, which avoids the
increased costs that come with the detection and correction of defects
later in the lifecycle.  Defects that escape detection and leak to the
next phase will likely cost at least 10 times more to detect and correct
than if they are found and fixed in the phase in which they originate.
In fact, IBM Rochester, the 1990 winner of the Malcolm Baldrige Award,
reported that defects leaking from code to test cost nine times more to
detect and correct, and defects leaking from test to the field cost 13
times more.

A code defect that leaks into testing may require multiple test
executions to confirm the error and additional executions to obtain
debug information.  Once a leaked defect has been detected, the
producing programmer must put aside the task at hand and refocus
attention on correcting the defect and confirming the correction, then
return to the task at hand.

The National Software Quality Experiment [4] reveals that the return on
investment (net savings divided by detection cost) for software
inspections ranges from four to eight times, independent of usage
context.
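
To make the arithmetic concrete, here is a minimal sketch of the ROI
formula as defined above.  The hour figures plugged in are illustrative
assumptions, not measured values from the experiment.

    # Sketch: ROI = net savings / detection cost, where
    # net savings = cost avoidance - cost to repair now.
    # All hour figures below are illustrative assumptions.
    def inspection_roi(defects_found, repair_now_hours, repair_later_hours,
                       preparation_hours, conduct_hours):
        cost_avoidance = defects_found * repair_later_hours
        repair_now = defects_found * repair_now_hours
        net_savings = cost_avoidance - repair_now
        detection_cost = preparation_hours + conduct_hours
        return net_savings / detection_cost

    # Example: 8 defects found; 1 hour to fix now vs. 10 hours if found
    # later; 10 hours of preparation and 10 hours of conduct in total.
    print(inspection_roi(8, 1, 10, 10, 10))   # -> 3.6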

Usage Considerations

Although software inspections originated and evolved in new development,
their usefulness in maintenance is now well established.  Certain
measurements obtained during software inspections reflect this context
of use.  For example, the lines of code inspected per conduct hour range
from 250 to 500 for new development and from 1,000 to 1,500 for
maintenance [4].

Organizations that adopt software inspection practices seek to prevent
defect leakage.  Immediately after proper training, organizations can
expect to detect 50 percent of the defects present.  But within 12 to 18
months, an organization can likely achieve expert practice in which
defect detection may range from 60 percent to 90 percent.  Organizations
with mature software inspections practices have achieved these rates.
IBM reported 83 percent and American Telephone and Telegraph Co.
reported 92 percent for defect detection resulting from software
inspections practice [5].

Maturity

The technology of software inspections has enjoyed long-term, widespread
use in a variety of usage domains and has continuously evolved for more
than 25 years.  Software inspections are known to add economic value.

Software inspections are a rigorous form of peer reviews, a key process
area of the Software Engineering Institute's Capability Maturity Model
[6,7].  Peer reviews and other activities that utilize software
inspection practices are assigned to Level 3 in the software process
maturity framework.  Organizations at any process maturity level should
seek these benefits.  Successful software inspections adopters can be
found in Level 1 to Level 5 organizations.  Early adoption of software
inspection practices also stimulates the improvements needed to achieve
Level 3, with additional benefits such as cross pollination of ideas,
ability to work together, and team building.

Costs and Limitations

The roll out and operating costs associated with software inspections
include

o  initial training of practitioners and managers.
o  ongoing preparation and conduct of inspection sessions.
o  ongoing management and use of measurement data for defect prevention.

To properly adopt software inspections practices, each participant is
trained in the structured review process, defined roles of participants,
system of process and product checklists, and forms and reports.  Each
practitioner needs roughly 12 hours of instruction to acquire the
required knowledge, skills, and behaviors [5].  In addition, each
manager must learn the responsibilities for rolling out the technology
and how to interpret and use the measurements.  This management training
can be accomplished in about four hours.

The cost to perform a software inspection includes the pre-session
preparation time for each participant and the time spent during the
session.  Typically, five people participate, with each spending one to
two hours in preparation and one to two hours meeting.  The cost of 10
to 20 hours of total effort per session typically results in the early
detection of five to 10 defects in 250 to 500 lines of new development
code or 1,000 to 1,500 lines of legacy code [5].  Although the
preparation and conduct effort is comparable for artifacts other than
code, the size of the artifact being inspected varies widely, i.e., five
to 50 pages.  Although the cost of the follow-up activities to correct
defects is an important management measurement, it is useful to
distinguish the cost of repair from the cost of detection.

Three steps are involved in the management and use of measurement data:

o  The organization creates a software inspections database structure.
o  Measurement results are entered, populating the database structure.
o  Operations on the measurement database are used to generate derived
   metrics in the form of reports and graphs.

It takes approximately two person-months to establish a database
structure and to produce the basic user macros that operate on the data.
The cost to populate the database with measured results is included in
the above stated cost to perform software inspections; the recorder for
each inspection session enters session data into the software
inspections database.  Each month, roughly one person-day is required to
operate the macros that generate reports and graphs on the derived
metrics.
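
As a minimal illustration of the three steps above (create a structure,
populate it, and derive metrics from it), the sketch below uses an
in-memory record type and one derived metric; the field names and the
metric chosen are assumptions, not the experiment's actual schema.

    # Sketch: a tiny inspections "database" and one derived metric.
    # Field names and the metric are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class InspectionSession:
        artifact: str
        lines_inspected: int
        defects_found: int
        preparation_hours: float
        conduct_hours: float

    sessions = [
        InspectionSession("billing.c", 400, 7, 6.0, 5.0),
        InspectionSession("ledger.c", 1200, 9, 8.0, 6.0),
    ]

    def defects_per_ksloc(db):
        # Derived metric: defects detected per thousand lines inspected.
        lines = sum(s.lines_inspected for s in db)
        defects = sum(s.defects_found for s in db)
        return 1000.0 * defects / lines

    print(round(defects_per_ksloc(sessions), 1))   # -> 10.0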

Dependencies

To obtain the full benefits of software inspections, a defined process
for software product engineering must already be in place.  This permits
software inspections to be used in statistical process control.  In this
context, software inspections provide the exit criteria for each
lifecycle activity.  Furthermore, the completion criteria for each type
of artifact are specified and used in practice.

Alternatives

Software inspections are a rigorous form of peer reviews; the less
rigorous form is called a software walk-through.  Walk-throughs can be
introduced into less mature processes, since the results of these
producer-led reviews are not recorded.  They are a type of prelude to
the statistical process control practices needed to advance software
process maturity.  Unfortunately, walk-throughs deliver fewer results
than inspections, yet they can cost just as much.

Complementary Technologies

To optimize the practice of software inspections on legacy code during
maintenance operations, all modules are rank ordered according to
cyclomatic complexity [8].  Candidates for inspection are selected from
those with the highest complexity rating, where the defect density is
expected to be high.

This legacy code maintenance strategy can be extended by rank ordering
all modules based upon defects encountered in the past year and by rank
ordering the modules that are expected to be adapted and perfected in
the coming year.  Modules for inspection are then selected based on
their rank ordering in cyclomatic complexity, defect history, and
expected rework.
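
A minimal sketch of this selection strategy follows.  The module data
and the equal-weight combined score are assumptions made for
illustration; the strategy above describes separate rank orderings
rather than a single weighted score.

    # Sketch: pick inspection candidates from legacy modules using
    # cyclomatic complexity, recent defect history, and expected rework.
    # The data and the equal-weight score are illustrative assumptions.
    modules = {
        # name: (cyclomatic complexity, defects last year, expected rework)
        "billing.c": (42, 11, 5),
        "ledger.c":  (17,  2, 1),
        "invoice.c": (30,  6, 8),
    }

    def inspection_priority(stats):
        complexity, defects, rework = stats
        return complexity + defects + rework

    ranked = sorted(modules, key=lambda m: inspection_priority(modules[m]),
                    reverse=True)
    print(ranked)   # highest combined rating is inspected first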

Technical Detail

Software inspections are strict, close examinations that are conducted
on specification, design, code, test, and other artifacts [9].  Leading
software indicators of excellence for each artifact type provide the
exit criteria for the activities of the software lifecycle [10, 11].
Detecting defects early and preventing their leakage into subsequent
activities eliminates the higher cost of later detection and rework,
which is essential for reduced cycle time and lower cost.

Software inspections are composed of four elements:

1. The structured review process is a systematic procedure that is
   integrated with the activities of the lifecycle model.  The process
   is composed of planning, preparation, entry criteria, conduct, exit
   criteria, reporting, and follow-up.
2. Defined roles of participants.  Software inspections are a review
   activity performed by peers who play the defined roles of moderator,
   recorder, reviewer, reader, and producer.  Each role consists of the
   behaviors, skills, and knowledge needed to achieve expertise in
   software inspections [12].
3. A system of checklists governs each step in the structured review
   process and in the review of the product, objective by objective.
   Process checklists are used to guide each activity in the structured
   review process.  Product checklists contain strongly preferred
   indicators that establish completion criteria for the organization's
   software products.  For example, these indicators include

   o  Completeness - based on traceability of the requirements to the
      code, which is essential for maintainability.
   o  Correctness - based on reasoning about the logic and relationships
      of data and operations, from the clear specification of intended
      function and its faithful elaboration in code; essential for
      reliability and availability [13].
   o  Style - based on consistency of recording, essential for
      maintainability.
4. Forms and reports provide uniformity in recording issues at all
   software inspections, reporting the results to management, and
   building a database useful in process management.

Conclusion

Organizations that aspire to deliver software products that produce the
right answers on time every time will benefit from software inspections.
Organizations that wish to manage commitments and perform predictably
will find that software inspections help eliminate the chaotic impact of
software defects encountered during testing and field operations.
Organizations that wish to establish a culture of fact-based software
management will find that software inspections supply important
measurements and metrics.  Although organizational commitment is needed
to initiate the roll out of software inspections, the measured benefits
they deliver provide the fuel for their sustained use.

About the Author

Don O'Neill is an experienced software engineering manager and
technologist currently serving as an independent consultant.  Following
27 years with IBM's Federal Systems Division, he completed a three-year
residency at Carnegie Mellon University's Software Engineering Institute
under IBM's Technical Academic Career Program.  As an independent
consultant, he conducts defined programs to manage strategic software
improvement, including implementing organizational software inspections
processes, implementing software risk management, and conducting
software competitiveness assessments.  He served on the executive board
of the Institute of Electrical and Electronics Engineers (IEEE) Software
Engineering Technical Committee and as a distinguished visitor of the
IEEE.  He is a founding member of the National Software Council and the
Washington, D.C. Software Process Improvement Network.

Don O'Neill
Independent Consultant
9305 Kobe Way
Gaithersburg, MD 20879

Voice: 301-990-0377
Fax: 301-670-0234
E-mail: ONeillDon@aol.com

References

 1. Fagan, M., "Design and Code Inspections to Reduce Errors in Program
    Development," IBM Systems Journal, Vol. 15, No. 3, 1976, pp. 182-
    211.
 2. Fagan, M., "Advances in Software Inspections," IEEE Transactions on
    Software Engineering, Vol. 12, No.7, 1987.
 3. O'Neill, Don, Peer Reviews Key Process Area Handbook, 1996.
 4. O'Neill, Don, "National Software Quality Experiment: Results 1992-
    1995," Proceedings of the Software Technology Conference, Salt Lake
    City, 1995 and 1996.
 5. O'Neill, Don, Software Inspections Course and Lab, Software
    Engineering Institute, 1989.
 6. Paulk, Mark C., The Capability Maturity Model: Guidelines for
    Improving the Software Process, Addison-Wesley, Reading, Mass.,
    1995.
 7. Humphrey, Watts S., Managing the Software Process, Addison-Wesley,
    1989.
 8. McCabe, Thomas J. and Arthur H. Watson, "Software Complexity,"
    Crosstalk, STSC, Hill AFB, Utah, December 1994, pp. 5-9.
 9. Ebenau, Robert G. and Susan H. Strauss, Software Inspection Process,
    McGraw-Hill, 1994.
10. O'Neill, Don and Albert L. Ingram, "Software Inspections Tutorial,"
    Software Engineering Institute Technical Review, 1988.
11. O'Neill, Don, "Software Inspections: More Than a Hunt for Errors,"
    Crosstalk, STSC, Hill AFB, Utah, January 1992.
12. Freedman, D.P., G.M. Weinberg, Handbook of Walkthroughs,
    Inspections, and Technical Reviews, Dorset House Publishing Co.,
    Inc., 1990.
13. Linger, R.C., H.D. Mills, and B.I. Witt, Structured Programming:
    Theory and Practice, Addison-Wesley, Reading, Mass., 1979.

Editor's Note: This article originally appeared in CrossTalk, published
by the USAF Software Technology Support Center (STSC), Hill AFB, Utah.

========================================================================

                  TestWorks INTERNATIONAL DISTRIBUTORS

If you reside in Europe you have the option of contacting one of our
European distributors for information and support about TestWorks.

BENELUX (Netherlands, Belgium, Luxembourg):

Mr. Herbert Weustenenk                  Phone: +[31] (20) 695-5511
PVI Precision Software B.V.               FAX: +[31] (20) 695-1761
Nienoord 2ot 45                         Email: 100334.315@compuserve.com
1112 KG Diemen NETHERLANDS

BELGIUM:

Mr. Jaques Dewulf                       Phone: +[32] (9) 221.03.83
Marketing Manager                         FAX: +[32] (9) 220.31.91
E2S n.v. Software Engineering           Email: jdw@e2s.be
Technologiepark 5
Zwijnaarde BELGIUM B-9052

FRANCE:

Mr. Gilles Paulot                       Phone: +[33] (1) 47.69.58.03
Precision Software France                 FAX: +[33] (1) 47.69.58.02
146, Boulevard de Valmy                 Email: 100321.2436@compuserve.com
92700 Colombes FRANCE

GERMANY, AUSTRIA, SWITZERLAND

Dr. Gottfried Horlacher                 Phone: +[49] (0) 6103-5847-0
Precision Software GmbH                   FAX: +[49] (0) 6103-936167
Robert-Bosch Str. 5 C                   Email: gottfried@precision.DE
D-63303 Dreieich GERMANY                  WWW: http://www.precision.DE

ITALY

Mr. Michele Giordano                    Phone: +[39] (11) 225-2211
SLIGOS S.p.A.                             FAX: +[39] (11) 220-2662
Via Vaninetti, 27                       Email: giordano@sligos.IT
10148 Torino, ITALY

Mr. Alberto Ciriani                     Phone: +[39] (2) 69.00.72.20
MetriQs, s.r.l.                           FAX: +[39] (2) 69.00.27.67
Via Alserio, 22                         Email: aciriani@metriqs.com
20159 Milano, ITALY

Dr. Antonio Serra                       Phone: + [39] 11 437 1755
ASIC s.r.l.                               FAX: + [39] 11 437 1916
V. Stefano Clemente 6                   Email: asic@ns.sinet.IT
10143 Torino, ITALY

SCANDINAVIA (Norway, Sweden, Denmark, Finland):

Mr. Bo Larson                           Phone: + [46] 8-705-7600
Memory Data AB                            FAX: + [46] 8-705-7601
Dalvagen 16                             Email: bla@memory.SE
Box 1467                                  WWW: http://www.memory.se
S-17128 Solna, SWEDEN

SPAIN:

Mr. Juan Ferrer                         Phone: + [34] 1-804-1100
SIP                                       FAX: + [34] 1-804-1400
Ronda de Poniente, 6                    Email: jmferrer@sip.es
Madrid, SPAIN  28760

UNITED KINGDOM (England, Scotland, N. Ireland):

Dr. Alan Hall                           Phone: + [44] 1 (293) 403.636
Scientific Computers, Ltd.                FAX: + [44] 1 (293) 403.641
3 Premiere House, Betts Way             Email: alan@scl.com
London Road, Crawley, West Sussex         WWW: http://www.scl.com
ENGLAND RH10 2GB

========================================================================

             EVALUATING TTN-ONLINE:  GIVE US YOUR COMMENTS

TTN-Online is free and aims to be of service to the larger software
quality and testing community.  To better our efforts we need YOUR
FEEDBACK!

Please take a minute and E-mail us your thoughts about TTN-Online.

Is there enough technical content?

Are there too many or too few paper calls and conference announcements?

Is there not enough current-events information? Too much?

What changes to TTN-Online would you like to see?

We thrive on feedback and appreciate any comments you have.  Simply
address your remarks by E-mail to "ttn@soft.com".

========================================================================

          TTN Online Edition -- Mailing List Policy Statement

Some subscribers have asked us to prepare a short statement outlining
our policy on use of E-mail addresses of TTN-Online subscribers.  This
issue, and several other related issues about TTN-Online, are available
in our "Mailing List Policy" statement.  For a copy, send E-mail to
ttn@soft.com and include the word "policy" in the body of the E-mail.

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is E-mailed on the 15th of each month to
subscribers worldwide.  To have your event listed in an upcoming issue,
E-mail a complete description and full details of your Call for Papers
or Call for Participation to "ttn@soft.com".

TTN On-Line's submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted non-calendar items should not exceed 350 lines
   (about four pages).  Longer articles are OK and may be serialized.
o  Length of submitted calendar items should not exceed 60 lines (one
   page).
o  Publication of submitted items is determined by Software Research,
   Inc.; submissions may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters and TTN-Online disclaims any responsibility for their
content.

TRADEMARKS:  STW, TestWorks, CAPBAK, SMARTS, EXDIFF, Xdemo, Xvirtual,
Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, TCAT-PATH, T-
SCOPE and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To SUBSCRIBE to TTN-ONLINE, to CANCEL a current subscription, to CHANGE
an address (a CANCEL and a SUBSCRIBE combined) or to submit or propose
an article, send E-mail to "ttn@soft.com".

TO SUBSCRIBE: Include in the body of your letter the phrase "subscribe
".

TO UNSUBSCRIBE: Include in the body of your letter the phrase
"unsubscribe ".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                            901 Minnesota Street
                   San Francisco, CA  94107 USA

                   Phone:          +1 (415) 550-3020
                   Toll Free:      +1 (800) 942-SOFT (USA Only)
                   FAX:            +1 (415) 550-3030
                   E-mail:         ttn@soft.com
                   WWW URL:        http://www.soft.com

                               ## End ##