sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======              July 1997              =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), Online Edition, is E-mailed monthly
to support the Software Research, Inc. (SR)/TestWorks user community and
to provide information of general use to the worldwide software quality
community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of TTN-Online provided that the
entire document/file is kept intact and this complete copyright notice
appears with it in all copies.  (c) Copyright 1997 by Software Research,
Inc.

========================================================================

INSIDE THIS ISSUE:

   o  Nightmare of the Month Club:  A Winner (submitted by Ken McDonnell
      from Melbourne, Australia)

   o  Call for Participation: Quality Week Europe (QWE'97) [4-7
      November 1997]

   o  Quality Grades for Software Component Source Code Packages
      (Message by Frank Ackerman, Group Chairperson)

   o  Your Best Testing Dream:  The Contest!

   o  The Quality Approach: Is It Delivering?  by Christopher Fox and
      William Frakes (Special C.ACM Reprint)

   o  SR In Software Magazine Top 500 List

   o  TestWorks Applied to the Y2K Problem

   o  Assurance and Development of Safety-Related Computer-Based Systems
      (Seminar Outline)

   o  TTN-Online -- Mailing List Policy Statement

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

                Nightmare of the Month Club:  A WINNER!

          $50 goes to Ken McDonnell from Melbourne, Australia

A compiler developer was soliciting feedback from researcher colleagues
on a new beta version of a compiler.  Availability of the latest beta
version was announced via e-mail, and positive feedback flowed in for a
couple of days.  Then there was a spate of error reports, with the
compiler aborting on several unrelated source files.

The compiler developer collected some of the source files but could not
reproduce the problem.  The bug reports stopped arriving.  A few days
later there was another burst of bug reports.

This pattern went on for several weeks.

Finally a trend emerged.  The bugs were observed only on Wednesdays,
but most users worked late into the evening, the compiler developer did
not try to reproduce the problems until the following morning, and by
Thursday the bug had gone away.

Hmm, what makes Wednesday special? ...

        Monday
        Tuesday
        Wednesday
        Thursday
        Friday
        Saturday
        Sunday

Ah!  "Wednesday" has the longest name of any weekday.  There was an
"off-by-one" error in the buffer allocation for the system's date
conversion routine.  If the user asked for a compiler listing (this was
optional), dates appeared on the page headings and the compiler
aborted -- but only when run on a Wednesday.

========================================================================

              C A L L   F O R   P A R T I C I P A T I O N

         1st Annual International Software Quality Week/Europe

                           4-7 November 1997

                   Sheraton Hotel, Brussels, Belgium

Quality Week Europe is the first European edition of a continuing series
of conferences focusing on advances in software test technology, quality
control, risk management, software safety, and test automation.
Software analysis methodologies, supported by advanced automated
software test methods, promise major advances in system quality and
reliability, assuring continued competitiveness.  QWE'97 papers are
reviewed and selected by a distinguished International Advisory Board.

The QWE'97 advisory board is 2/3 European and 1/3 USA:

   Boris Beizer, Analysis, USA
   Bill Bently, Bayer Corporation, USA
   Fevzi Belli, University of Paderborn, Germany
   Gilles Bernot, Universite d'Evry, France
   Antonia Bertolino, IEI/CNR, Italy

   Robert Binder, RBSC, Inc. USA
   Juris Borzovs, Riga University, Latvia
   Rita Bral, SR/Institute, USA
   Gunther Chrobok, DLR, Germany
   Ann Combelles, Objectif, France

   Dirk Craeynest, OFFIS nv/sa & K.U.Leuven, Belgium
   Tom Drake, BAH Consulting, USA
   John Favaro, Intecs Sistemi, Italy
   Istvan Forgacs, National Academy of Sciences, Hungary
   Mario Fusani, IEI/CNR, Italy

   Monica Gonauser, Siemens, Germany
   Hans Ludwig Hausen, GMD, Germany
   William Howden, UC/San Diego, USA
   Guenter Koch, Synologic AG, Switzerland
   Peter Liggesmeyer, Siemens, Germany

   Cecilie B. Loken, Ericsson, Norway
   Edward Miller, SR/Institute, USA (General Chair)
   John Musa, Consultant, USA
   Lee Osterweil, Univ. Mass, USA
   Tom Ostrand, Siemens Corporate Research, USA

   Antonia Serra, Metriqs, Milano, Italy
   Erik van Veenendaal, KEMA and TUE, Netherlands
   Otto Vinter, Bruel & Kjaer, Denmark
   Pierre Wolper, University of Liege, Belgium
   Jim Woodcock, Oxford University, England

The mission of the QWE'97 Conference is to increase awareness of the
importance of Software Quality and the methods used to achieve it.  It
seeks to promote Software Quality by providing technological education
and opportunities for information exchange within the software
community.

CONFERENCE THEME: Quality for the Millennium

The QWE'97 theme, Quality for the Millennium, will focus attention not
only on the year 2000 problem, but also on such important changes as the
Internet, Electronic Commerce, Client/Server and OO testing, and related
quality areas.

QWE'97 OFFERS:

The QWE'97 program consists of four days of mini-tutorials, panels,
technical papers and workshops that focus on software test automation
and new technology. QWE'97 provides the Software Testing and QA/QC
community with:

*   Quality Assurance and Test involvement in the development process
*   Exchange of experience-based information among technologists
*   State-of-the-art information on software quality test methods
*   Analysis of effectiveness through case studies
*   Vendor Technical Presentations
*   Vendor Show & Exhibits

IMPORTANT DATES:

Abstracts and Proposals Due: 30 July 1997

Notification of Participation: 29 August 1997

Camera Ready Materials Due: 19 September 1997

FINAL PAPER LENGTH:  10 - 20 pages, including Slides / View Graphs

We are soliciting 45- and 90-minute presentations or participation in
panel discussions on any area of testing and automation, including:

    New and Novel Test Methods
    Automated Inspection
    CASE/CAST Technology
    Client-Server Computing
    Cost / Schedule Estimation
    CMM
    Data Flow Testing
    Defect Tracking / Monitoring
    Function Point Testing
    GUI Test Technology
    Integrated Environments
    ISO-9000
    Load Generation & Analysis
    Multi-Threaded Systems
    Object Oriented Testing
    Process Assessment / Improvement
    Productivity and Quality Issues
    Re-Use
    Real-World Experience
    Real-Time Software
    Reliability Studies
    Risk Management
    Software Metrics in Test Planning
    Test Automation
    Test Data Generation
    Test Documentation Standards
    Test Management Automation
    Test Planning Methods
    Test Policies and Standards

SUBMISSION INFORMATION:

Abstracts should be 2 - 4 pages long, with enough detail to give
reviewers an understanding of the final paper, including a rough outline
of its contents.  Indicate whether the most likely audience is
technical, managerial, or application-oriented.

In addition, please include:
*   A cover page with the paper title, complete mailing and e-mail
    address(es), and telephone and FAX number(s) of each author.
*   A list of keywords / phrases describing the paper.
*   A brief biographical sketch of each author.

Send abstracts to:

Ms. Rita Bral
Software Research Institute
901 Minnesota Street
San Francisco, CA 94107 USA


========================================================================

      QUALITY GRADES FOR SOFTWARE COMPONENT SOURCE CODE PACKAGES
             IEEE SOFTWARE ENGINEERING STANDARD STUDY GROUP

              MESSAGE TO INTERESTED PARTIES (6 July 1997)

                          Group Chairperson:

                           A. Frank Ackerman
              (a.f.ackerman@ieee.org or ackerman@soft.com)

This message relates to the following notice that was recently posted to
a couple of newsgroups:

At its meeting in Walnut Creek, CA on June 1 the IEEE Software
Engineering Standards Committee approved a proposal to begin work on an
IEEE Software Engineering standard that would define a sequence of
QUALITY GRADES for software component source code packages.

Such a standard would define a nested sequence of characteristics for a
source code package that would permit the component to be assessed for
reliability, and the ability to maintain reliability during future
enhancements.

Work on this standard will probably take place entirely on the net.
Please contact me if (1) you would like to participate in developing
this standard, or (2) you can suggest other places that this notice
should be posted.

An initial roster of Contributing Participants for this standard has
been created and the first version of the Discussion Log emailed to that
roster.  In addition, a roster of Interested Parties has been set up.
Your email address is on that roster.

If you do nothing on receipt of this message, you will from time to
time receive brief messages on the progress of this study/working group.

Other actions you may wish to take are:

1. Reply to me to delete your name from our mailing list.

2. Reply to me that you want to be moved to the Contributing
Participants list.

3. Pass this message on to whoever may be interested in contributing to
the development of this standard, or would like to be kept informed of
progress on this standard.

4. To receive an email Information Packet on this effort reply with the
subject:

        Std Info Pkg Request

========================================================================

                 YOUR BEST TESTING DREAM: THE CONTEST!

We know there must be a million nightmarish stories about "what happened
to me when I tried to test..." out there.  But for every testing
nightmare there's probably a DREAM STORY that bests the worst nightmare
by a wide margin.  We think a lot of you would find comfort in comparing
your own dream about how well software testing went with those that
others write about.  Or, putting it another way, happiness begets
happiness!

Here is the contest in a nutshell:  Submit a short writeup (try to keep
it under 250 words) of your BEST software testing dream.  It doesn't
even have to be real.  We'll pick one or more submitted dream stories
per month and publish them in TTN-Online.

The Prize:  You get the comfort of sleeping better knowing you gave
someone else something to be happy about.  And we will send a $50 check
to the author of the dream our staff votes the best of those we receive.
Remember, you can't win the $50 unless you enter!

========================================================================

                The Quality Approach: Is It Delivering?

                                   By

                            Christopher Fox
                        James Madison University

                             William Frakes
                             Virginia Tech

      (c) 1997 Association for Computing Machinery, Inc. This
      article is reprinted from the June 1997 issue of the
      Communications of the ACM and is reprinted here by
      permission.  It is the introduction to the special issue and
      the other papers referred to are in C.ACM for June 1997.

Some companies today are reporting increases in software productivity
and quality as high as one thousand percent using quality approaches
pioneered in other industries. The goals of this special section are to
offer guidance to those interested in learning about these quality
improvement approaches in software development, and to examine the
payoff of software quality improvement efforts based on recent
experiences.  Although the concept of quality is fairly simple (see the
sidebar "Elements of the Quality Paradigm" below), the complex of
methods, tools, attitudes, and values involved in providing high quality
products and services is not. Many software professionals have found it
difficult to understand and accept approaches like Total Quality
Management (TQM), and then difficult to operationalize such approaches
in their software development environments [3]. Approaches to achieving
high quality like TQM are difficult to understand because they are not
simply toolkits, methods, or management theories, but part of a
complicated and encompassing quality paradigm. Furthermore, part of the
difficulty in operationalizing the quality paradigm for software
development is that the quality paradigm was developed for, and has had
its greatest successes in, manufacturing industries.

The concept of a paradigm was introduced by Thomas Kuhn to explain the
birth and growth of scientific disciplines [7]. Kuhn says that paradigms
are broad and often fuzzy collections of ideas, attitudes, values,
tools, techniques, theories, approaches to solving problems, and example
problem solutions, that characterize a scientific discipline and
regulate the behavior of its practitioners. Paradigms have the following
essential properties:

   - paradigms are difficult to explain, and difficult for individuals
     outside the paradigm to understand;

   - paradigms cannot be proved or disproved by a single crucial
     experiment, but must be judged by accumulated evidence;

   - paradigms tend to be championed by a small group in the face of
     opposition from the larger community, and gain ground very slowly
     until the entire community adopts the new paradigm in a scientific
     revolution.

Kuhn mentions many examples of paradigms, such as the paradigm of
Newtonian mechanics that replaced the Aristotelian paradigm that
preceded it, and which was in turn replaced by the relativistic
paradigm.

How do these characteristics of a paradigm apply to approaches to
quality?

If the way to achieve high quality products and services is not simply
to apply a particular method, technique, tool, or theory, but to adopt a
whole paradigm that includes all these things (along with values,
attitudes, model problem solutions, and so forth), then it is not
surprising that it is difficult to explain, and hard to understand.
Efforts to improve quality by applying pieces of the paradigm as
cookbook solutions will fail -- adopting a paradigm requires much
learning
along with deep changes in perception, attitudes, and actions.

Furthermore, adopting the quality paradigm will be impossible to justify
to those expecting a simple cost-benefit analysis of the sort applied to
buying a new tool or using a new method. The costs of the profound
changes involved in adopting a new paradigm, and their benefits, are
difficult or impossible to predict for a given organization. The costs
and benefits of a new paradigm are clear only after the paradigm has
been tried out for a while (we discuss this important point further
below).

It is hard to achieve higher quality because it requires much effort and
time to absorb and apply a new paradigm. In fact, it is likely that
initial results will be disappointing while the paradigm is being
absorbed, and that positive results will not accrue for several years.
Finally, the success of the quality approach in other industries, like
the manufacturing and service industries, offers hope but not detailed
guidance, because the general approaches and philosophies of the quality
paradigm must be operationalized for software development.

If adopting the quality paradigm is so difficult, why should the
software industry expend effort or time on it? New paradigms replace old
ones when progress is blocked. The quality paradigm has been adopted in
many troubled industries, such as the automobile and consumer
electronics industries, where it has improved profits, market share,
cycle times, customer satisfaction, and so on. The software industry is
in a seemingly endless "software crisis," competition is increasing, and
there are ever louder calls for software production to achieve the
standards of quality, reliability, and productivity characteristic of
other engineering disciplines.

Perhaps the quality paradigm can help. The difficulties of
operationalizing quality principles for software production are
significant, however. It is difficult, for example, for a programmer to
see how to apply the quality paradigm to writing code, or for a manager
to see how to apply standard quality tools, like control charts, to the
software production process.  The work of the Software Engineering
Institute (SEI) on the Capability Maturity Model (CMM) has had enormous
impact on bridging the gap between quality paradigm elements and
software engineering practice. Although structured as a software process
maturity framework derived from Crosby's quality management maturity
grid [9, p. 11], the CMM incorporates all
elements of the quality paradigm and provides details about how to apply
them in developing software. Other writers have shown how to apply the
quality paradigm to software development using other frameworks such as
TQM, quality standards, and quality award competition guidelines.  The
sidebar (below) "Quality Roadmap: Key Readings and Resources in Software
Quality" lists references about both the quality paradigm and how to
apply it to software development.

What evidence is there that the quality paradigm can be effective in the
software industry? Ideally this question could be answered by designing
one or two crucial experiments, running them carefully to obtain
empirically justified results, and replicating them to ensure the
validity of the conclusions. Unfortunately, as Kuhn points out,
paradigms cannot be evaluated on the basis of crucial experiments alone.
A new paradigm must be tried out over a period of time until a
preponderance of evidence shows that it does or does not provide a
better framework for understanding, explanation, and problem solving
than its predecessor.

The software industry has been trying out the quality paradigm for a few
years now, and evidence is beginning to accumulate. With occasional
caveats, all the published evidence is positive. Many negative anecdotes
and war stories about quality paradigm failures are heard, but few are
published, so the record may be biased. Organizations that have given
the quality paradigm a fair shake and have ultimately rejected it (if
there really are any) need to publish their experiences to ensure a
balanced evaluation.  The published evidence is summarized as follows.

  *  IBM's Santa Teresa Labs began an extensive quality improvement
     effort in 1989. By 1993, field defects for six lead products had
     decreased by 46%, and lab service costs had decreased by 20%.
     During the same period, when revenues were declining for IBM as a
     whole, revenues per employee at the Santa Teresa labs increased by
     58%, and customer satisfaction increased by 14% [6].

  *  Hughes Aircraft moved from an assessed SEI CMM level two in 1987 to
     level three in 1990 after improving its development process based
     on the findings of its first assessment. Hughes estimates that
     implementing these recommendations achieved about a 5 to 1 return
     on investment [5].

  *  Raytheon Corporation reports that its process improvement efforts
     achieved a 7.7 to 1 return on investment, doubled productivity,
     decreased rework costs by 75% saving $15.8 million, and moved the
     organization from level one to level three on the SEI CMM scale
     [2].

  *  A more recent report from Raytheon summarizes eight years of
     progress in software quality improvement whose net results include:
     a decrease in the cost of quality from 41% to 21% of project costs,
     a 190% increase in productivity, and a decrease in product trouble
     report rates from 17.2 to 4.0 per thousand lines of delivered code
     [4].

  *  The Oklahoma Air Logistics Center (part of the US Air Force)
     contracted an independent appraisal of the benefits of their CMM-
     based process improvement efforts. The contractor reports a 7.5 to
     1 return on investment, a 90% reduction in defect rates along with
     a tenfold increase in productivity between the first and the last
     of four projects, and a 26% reduction in the average cost of a
     maintenance activity [1].

  *  In six years, Hewlett-Packard reduced delivered software defects
     from 1.0 to 0.1 per thousand lines of source code, and saved $100
     million through process improvement [8].

  *  CMM-like process improvement efforts at Schlumberger have produced
     a variety of positive results, such as improving schedule adherence
     rates from 50% in 1990 to 99% in 1992, while reducing delivered
     defect rates from 0.22 per thousand lines of code to 0.13 per
     thousand lines of code [10].

This special section of Communications adds more empirical evidence
about whether the quality paradigm works in software development. Like
those summarized above, all the reports below are positive:

*  Studies by the SEI on the effectiveness of the CMM show that process
   maturity confers business benefits with few of the drawbacks that
   some had predicted. In one study median productivity gains were 35%
   per year, median post-release defect reductions were 39% per year,
   and the median ratio of benefits to cost was 5 to 1. A second study
   shows that higher-maturity organizations are better able to meet
   schedules and budgets, and have better productivity, product quality,
   and staff morale [Herbsleb et al., this issue].

*  Through the use of a library of 120 reusable processes, PRC
   Incorporated reduced the development time for project-specific
   processes by a factor of 10 to 1. TQM methods at PRC allowed a
   business unit to move from level one to level three on the CMM scale
   in 39 months, where 55 months is the industry average [Hollenbach et
   al., this issue].

*  Using TQM, US West reduced service outages by 79% in one
   organization, and in another reduced billing postage costs by 20-30
   million dollars, reduced billing cycle time by one to two days, and
   reduced service order errors by 50% [Arthur, this issue].

Further evidence should accumulate in the next few years as more
organizations try out the quality paradigm, especially process
improvement based on the SEI's CMM. All the evidence currently in the
literature is from American organizations. Evidence from Europe and
Japan may soon appear as the process improvement focus pioneered by the
SEI is pursued there under the ISO's Software Process Improvement and
Capability dEtermination (SPICE) project [9].

In summary, the quality paradigm is beginning to be more widely
understood, accepted, and applied by software developers. Evidence from
early adopters of the quality paradigm supports the claim that it can
significantly improve return on investment, customer satisfaction,
productivity, and other important measures of software development
success.

References

1. Butler, Kelly L. The economic benefits of software process
   improvement.  Crosstalk, (July 1995), 14-17.

2. Dion, Raymond. Process improvement and the corporate balance sheet.
   IEEE Software 10, 4 (July 1993), 28-35.

3. Gruman, Galen. Management cited in bleak SQA survey. IEEE Software 5,
   3 (May 1988), 102-103.

4. Haley, Thomas J. Software process improvement at Raytheon. IEEE
   Software 13, 6 (November 1996), 33-41.

5. Humphrey, Watts, Snyder, Terry R. and Willis, Ronald R. Software
   process improvement at Hughes Aircraft. IEEE Software 8, 4 (July
   1991), 11-23.

6. Kaplan, Craig, Clark, Ralph and Tang, Victor. Secrets of Software
   Quality: 40 Innovations from IBM. McGraw-Hill, New York, 1994.

7. Kuhn, Thomas. The Structure of Scientific Revolutions, 2nd Edition.
   University of Chicago Press, Chicago, 1970.

8. Myers, Ware. Hard data will lead managers to quality. IEEE Software
   11, 2 (March 1994), 100-101.

9. Paulk, M. C. and Konrad, M. D. An overview of ISO's SPICE project.
   American Programmer 7, 2 (February 1994), 16-20.

10. Wohlwend, Harvey, and Rosenbaum, Susan. Schlumberger's software
   improvement program. IEEE Transactions on Software Engineering 20, 11
   (November 1994), 833-839.

   = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

              Sidebar 1: Elements of the Quality Paradigm

The amorphous nature of the quality paradigm makes it difficult to
describe, and different authors tend to emphasize different aspects. The
quality paradigm has the following major elements.

The Nature of Quality:

   When the quality paradigm was forming early in this century, emphasis
   was still on inspection to achieve quality, so "quality" was
   originally understood to mean conformance to a standard or a
   specification. As the quality paradigm matured and the importance of
   satisfying customers was better appreciated, new characterizations of
   quality were adopted, such as fitness for use. Although these two
   characterizations are still common, recent stress on a rapidly
   changing, highly competitive business environment has led to a new
   definition of quality that stresses the many features of a product or
   service needed to meet the evolving needs of customers. For example,
   the IEEE standard defines quality as "the totality of features and
   characteristics of a product or service that bears on its ability to
   satisfy given needs" [3].

The Process Perspective:

   One of the fundamentally different ways of seeing and solving
   problems in the quality paradigm is to focus on processes rather than
   products, services, mistakes, or errors. The quality paradigm
   includes guidance for recognizing, defining, measuring, analyzing,
   and improving processes. A central value of the quality paradigm is
   that processes must not only be improved, but they must be
   continuously improved.  A simple and well known model for continuous
   process improvement is the Deming cycle, or plan-do-check-act (PDCA)
   cycle, consisting of four activities that are repeated to achieve
   ever higher levels of quality [2].

   Recent interest in the software engineering community in software
   processes strongly reflects this aspect of the quality paradigm. In
   particular, the Software Engineering Institute's Capability Maturity
   Model is centered on an organization's ability to follow, define,
   manage, and improve its processes. The SEI process work is based on
   earlier work by Deming [2] and Juran [4], and the maturity framework
   is based on a similar idea proposed by Crosby [1].

Being Data Driven:

   The quality paradigm is fundamentally an empirical approach to
   problem solving and management, and so it is based on collecting and
   analyzing data. There are heuristics for deciding what data to
   collect, tools for exploring data and using it for problem solving
   and decision making, such as Pareto diagrams, and methods for using
   data to manage and control quality, like statistical process control.
   Benchmarking is a method of quality improvement based on the analysis
   of current best practices. An organization wishing to reduce its code
   defect densities, for example, might try to find those organizations
   having very low defect rates and study their methods for achieving
   them.
   Quality awards like the Malcolm Baldrige National Quality Award also
   provide a mechanism for organizations to benchmark their quality
   systems using the award's scoring system.

Customer Focus:

   Making the customer the ultimate judge of performance, the driver of
   business decisions, and the focus of everyone's attention is both a
   value and a management technique in the quality paradigm. The idea of
   customer focus is also applied inside an organization by asking every
   individual to identify and work with his or her (internal) customer
   to improve quality and productivity.

Defect Elimination:

   Techniques for defect elimination fall into two broad classes:
   defect detection and removal techniques search out and remove
   defects after a product or service is produced; defect prevention
   techniques change processes so they produce fewer defects in
   products or services.

   A basic fact exploited by the quality paradigm is that preventing
   defects is more efficient, economical, and effective than detecting
   and removing them once they have occurred.  Defect detection and
   removal techniques include the activities that dominate software
   validation and verification:  inspection and repair, and test and
   debug. By far the most widely practiced defect elimination technique
   in software development and maintenance is testing and debugging.
   More sophisticated organizations also rely extensively on inspection
   and repair. Some software organizations are turning more to defect
   prevention techniques, such as better tools, methods, and training,
   and improved or reengineered processes.

Managing for Quality:

   Adopting the quality paradigm, and continuously improving processes,
   both require change, which people naturally resist.  Attempts to
   improve quality must be championed by management or they will fail.
   The quality paradigm offers many ways that a management commitment to
   quality can penetrate an organization.

References

1. Crosby, Philip B. Quality is Free: The Art of Making Quality Certain.
   Mentor Books, New York, 1980.

2. Deming, W. Edwards. Out of the Crisis. MIT Press, Cambridge, 1986.

3. IEEE. Software Engineering Standards, Third Edition. IEEE, New York,
   1989.

4. Juran, J. Quality Control Handbook, 4th Edition. McGraw-Hill, New
   York, 1988.

   = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

        Sidebar 2: Quality Roadmap: Key Readings and Resources
                        in Software Quality

The Quality Paradigm:

   AT&T. Statistical Quality Control Handbook. AT&T, New York, 1956.

   Crosby, Philip B. Quality is Free: The Art of Making Quality Certain.
   Mentor Books, New York, 1980.

   Deming, W. Edwards. Out of the Crisis. MIT Press, Cambridge, 1986.

   Imai, Masaaki. Kaizen: The Key to Japan's Competitive Success. Random
   House, New York, 1986.

   Juran, J. Quality Control Handbook, 4th Edition. McGraw-Hill, New
   York, 1988.

Applications of the Quality Paradigm to Software

   Arthur, Lowell Jay. Improving Software Quality: An Insider's Guide to
   TQM. Wiley, New York, 1993.

   Glass, Robert L. Building Quality Software. Prentice Hall, Englewood
   Cliffs, 1992.

   Kaplan, Craig, Clark, Ralph and Tang, Victor. Secrets of Software
   Quality: 40 Innovations from IBM. McGraw-Hill, New York, 1994.

   Oskarsson, O. and Glass, R. L. An ISO 9000 Approach to Building
   Quality Software. Prentice Hall, Upper Saddle River, NJ, 1996.

Software Process Improvement

   Humphrey, Watts. A Discipline for Software Engineering. Addison-
   Wesley, Reading, 1995.

   Humphrey, Watts. Managing the Software Process. Addison-Wesley,
   Reading, 1989.

   Paulk, M. C. and Konrad, M. D. An overview of ISO's SPICE project.
   American Programmer 7, 2 (February 1994), 16-20.

   Software Engineering Institute. The Capability Maturity Model:
   Guidelines for Improving the Software Process. Addison-Wesley,
   Reading, 1995.

Conferences and Workshops

   International Conference on Software Quality.
   wwwsel.iit.nrc.ca/6ICSQ.

   International Software Quality Week. www.soft.com/QualWeek.

   Pacific Northwest Software Quality Conference.
   www.teleport.com/~pnsqc.

Quality and Standards Organizations

   American Society for Quality Control (ASQC): www.asqc.org.

   International Organization for Standardization (ISO):
   www.iso.ch/welcome.html.

   National Institute for Standards and Technology (NIST) (Administrator
   of the Malcolm Baldrige National Quality Award):
   www.nist.gov/director/quality_program.

========================================================================

             SR In Software Magazine Top 500 List For 1997

Software Magazine (published by the Sentry Technology Group) has
included SR in the 1997 Software 500 (July 1997 issue).  Marking the
15th consecutive year of publishing its ranking of the "...best,
brightest and most successful [companies] in the software industry", the
Software 500 includes "...solid providers of enterprise-level
software...".  SR is proud to be included in the 1997 Software Magazine
Top 500 List.

========================================================================

                  TestWorks Applied to the Y2K Problem

Many people have asked how we recommend applying TestWorks technology to
the "Year 2000 (Y2K)" problem, so here is a short overview and summary
of how this works.

We see the Y2K effort as composed of five separate steps (explained
below with an approximate estimate of the proportion of total effort
each requires):

1. Plan (5%): Figure out what to do and schedule resources to get the
   job done.

   Many conventional project management tools -- and there are certainly
   plenty of them around! -- can be used to plan the work effort
   carefully.

2. Identification (10%):  Find the places in the code tree where
   instances of date dependence happen and ferret out what other parts
   of the code tree depend on these passages.

   The tools here involve local syntactic and partial semantic analysis
   of the source code tree for specific date references, or date
   dependence references.  This step need not be highly automated and
   most of the time you can use conventional tools you already have at
   hand.  In other words, finding what to fix involves careful and
   rigorous use of the search functions on program editors combined with
   careful record-keeping.  The main risk here is that a date reference
   or a date dependency that OUGHT to be remedied (modified) is missed.
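   As an illustration only (this is not a TestWorks feature), the kind
   of search involved can be sketched in a few lines of Python.  The
   file glob and the date patterns below are assumptions made for the
   example; a real audit needs patterns tuned to the languages and
   naming conventions of the code base at hand:

```python
import re
from pathlib import Path

# Illustrative patterns only -- a real Y2K audit needs patterns
# tuned to the code base being scanned.
DATE_PATTERNS = [
    re.compile(r"\b(19)?\d{2}[-/]\d{1,2}[-/]\d{1,2}\b"),  # literal dates
    re.compile(r"\b(yy|year|date)\w*\b", re.IGNORECASE),  # date-ish names
]

def scan_tree(root, glob="*.c"):
    """Return (file, line number, text) for each suspicious line."""
    hits = []
    for path in Path(root).rglob(glob):
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            if any(p.search(line) for p in DATE_PATTERNS):
                hits.append((str(path), lineno, line.strip()))
    return hits
```

   Careful record-keeping then amounts to saving and reviewing the
   resulting hit list.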

3. Remediation (25%):  Make the changes to the code tree to assure that
   the NEW software will not have the Y2K problem.

   The programmers love this step and one hopes that the new passages of
   code are "better written and more clearly documented" than the ones
   being replaced.  It is probably good practice to keep the "old code"
   present but commented out -- just as a reference.
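   One widely used remediation technique (not specific to TestWorks) is
   "windowing": two-digit year fields are mapped into a 100-year window
   around a pivot year.  A minimal Python sketch, with the pivot value
   chosen arbitrarily for the example:

```python
# Assumed pivot: two-digit years 00-49 map to 2000-2049,
# years 50-99 map to 1950-1999.
PIVOT = 50

def expand_year(yy):
    """Return the four-digit year for a two-digit year field."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year (0-99)")
    # return 1900 + yy    # old, Y2K-unsafe mapping (kept as reference)
    return 2000 + yy if yy < PIVOT else 1900 + yy
```

   Note the old mapping kept as a comment, per the practice suggested
   above.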

4. Confirmation/Testing (30%):  Confirm that the modifications are
   functioning properly.

   The task here is to create a reproducible record that demonstrates
   that the modified program works correctly across the year-2000
   boundary.

   TestWorks helps through our regression products: CAPBAK creates tests
   that demonstrate how the remediated (modified) program works, and
   SMARTS organizes those tests into a test tree.
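   The shape of such a reproducible record can be sketched without any
   tool at all.  Here the date routine under test is hypothetical; the
   point is the fixed set of checks that straddle the boundary:

```python
import datetime

def next_day(d):
    """Hypothetical remediated date routine under test."""
    return d + datetime.timedelta(days=1)

# A small, reproducible suite straddling the year-2000 boundary.
BOUNDARY_CASES = [
    (datetime.date(1999, 12, 31), datetime.date(2000, 1, 1)),
    (datetime.date(2000, 2, 28), datetime.date(2000, 2, 29)),  # 2000 is a leap year
    (datetime.date(2000, 12, 31), datetime.date(2001, 1, 1)),
]

def run_boundary_suite():
    return all(next_day(before) == after for before, after in BOUNDARY_CASES)
```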

5. Verification (30%):  Verify that the tests used in (4) really *do*
   exercise all of the modified code and, as well, that any code that
   depends on the modified code has also been covered by some dynamic
   test.

   TestWorks helps here through application of the TCAT coverage
   analyzers, available for C, C++, Ada, F77, and COBOL.  (TCAT for Java
   is also available, but we don't expect to see much Java code with a
   Y2K problem!)

   The verification step's main value is to find out what you didn't
   test so that you can add additional cases to your test suite (in 4
   above).  A good target is 90%+ C1 (branch coverage) if you want high
   confidence in the quality of the change.

This sounds simple, and of course it is a very simplified description.
The point is that the main elements of the work are well supported by
tools that handle the hardest parts -- the parts that don't have to be
done manually.

For information about TestWorks and the Y2K problem, and for a copy of
the Y2K Problem Application Note, Email to info@soft.com.

========================================================================

                              Peter Bishop

              Adelard Seminar, 9 July 1997, Manchester, UK

"The Assurance and Development of Safety-Related Computer-Based Systems"

   Editor's Note: Though the date for this event has obviously passed
   (we didn't receive the information in time for the last issue), we
   thought our readers would be interested in the subject matter and in
   how the material is organized.

   The seminar will present different facets of the company's work on
   safety assurance. There will be formal presentations (listed in
   the programme below) together with opportunities to see tool
   demonstrations, browse the Adelard resource centre, and have
   informal discussions with our consultants.

Program

10:00   Registration and coffee

10:20   Introduction and welcome

10:30   Safety cases and support tools
        - safety case methodology for COTS
        - Tool support for safety case and standards
        - Adelard safety resource centre
        - Safety engineering tools

12:15   Lunch

13:45   Practical experience of safety-related software development
        - Use of formal methods in a commercial software development
        - Practical methods for statistical testing to levels 2 and 3
        - Safety case for a level 2 advisory system
        - Proof of level 4 systems for clients

15:30   Tea

15:30   Corporate memory and learning from experience
        - Overview of corporate memory issues
        - Support tools for corporate memory and learning

16:00   Project support
        - Independent safety assessment
        - Dependability programmes

16:30   End of Seminar

Contact Barbara McNamara if you are interested in attending.

   bmm@adelard.co.uk
   tel: 0181-983-1708

Note that the number of available spaces is limited.

--
Peter Bishop, Adelard, 3 Coborn Rd, London E3 2DA, England
Tel  : +44-181-983-0219    Email:        pgb@adelard.co.uk
Fax  : +44-181-983-1845    Web:   http://www.adelard.co.uk

========================================================================

                         Call for Contributions

            7th Annual Ada-Belgium Seminar (Ada-Belgium'97)

                  Developing Distributed Applications

                       Friday, November 28, 1997
                       Trasys, Brussels, Belgium

    http://www.cs.kuleuven.ac.be/~dirk/ada-belgium/events/local.html

Ada-Belgium is soliciting contributions for presentation during its next
Annual Seminar, to be held at Trasys on Friday, November 28, 1997
(tentative date).  Attendees will include industry, government and
university representatives who are active in, and interested in, Ada
software development and management.

This seventh Annual Ada-Belgium Seminar will feature tutorial, paper and
project presentations.  As in previous years, we are preparing a program
with first-class invited speakers, such as John Barnes in 1994, Robert
Dewar in 1995, and Tucker Taft in 1996, and lots of free Ada-related
material, e.g. free Ada CD-ROMs in 1994, 1995 and 1996, copies of the
Ada 95 Reference Manual and Rationale in 1995, copies of the Ada 95
Quality and Style Guide in 1996, etc.

The theme of the Seminar will be "Developing Distributed Applications".

Several approaches to developing distributed applications will be
presented (CORBA, DCE, the Ada Distributed Systems Annex, etc.), as well
as practical experiences, available products, etc., with special
emphasis on the role of Ada 95.

Contributions consistent with the general theme of the Seminar, outlined
below, are hereby invited:

  * Longer presentations giving an overview of one of the approaches
    mentioned above.
  * Shorter experience reports of projects using or trying out one or
    more of the approaches.
  * Short technical presentations of available products.

More general contributions are also welcome, such as on:

  * Management of Ada software development projects, including the
    transition to Ada 95.
  * Experiences with Ada development, including distributed
    applications.
  * Ada technology, including Ada 95 topics.
  * Ada research projects in universities and industry.

Those interested should submit a short abstract (10-15 lines) in
English by July 31, 1997, via e-mail to: ada@belgium.eu.net

Short presentations will get a time-slot of 20-30 minutes.  For longer
presentations, the organizers will work out a schedule with the
authors.

For additional information on the Ada-Belgium'97 Seminar please contact
the Ada-Belgium Board at the e-mail address listed.

Dirk Craeynest
Ada-Belgium Board
ada@belgium.eu.net

========================================================================

             EVALUATING TTN-ONLINE:  GIVE US YOUR COMMENTS

TTN-Online is free and aims to be of service to the larger software
quality and testing community.  To better our efforts we need YOUR
FEEDBACK!

Please take a minute and E-mail us your thoughts about TTN-Online.

Is there enough technical content?

Are there too many or too few paper calls and conference announcements?

Is there not enough current-events information? Too much?

What changes to TTN-Online would you like to see?

We thrive on feedback and appreciate any comments you have.  Simply
address your remarks by E-mail to "ttn@soft.com".

========================================================================

              TTN-Online -- Mailing List Policy Statement

Some subscribers have asked us to prepare a short statement outlining
our policy on use of E-mail addresses of TTN-Online subscribers.  This
issue, and several other related issues about TTN-Online, are available
in our "Mailing List Policy" statement.  For a copy, send E-mail to
ttn@soft.com and include the word "policy" in the body of the E-mail.

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN Online Edition is E-mailed around the 15th of each month to
subscribers worldwide.  To have your event listed in an upcoming issue
E-mail a complete description and full details of your Call for Papers
or Call for Participation to "ttn@soft.com".

TTN On-Line's submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted non-calendar items should not exceed 350 lines
   (about four pages).  Longer articles are OK and may be serialized.
o  Length of submitted calendar items should not exceed 60 lines (one
   page).
o  Publication of submitted items is determined by Software Research,
   Inc. and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters and TTN-Online disclaims any responsibility for their
content.

TRADEMARKS:  STW, TestWorks, CAPBAK, SMARTS, EXDIFF, Xdemo, Xvirtual,
Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, TCAT-PATH, T-
SCOPE and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To SUBSCRIBE to TTN-Online, to CANCEL a current subscription, to CHANGE
an address (a CANCEL and a SUBSCRIBE combined) or to submit or propose
an article, send E-mail to "ttn@soft.com".

TO SUBSCRIBE: Include in the body of your letter the phrase "subscribe
".

TO UNSUBSCRIBE: Include in the body of your letter the phrase
"unsubscribe ".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                         901 Minnesota Street
                     San Francisco, CA  94107 USA

              Phone:          +1 (415) 550-3020
              Toll Free:      +1 (800) 942-SOFT (USA Only)
              FAX:            +1 (415) 550-3030
              E-mail:         ttn@soft.com
              WWW URL:        http://www.soft.com

                               ## End ##