                     sss ssss      rrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr


         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======              March 1995             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-Mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the world software testing commun-
ity.

(c) Copyright 1995 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.

TRADEMARKS:  Software TestWorks, STW, STW/Regression, STW/Coverage,
STW/Advisor, X11 Virtual Display System, X11virtual and the SR logo are
trademarks of Software Research, Inc.  All other systems are either
trademarks or registered trademarks of their respective companies.

========================================================================

INSIDE THIS ISSUE:

   o  CONFERENCE ANNOUNCEMENT
      EIGHTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW95)

   o  SPECIAL ISSUE REVIEW:
      OBJECT-ORIENTED SOFTWARE TESTING (Part 1 of 3)
      by Edward F. Miller, President, Software Research, Inc.

   o  CALL FOR PAPERS
      AQuIS'96

   o  AUTOMATED TOOL SUPPORT FOR ANSI/IEEE STD. 829-1983
      SOFTWARE TEST DOCUMENTATION (Part 1 of 3)
      by Harry M. Sneed, Germany

   o  CALENDAR OF EVENTS

   o  TTN SUBMITTAL POLICY

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

************************************************************************
           EIGHTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW95)
************************************************************************

                       30 May 1995 -- 2 June 1995
            Sheraton Palace Hotel, San Francisco, California
             Conference Theme: The Client-Server Revolution

QW '95 is the premier technological conference of its kind, combining
the newest applications, technology, and management techniques. Software
Quality Week, now in its eighth year, focuses on advances in client/
server technologies, software test technology, quality control, software
test process, managing OO integration, software safety, and test automa-
tion. Quality Week '95 offers an exchange of information between academ-
icians and practitioners that no other conference can provide.

The Client/Server Revolution is sweeping all of computing, changing the
way we think about organizing complex systems, how we develop and test
those systems, and changing our approach to quality control questions
for multi-user, multi-platform, heterogeneous environments. At the same
time, the Client/Server Revolution is forcing a closer look at critical
development strategies, at how we think about software testing, and at
the methods and approaches we use to get the job done. The Eighth Inter-
national Software Quality Week covers advances in software analysis and
review technologies, along with formal methods and empirical strategies
for large-scale as well as small-scale projects. Quality Week '95 gives
you the competitive edge to dominate your industry.

PROGRAM DESCRIPTION
^^^^^^^^^^^^^^^^^^^
The Pre-Conference Tutorial Day offers expert insights on ten key topic
areas.  The Keynote presentations give unique perspectives on trends in
the field and recent technical developments in the community, and offer
conclusions and recommendations to attendees.

The General Conference offers four-track presentations, mini-tutorials
and a debate:

Technical Track. Topics include:
      Class testing
      Deep Program Analysis
      Test Oracles
      Novel GUI Approaches, and more...

Applications Track. Topics include:
      Real-world experiences
      Novel tools
      User-Level analysis, and more...

Management Track. Topics include:
      Automatic tests
      Process experience
      Team approaches
      Managing OO integration, and more...

Vendor Track: Selected vendors present their products and/or services to
guide the testing process. The vendor track is specifically reviewed for
technical content -- no high-pressure sales pitches are allowed; come to
learn, not to be sold!

A two-day Tools Expo brings together leading suppliers of testing solu-
tions.

Mini-Tutorial: Explore the pros and cons of outsourcing software test-
ing.

Debate: Examine one of today's hottest topics, Model-Checking and the
Verification of Concurrent Programs, and listen to the experience of
experts from Carnegie Mellon University in Pittsburgh, Pennsylvania,
Trinity College of Dublin, Ireland, Oxford University, Oxford, England,
and Universite de Liege, Belgium.

WHO SHOULD ATTEND
^^^^^^^^^^^^^^^^^
o  Lead senior quality assurance managers looking for powerful mainte-
   nance and testing techniques and an opportunity to evaluate today's
   tools.

o  All quality assurance and testing specialists, beginners and experts
   alike, who need exposure to authoritative sources for improving soft-
   ware test technology.

o  Programmers and developers who want to learn more about producing
   better quality code.

o  Maintenance technicians looking for techniques that control product
   degradation.

o  Technologists who want to catch up on the state-of-the-art techniques
   in software testing, quality assurance and quality control.

SPECIAL FEATURES
^^^^^^^^^^^^^^^^
The conference provides attendees with:

o  State-of-the-art information on software test methods.

o  Analysis of effectiveness through case studies and real-world experi-
   ences.

o  The latest developments in the software testing world presented by
   the industry's leading practitioners and researchers.

o  Identification of the techniques that have been the most and the
   least successful.

o  Access to vendors of significant, related technology.

o  The available tools and services.

o  Networking: informal discussions with other attendees on common
   interests and concerns.

Quality Week '95 is sponsored by
Software Research, Inc.
San Francisco, California

INTERNATIONAL ADVISORY BOARD
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Boris Beizer, ANALYSIS, Inc.
William Bently, Bayer Corp.
Antonia Bertolino, IEI-CNR, Italy
Robert Binder, System Consulting, Inc.
Robert Birss, SunSoft
Michael Dyer, Dycon Systems
Walter Ellis, Software Process and Metrics
Marie-Claude Gaudel, Universite de Paris, France
Carlo Ghezzi, Politecnico di Milano, Italy
Dick Hamlet, Portland State University
Mary Jean Harrold, Clemson University
William Howden, University of California, San Diego
Micheal Mac-an-Airchinnigh, University of Dublin, Ireland
Edward Miller (Program Chairman), Software Research, Inc.
John Musa, AT&T Bell Laboratories
Tom Ostrand, Siemens Corporate Research
Norman F. Schneidewind, Naval Postgraduate School
Keith Stobie, Informix
William Wolters, AT&T Bell Laboratories
James Woodcock, Oxford University, England


REGISTRATION FOR QUALITY WEEK
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
REGISTRATION: Please pay by check or with your Company Purchase Order.
The entire Conference Fee is payable prior to the program. Make checks
payable to SR Institute, Inc. Registration is accepted up to the time of
the meeting; on-site registration begins at 7:00 a.m., subject to space
availability. There is no cancellation fee before 5 May 1995; a $125
service charge applies after that date. Call the registrar to obtain
your cancellation number.

FEES: Registration includes all materials, Conference Lunches, Refresh-
ments, and an invitation to the Cocktail Party.

Registered & Paid       Before          After           Group Rates
                        28 April        28 April
Tutorial Day            $300            $350            no discount
3-Day Conference        $750            $850            10% discount
COMBINED                $950            $1050           10% discount

SAVE: Send your team of software testing specialists and benefit from
the reduced group rate. If you register two or more representatives at
one time, you may deduct 10% of the fee for each attendee from the
Conference or COMBINED price only.

CONFERENCE HOTEL: Quality Week will be held at the luxurious landmark
Sheraton Palace Hotel, San Francisco, CA, located in the very heart of
the downtown business district. The Sheraton Palace has welcomed vaca-
tioners and business persons with its famous hospitality. Enjoy the best
in facilities, restaurants, clubs, theaters, shops, and points of
interest.

Please complete and mail form together with your check or purchase order
to:

--------------------------cut here--------------------------------------
SR Institute
901 Minnesota Street
San Francisco, CA 94107 USA

Or request information through e-mail: qw@soft.com
Or FAX Your Registration: [+1] (415) 550-3030

Please Type or Print:

Name: __________________________________________________________________
Title: _________________________________________________________________
Company: _______________________________________________________________
Street: ________________________________________________________________
City: __________________________________________________________________
State or Province: _____________________________________________________
ZIP or Postal Code: ____________________________________________________
Country: _______________________________________________________________
Phone: _________________________________________________________________
FAX: ___________________________________________________________________

Note: Please copy this form for multiple registrations.

Please Check One:
[ ] Tutorials
[ ] 3-Day Conference
[ ] Tutorials and Conference COMBINED

[ ] Check Enclosed      [ ] P.O. Number Enclosed

========================================================================

                         SPECIAL ISSUE REVIEW:
                    OBJECT-ORIENTED SOFTWARE TESTING

                              Part 1 of 3

Note: These are reviews and commentary on a special section of the Com-
munications of the ACM devoted to Object-Oriented Software Testing (C.
ACM, Vol. 37, No. 9, September 1994, p. 30ff).

The September 1994 issue of the ACM magazine, COMMUNICATIONS OF THE
ACM, included a special section devoted to Object-Oriented Software
Testing.  The six articles were:

o  "Object Oriented Integration Testing" by Paul C. Jorgensen and Carl
   Erickson
o  "Experiences with Cluster and Class Testing" by Gail C. Murphy, Paul
   Townsend, and Pok Sze Wong
o  "Automated Testing from Object Models" by Robert M. Poston
o  "Integrating Object-Oriented Testing and Development Processes" by
   John D. McGregor and Timothy D. Korson
o  "Testing `In A Perfect World'" by Thomas R. Arnold and William A.
   Fuson
o  "Design for Testability in Object-Oriented Systems" by Robert V.
   Binder
        o       o       o       o       o       o       o

                 "Object-Oriented Integration Testing,"
                by Paul C. Jorgensen and Carl Erickson
           (C. ACM, Vol. 37, No. 9, September 1994, p. 30ff).

The main issue, according to Jorgensen and Erickson, is how to relate
structure to behavior.  Applying current technologies to object-
oriented integration testing makes the problem a process issue as well.
So the article devotes a lot of space to the history of and details
about various processes, but focuses on stimulus-response models.  The
key problem, they say, is to contend with dynamic binding in a less
specific way than for regular static programs -- possibly by doing
something akin to handling the dynamics of "declarative programs" (a la
some formal correctness work).  Inevitably this leads to fundamentals:
the need to deal with path-based sequences of actions, and the need to
subset such sequences to take the object class hierarchy into account.

Path, or path-like, verification of main properties works fine, and
they work an ATM (Automated Teller Machine) model in some detail to
demonstrate their points.  The good news is that the conjecture that
existing techniques can be "lifted" to the object-oriented context
turns out to be true, and that the process is relatively straightfor-
ward as well.
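
The dynamic-binding problem is easy to see in a small example.  The C++
sketch below is our own illustration, not taken from the article, and
the Account hierarchy and all names in it are hypothetical.  It shows
why a single source-level call site hides one execution path per con-
crete class, so that path-based test sequences must be subset against
the class hierarchy:

    #include <cassert>

    // Hypothetical account hierarchy, loosely in the spirit of the
    // authors' ATM example.
    class Account {
    public:
        virtual ~Account() = default;
        // Dynamically bound: which body runs at the call site below
        // depends on the runtime type, not on the source text.
        virtual double withdrawalLimit() const { return 300.0; }
    };

    class CheckingAccount : public Account {
    public:
        double withdrawalLimit() const override { return 500.0; }
    };

    class SavingsAccount : public Account {
    public:
        double withdrawalLimit() const override { return 200.0; }
    };

    // One call site, but one execution path per concrete class.
    bool canWithdraw(const Account& a, double amount) {
        return amount <= a.withdrawalLimit();
    }

    int main() {
        CheckingAccount checking;
        SavingsAccount savings;
        // Exercising the same call once per runtime type covers the
        // hierarchy-specific paths.
        assert( canWithdraw(checking, 400.0));  // limit is 500
        assert(!canWithdraw(savings,  400.0));  // limit is 200
        return 0;
    }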

             "Experiences with Cluster and Class Testing,"
           by Gail C. Murphy, Paul Townsend, and Pok Sze Wong
           (C. ACM, Vol. 37, No. 9, September 1994, p. 39ff).

This paper is an instance of what there is too little of in the commun-
ity: a cogent exposition of a real-world experience.  Here the example
is of TRACS (Trouble Advisor for Customer Services), which was tested
using home-brew testware to automate execution of scripts against the
object classes making up the product.  The main thing they found was
that they got the best benefit by running tests in clusters, i.e. affi-
liated groups of tests of the object classes.

Most of the paper (by volume) is focused on intimate details of how the
tooling was done, and much of this appears to be pretty specific to the
application.  It would have been nicer if we had been made privy to some
of the details of the kind (and number?) of defects found, and/or the
amount of savings they obtained through their automated approach.  Even
so, the paper makes a good case for automation and is recommended as an
enthusiasm-builder for any automated testing project.
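
As an illustration of the cluster idea (the sketch is ours; the paper
does not present the TRACS testware at code level, and all names here
are hypothetical), a cluster can be as simple as a named group of
affiliated class-level tests that are always executed as a unit:

    #include <cstdio>
    #include <functional>
    #include <string>
    #include <vector>

    // A cluster: an affiliated group of class-level tests that are
    // executed together.  Each test returns true on success.
    struct Cluster {
        std::string name;
        std::vector<std::function<bool()>> tests;
    };

    // Hypothetical class-level tests for two related classes.
    bool testQueueBasics()   { return true; }
    bool testQueueOverflow() { return true; }

    int runCluster(const Cluster& c) {
        int failures = 0;
        for (const auto& t : c.tests)
            if (!t()) ++failures;
        std::printf("cluster %s: %d of %zu tests failed\n",
                    c.name.c_str(), failures, c.tests.size());
        return failures;
    }

    int main() {
        Cluster queue{"queue", {testQueueBasics, testQueueOverflow}};
        return runCluster(queue);
    }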

-Edward F. Miller

                         o   o   o   o   o   o

                            End Part 1 of 3
                To Be Continued in April 1995 TTN/Online

========================================================================

                            CALL FOR PAPERS
                                AQuIS'96
    Third International Conference on Achieving Quality in Software
                    Florence, January 24 - 26, 1996

                    Organized by QUALITAL, IEI-CNR
                       in cooperation with CESVIT
                        Sponsored by IFIP WG5.4

General Chair:     Giacomo Bucci   - University of Florence - I
Program Chair:     Sandro Bologna - ENEA - I
Program co-Chairs: Motoei Azuma   - Waseda University - Japan
                   Edward Miller  - Software Research - USA

Organization Chair: Piero De Risi  - QUALITAL - I

The objective of the AQuIS'96 series is to provide a platform for
technology and knowledge transfer between academia, industry and
research institutions in the software quality field.

The final program will include presentations of research papers and
experience reports, invited talks and state-of-the-art reports in the
field of software quality.  In addition, AQuIS'96 will try for the
first time to address specifically the topics of knowledge-based system
(KBS) quality and the impact of object-oriented technology on quality.
The development of KBS technology has created a strong interest in
applying this technology to critical applications.  There is a growing
interest in the problem of assuring KBS quality, and some of the good
ideas from conventional software quality could be transferred to KBS
quality.  Object-oriented technology has pervaded the entire field of
software engineering, changing the way in which programs are specified
and developed.  While much research has been performed on object-
oriented programming, little has been said about its impact on quality.

Submissions are invited in the following areas:
o  Why so little progress in software quality engineering?
o  Why so little progress in quantitative evaluation of quality
   attributes?
o  Industrial interest in software quality.
o  Software quality vs.  software development process and tools.
o  Software quality vs.  software engineers' skills and management.
o  Software quality vs.  formal methods.
o  Software quality vs.  empirical methods.
o  Conventional software systems verification, validation, testing.
o  Issues in KBS quality, verification, validation and testing.
o  KBS quality vs.  conventional software systems quality.
o  Impact of object-oriented programming on software quality.
o  Quality of object-oriented systems.
o  Design for quality.
o  Quality management.
o  Quality standards.
o  Tools for managing quality.

Four copies (in English) of original work, limited to 12 pages (5000
words), must reach the Conference Secretariat before April 3rd, 1995.
The best papers may be selected for publication in a special issue of
*The Journal of Systems and Software* (Elsevier Science Inc., New York).

** IMPORTANT DATES **
   April 3rd, 1995          Final paper submission deadline
   June 30th, 1995          Notification of final acceptance
   September 1st, 1995      Camera Ready Copy due

** Conference Secretariat/Contact Address **
   Marcello Traversi      CESVIT
   Palazzina Lorenese, Viale Strozzi 1
   50129, Firenze, Italy
   Tel.: +39-55-46190 ---- Fax.: +39-55-485345
   e-mail: AQUIS96@AGUIRRE.ING.UNIFI.IT
           TRAVERSI@AGUIRRE.ING.UNIFI.IT


========================================================================

           AUTOMATED TOOL SUPPORT FOR ANSI/IEEE STD. 829-1983
                      SOFTWARE TEST DOCUMENTATION
                       by Harry M. Sneed, Germany

                             (Part 1 of 3)

(Editor's note:  This article will appear in three parts, in this and
the following two issues of the TTN/Online Edition.)

Introduction

The ANSI/IEEE Standard for Software Test Documentation calls for the
production of a series of documents which verify that the testing pro-
cess has been carried out properly and that the test objectives have
been met. Without automated tool support the costs of such test documen-
tation are prohibitive in all but the most trivial projects.

This paper describes a test system which provides such a service. It
begins with a test plan frame as a master class, from which the class
test design is then derived. From it, various test procedure classes
are generated which serve to generate the individual objects: test
cases specified in the form of pre- and post-condition assertions to be
executed in test suites.
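
As a minimal sketch of this style (our illustration; the article does
not show the generated code, and the Counter class is hypothetical),
each test case asserts its precondition, exercises the object, then
asserts its postcondition, and the suite executes the cases in order:

    #include <cassert>
    #include <cstdio>

    // Hypothetical object under test.
    struct Counter {
        int value = 0;
        void increment() { ++value; }
    };

    // A test case in the pre-/post-condition style: assert the pre-
    // condition, exercise the object, assert the postcondition.
    void testIncrement() {
        Counter c;
        assert(c.value == 0);   // precondition
        c.increment();
        assert(c.value == 1);   // postcondition
    }

    int main() {
        // The test suite executes the cases in sequence.
        testIncrement();
        std::puts("suite passed");
        return 0;
    }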

ANSI/IEEE Standard 829 "Software Test Documentation" calls for the
production of a set of documents to ensure the quality of software
testing (1). The ISO-9000 Standard refers to the ANSI Standards as a
basis for test documentation (2). Any organization seeking certifica-
tion by the American Software Engineering Institute (SEI) must provide
a minimum subset of the documents specified (3). It has now become
obvious that the test documents required by Standard 829 will be a
prerequisite to almost any certification process for software
producers.

Required Test Documents

The 8 test documents required by the ANSI standard are a

o  Testplan,
o  Test Design Specification,
o  Testcase Specification,
o  Test Procedure Specification,
o  Test Item Transmittal Report,
o  Test Log,
o  Test Incident Report, and
o  Test Summary Report.

The contents of the Testplan have been specified in detail. The Intro-
duction describes the objectives, the background, the scope and the
references. The subsequent sections define the test objects, the func-
tions to be tested, the functions not to be tested, the test approach,
the test environment, the test acceptance criteria, the test results,
the test activities, the test requirements, the test responsibilities,
the personnel, the deadlines, the risks and the contingency measures.

The testplan is an organizational document which cannot be automati-
cally created. However, it is entirely possible to set up a frame text
which can be copied, and a prompting editor which can lead the user
through the enhancement of the frame text by presenting him with
alternatives for particular decisions. The standard test plan frame is
a classic example of reusing text structures.
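
A toy version of this idea (entirely our own sketch; the paper does not
show its editor, and the <PLACEHOLDER> syntax is an assumption) reads a
frame text containing placeholders and prompts the user once for each:

    #include <iostream>
    #include <string>

    int main() {
        // A tiny frame text; real test plan frames would be read from
        // a file.  Assumes well-formed <KEY> placeholders and answers
        // that contain no '<'.
        std::string frame =
            "Test Plan for <PROJECT>\n"
            "Test approach: <APPROACH>\n";

        // Prompt for each placeholder and substitute the answer.
        std::string::size_type open;
        while ((open = frame.find('<')) != std::string::npos) {
            std::string::size_type close = frame.find('>', open);
            std::string key = frame.substr(open + 1, close - open - 1);
            std::cout << "Enter " << key << ": ";
            std::string answer;
            std::getline(std::cin, answer);
            frame.replace(open, close - open + 1, answer);
        }
        std::cout << frame;   // the completed test plan skeleton
        return 0;
    }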

The test design specification outlines the requirements of the test. In
particular, it identifies the features or general functions to be
tested, details the test approach, proposes a rationale for the defini-
tion of test cases and establishes the pass/fail criteria. In effect, it
provides a test baseline.

The Testcase Specification describes precisely what is to be tested.
This can be a very voluminous document, similar to a software require-
ments document. It requires an identification of each test case, a
description of test items, a reference to the functions to be tested,
the inputs and the expected outputs as well as the test case dependen-
cies. To really prepare it properly one would need a formal language to
express the features of test cases, i.e. a test specification language.
Since there is no standard for such a language, each user is obliged to
define his own. This lack of a common standard test language is cer-
tainly an obstacle to implementing the ANSI/IEEE Standard.
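
In the absence of a standard test specification language, even a
simple structured record can carry the required information. The C++
sketch below (our own; the record layout and all field values are
hypothetical) captures the items the standard asks for:

    #include <string>
    #include <vector>

    // One possible record layout for the fields ANSI/IEEE Std. 829
    // requires of a test case specification.
    struct TestCaseSpec {
        std::string id;                     // test case identifier
        std::string item;                   // test item description
        std::string functionRef;            // function to be tested
        std::vector<std::string> inputs;    // input specifications
        std::vector<std::string> expected;  // expected outputs
        std::vector<std::string> dependsOn; // test case dependencies
    };

    int main() {
        // A hypothetical entry in the style of the standard.
        TestCaseSpec tc{
            "TC-017", "withdrawal dialog", "Account::withdraw",
            {"amount=400", "type=savings"},
            {"request rejected", "balance unchanged"},
            {"TC-016"}};
        return tc.id.empty() ? 1 : 0;
    }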

The Test Procedure Specification is a document for describing how the
test cases are to be executed. The test scenario and the test suites
are prescribed here, but as in the case of the testcase specification,
their contents are open to interpretation. This stems from the fact
that the test execution is highly dependent upon the test environment
and the type of system being tested. Realtime systems require a totally
different test procedure than do business systems, and testing PC soft-
ware is a totally different matter than testing process control systems
or transaction processing systems on a mainframe. As a result, each
organization will have to maintain diverse classes of test procedures,
one for each environment and target system type.

The Test Item Transmittal Report is a list of all those configuration
items to be submitted to testing. It includes not only the modules and
procedures but also the files, databases, panels, and other data items.
In addition, it identifies the drivers, stubs, generators, auditors and
other test aids to be used. Each test item is identified by name, ver-
sion and author, i.e. the responsible person, and its current status is
given.

The Test Log is intended as an audit of what occurs during test execu-
tion. It should record the test cases or transactions processed, the
test paths traversed, the test coverage, and the intermediate as well
as the final results. It is important as a means of documenting the
quantity and quality of the test, in addition to the test effort.

The Test Incident Report is a type of problem report. It records all
problems which occur during testing: errors, faults, discrepancies and
any other behavior which differs from that expected. The tester is
requested to record each incident with time, testcase, results and
accompanying circumstances. The test incident report is used as a
basis for quality assurance, defect analysis and, of course, error
correction. It can easily be standardized; samples are provided by the
IEEE standard.

The final document is the Test Summary Report. It assesses the
comprehensiveness of the testing process, summarizes the test results
and documents the test effort. It is intended as information for
management evaluation, but it can also serve as a basis for a
cost/benefit analysis and an overall quality assessment.

These 8 documents are, in their totality, every bit as extensive as the
software documentation itself. Thus, producing them requires an effort
equivalent to that of producing the development documentation. This
means doubling software documentation costs.

                    o    o    o    o    o    o    o

                             End of Part 1
                To Be Continued in April 1995 TTN/Online

========================================================================
---------------------->>>  CALENDAR OF EVENTS  <<<----------------------
========================================================================

The following is a partial list of upcoming events of interest.  ("o"
indicates Software Research will participate in these events.)

   +  March 19-23: ObjectWorld '95
      Hynes Convention Center
      Boston, Massachusetts, USA
      Contact: IDG World Expo
      tel: 800-225-4698

   +  April 5-7: 4th European Workshop on Software Process Technology
      7th Conference on Software Engineering Environments
      Netherlands
      Contact: Prof. Dr. Gregor Engels
      tel: +31 71-277063

   +  April 9-14: 7th Annual Software Technology Conference
      Salt Lake City, Utah, USA
      Contact: Dana Dovenbarger, Lynne Wade
      tel: 801-777-7411 DSN 458-7411
      fax: 801-777-8069 DSN 458-8069
      email: dovenbar@oodis01.hill.af.mil
             wadel@hillwpos.hill.af.mil

   +  April 24-28: International Conference on Software Engineering
      (ICSE'95)
      Westin Hotel, Seattle, Washington, USA
      Contact: Dr. Dewayne Perry
      tel: 908-582-2529
      fax: 908-582-7550
      email: dep@research.att.com

   o  May 30 - June 2: Eighth International Software Quality Week (QW95)
      Sheraton Palace Hotel, San Francisco, CA, USA
      Contact: Rita Bral
      tel: [+1] (415) 550-3020
      fax: [+1] (415) 550-3030
      email: qw@soft.com


========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is forwarded on the 15th of each month to sub-
scribers via InterNet.  To have your event listed in an upcoming issue,
please e-mail a description of your event or Call for Papers or Partici-
pation to "ttn@soft.com".  The TTN On-Line submittal policy is as fol-
lows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted items should not exceed 68 lines (one page).
o  Publication of submitted items is determined by Software Research,
   Inc., and may be edited as necessary.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To request a FREE subscription or submit articles, please send E-mail to
"ttn@soft.com".  For subscriptions, please use the keywords "Request-
TTN" or "subscribe" in the Subject line of your E-mail header.  To have
your name added to the subscription list for the quarterly hard-copy
version of the TTN -- which contains additional information beyond the
monthly electronic version -- include your name, company, and postal
address.

To cancel your subscription, include the phrase "unsubscribe" or
"UNrequest-TTN" in the Subject line.

Note:  To order back copies of the TTN On-Line (August 1993 onward),
please specify the month and year when E-mailing requests to
"ttn@soft.com".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                          901 Minnesota Street
                      San Francisco, CA 94107 USA

                         Phone: (415) 550-3020
                       Toll Free: (800) 942-SOFT
                          FAX: (415) 550-3030
                          E-mail: ttn@soft.com

                               ## End ##