                   sss ssss          rrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr


         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======              April 1995             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-Mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the world software testing commun-
ity.

(c) Copyright 1995 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.

TRADEMARKS:  Software TestWorks, STW, STW/Regression, STW/Coverage,
STW/Advisor, X11 Virtual Display System, X11virtual and the SR logo are
trademarks of Software Research, Inc.  All other systems are either
trademarks or registered trademarks of their respective companies.

========================================================================

INSIDE THIS ISSUE:

   o  CONFERENCE ANNOUNCEMENT
      EIGHTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW95)

   o  SPECIAL ISSUE REVIEW:
      OBJECT-ORIENTED SOFTWARE TESTING (Part 2 of 3)
      by Edward F. Miller, President, Software Research, Inc.

   o  AUTOMATED TOOL SUPPORT FOR ANSI/IEEE STD. 829-1983
      SOFTWARE TEST DOCUMENTATION (Part 2 of 3)
      by Harry M. Sneed, Germany

   o  EDITORIAL BY EDWARD F. MILLER:
      WHY EVALUATIONS ARE CONSIDERED ESSENTIAL

   o  KRAZY KONTEST

   o  CALENDAR OF EVENTS

   o  TTN SUBMITTAL POLICY

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

************************************************************************
           EIGHTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW95)
************************************************************************

                       30 May 1995 -- 2 June 1995
            Sheraton Palace Hotel, San Francisco, California
             Conference Theme: The Client-Server Revolution

QW '95 is the premier technological conference of its kind, combining
the newest applications, technology, and management techniques. Software
Quality Week, now in its eighth year, focuses on advances in client/
server technologies, software test technology, quality control, software
test process, managing OO integration, software safety, and test automa-
tion. Quality Week '95 offers an exchange of information between academ-
icians and practitioners that no other conference can provide.

The Client/Server Revolution is sweeping all of computing, changing the
way we think about organizing complex systems, how we develop and test
those systems, and changing our approach to quality control questions
for multi-user, multi-platform, heterogeneous environments. At the same
time, the Client/Server Revolution is forcing a closer look at critical
development strategies, at how we think about software testing, and at
the methods and approaches we use to get the job done. The Eighth
International Software Quality Week covers advances in software analysis
and review technologies, along with formal methods and empirical
strategies for large-scale as well as small-scale projects. Quality Week
gives you the competitive edge you need to dominate your industry.

PROGRAM DESCRIPTION
^^^^^^^^^^^^^^^^^^^
The Pre-Conference Tutorial Day offers expert insights on ten key topic
areas.  The Keynote presentations give unique perspectives on trends in
the field and  recent technical developments in the community, and offer
conclusions and recommendations to attendees.

The General Conference offers four-track presentations, mini-tutorials
and a debate:

Technical Track. Topics include:
      Class testing
      Deep Program Analysis
      Test Oracles
      Novel GUI Approaches, and more...

Applications Track. Topics include:
      Real-world experiences
      Novel tools
      User-Level analysis, and more...

Management Track. Topics include:
      Automatic tests
      Process experience
      Team approaches
      Managing OO integration, and more...

Vendor Track: Selected vendors present their products and/or services to
guide the testing process. The vendor track is specifically reviewed for
technical content -- no high-pressure sales pitches are allowed; come to
learn, not to be sold!

A two-day Tools Expo brings together leading suppliers of testing solu-
tions.

Mini-Tutorial: Explore the pros and cons of outsourcing software test-
ing.

Debate: Examine one of today's hottest topics, Model-Checking and the
Verification of Concurrent Programs, and listen to the experience of
experts from Carnegie Mellon University in Pittsburgh, Pennsylvania,
Trinity College of Dublin, Ireland, Oxford University, Oxford, England,
and Universite de Liege, Belgium.

WHO SHOULD ATTEND
^^^^^^^^^^^^^^^^^
o  Lead senior quality assurance managers looking for powerful mainte-
   nance and testing techniques and an opportunity to evaluate today's
   tools.

o  All quality assurance and testing specialists, beginners and experts
   alike, who need exposure to authoritative sources for improving soft-
   ware test technology.

o  Programmers and developers who want to learn more about producing
   better quality code.

o  Maintenance technicians looking for techniques that control product
   degradation.

o  Technologists who want to catch up on the state-of-the-art techniques
   in software testing, quality assurance and quality control.

Quality Week '95 is sponsored by
Software Research, Inc.
San Francisco, California

REGISTRATION FOR QUALITY WEEK
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
REGISTRATION: Please pay by check or with your Company Purchase Order.
The entire Conference Fee is payable prior to the program. Make checks
payable to SR Institute, Inc. Registration is accepted up to the time of
the meeting; on-site registration begins at 7:00 a.m., subject to space
availability. No cancellation fee until 5 May 1995; a service charge of
$125 after 5 May 1995 applies. Call the registrar to obtain your cancel-
lation number.

FEES: Registration includes all material, Conference Lunches, Refresh-
ments and invitation to the Cocktail Party.

Registered & Paid     Before 28 April    After 28 April    Group Rates
Tutorial Day          $300               $350              no discount
3-Day Conference      $750               $850              10% discount
COMBINED              $950               $1050             10% discount

SAVE: Send your team of software testing specialists and benefit from
the reduced group rate. If you register two or more representatives at
one time, you may deduct 10% of the fee for each attendee from the
Conference or COMBINED price only.

CONFERENCE HOTEL: Quality Week will be held at the luxurious landmark
Sheraton Palace Hotel, San Francisco, CA, located in the very heart of
the downtown business district. The Sheraton Palace has welcomed vaca-
tioners and business persons with its famous hospitality. Enjoy the best
in facilities, restaurants, clubs, theaters, shops, and points of
interest.

Please complete and mail form together with your check or purchase order
to:

--------------------------cut here--------------------------------------
SR Institute
901 Minnesota Street
San Francisco, CA 94107 USA

Or request information through e-mail: qw@soft.com
Or FAX Your Registration: [+1] (415) 550-3030

Please Type or Print:

Name: __________________________________________________________________
Title: _________________________________________________________________
Company: _______________________________________________________________
Street: ________________________________________________________________
City: __________________________________________________________________
State or Province: _____________________________________________________
ZIP or Postal Code: ____________________________________________________
Country: _______________________________________________________________
Phone: _________________________________________________________________
FAX: ___________________________________________________________________

Note: Please copy this form for multiple registrations.

Please Check One:
[ ] Tutorials
[ ] 3-Day Conference
[ ] Tutorials and Conference COMBINED

[ ] Check Enclosed      [ ] P.O. Number Enclosed

========================================================================

                         SPECIAL ISSUE REVIEW:
                    OBJECT-ORIENTED SOFTWARE TESTING

                              Part 2 of 3

Note: These are reviews and commentary on a special section of the Com-
munications of the ACM devoted to Object-Oriented Software Testing (C.
ACM, Vol. 37, No. 9, September 1994, p. 30ff).

The September Edition of the ACM magazine, COMMUNICATIONS OF THE ACM,
was devoted to Object-Oriented Software Testing.  The six articles were:
"Object Oriented Integration Testing" by Paul C. Jorgensen and Carl
Erickson; "Experiences with Cluster and Class Testing" by Gail C.  Mur-
phy, Paul Townsend, and Pok Sze Wong; "Automated Testing from Object
Models" by Robert M. Poston; "Integrating Object-Oriented Testing and
Development Processes" by John D. McGregor and Timothy D. Korson; "Test-
ing `In A Perfect World'" by Thomas R. Arnold and William A. Fuson; and
"Design for Testability in Object-Oriented Systems" by Robert V.
Binder.

        o       o       o       o       o       o       o

                "Automated Testing from Object Models,"
                          by Robert M. Poston
           (C. ACM, Vol. 37, No. 9, September 1994, p. 48ff).

Bob Poston, a long-time contributor to software test technology and the
architect of the ``T'' product, provides a high level view of the soft-
ware life cycle with, and without, automated testing added into object
management technology.  The key is to bridge design information into
test information and Poston's way of doing this is to use ``T'' as the
bridge-ware.  The example worked through in the article is a good one,
illustrative of some of the most-critical issues but small enough to
work didactically.  What's lacking is a description of errors -- either
in the design or in the implementation -- that were detected by the
technique.  And, the cost justifications, while interesting, are --
perhaps like many cost justifications in CASE/CAST -- shaky and based on
too-optimistic assumptions.

    "Integrating Object-Oriented Testing and Development Processes,"
               by John D. McGregor and Timothy D. Korson
           (C. ACM, Vol. 37, No. 9, September 1994, p. 59ff).

This paper takes a rather different view of how to integrate OO technol-
ogy, testing technology, and ``development processes'', this time based
on the ``System Architect's Synthesis'' (SASY) model, which differs from
the conventional waterfall model in that it includes (realistically
enough) a multitude of inter-stage feedback loops.  A lot of space is
devoted to elaborating -- one more time, one time too many? -- all of
the different types of OO approaches.  Finally it gets around to key
idea: you have to test a model, confirm the correctness of the model
relative to the implementation, and then all will be well.  It sounds a
little like a proof of correctness approach -- and the reader is thus
forewarned.  Indeed, it is; and, forewarned -- and fore-armed? -- you can
wade through the rest of the paper with ease.  The bottom line: good
thinking and care with your mappings leads to a comprehensive test
approach.  Good luck in making it work!

                         o   o   o   o   o   o

                            End Part 2 of 3
                 To Be Continued in May 1995 TTN/Online

========================================================================

           AUTOMATED TOOL SUPPORT FOR ANSI/IEEE STD. 829-1983
                      SOFTWARE TEST DOCUMENTATION
                       by Harry M. Sneed, Germany

                             (Part 2 of 3)

(Editor's note:  This article is the second in a series of three appear-
ing in the TTN/Online Edition.)

Introduction
^^^^^^^^^^^^

The ANSI/IEEE Standard for Software Test Documentation calls for the
production of a series of documents which verify that the testing pro-
cess has been carried out properly and that the test objectives have
been met. Without automated tool support the costs of such test documen-
tation are prohibitive in all but the most trivial projects.

This paper describes a test system which provides such a service. It
begins with a test plan frame as a master class, from which the class
test design is then derived.  From it, various test procedure classes
are generated, which in turn generate the individual objects -- test
cases specified in the form of pre- and post-condition assertions to be
executed in test suites.
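
To make this concrete, here is a minimal sketch in C of what one such
generated test case might look like.  The withdraw() function and its
values are our own hypothetical illustration -- they are not taken from
Sneed's system:

        #include <assert.h>

        /* Hypothetical function under test: withdraw an amount (in
           cents) from a balance, refusing to exceed a credit limit. */
        long withdraw(long balance, long amount, long credit_limit)
        {
            if (balance - amount < -credit_limit)
                return balance;        /* refused: balance unchanged */
            return balance - amount;
        }

        /* One test case, expressed as pre- and post-condition
           assertions of the kind the generated test suites execute. */
        void test_withdrawal_with_plus_balance(void)
        {
            long balance = 10000, amount = 3000, limit = 5000;

            assert(amount > 0);                  /* pre-condition  */
            balance = withdraw(balance, amount, limit);
            assert(balance == 7000);             /* post-condition */
        }

        int main(void)
        {
            test_withdrawal_with_plus_balance();
            return 0;
        }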

ANSI/IEEE Standard 829 "Software Test Documentation" calls for the pro-
duction of a set of documents to ensure the quality of software testing
(1). The ISO-9000 Standard refers to the ANSI Standards as a basis for
test documentation (2). Any organization seeking certification by the
American Software Engineering Institute (SEI) must provide a minimum
subset of the documents specified (3). It has now become obvious that
the test documents required by Standard 829 will be a prerequisite to
almost any certification process for software producers.

Reuse in Test Documentation
^^^^^^^^^^^^^^^^^^^^^^^^^^^

The task addressed here is how to produce the test documents required by
the IEEE Standard in a cost effective manner. Costs are driven by manual
effort. Therefore, anything which reduces manual effort is useful.
Reducing manual effort can be accomplished in documentation in two ways:

o  by reusing existing documents and
o  by automating the documentation as a byproduct of testing.

Reusing existing documents can be done in the case of the

o  test plan,
o  test design,
o  test incident report and
o  test summary report

In each case a standard frame or parametrized sample is set up which
includes formatted English text with variable words and phrases (4).

       3.2.4 Features to be Tested
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
       The following features or functions are to be
       tested:

       Functional Features

         1) **** (F-FEATURE) *********************
         2) **** (F-FEATURE) *********************
         3) **** (F-FEATURE) *********************

       Nonfunctional Features

         1) **** (NF-FEATURE) ********************
         2) **** (NF-FEATURE) ********************

       3.2.5  Features not to be Tested
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

       The following features or functions are not to
       be tested:
       ..............................................
       ..............................................

The standard test plan frame is an outline with constant English phrases
interspersed with variable-length strings of asterisks marking the
variable texts which have to be submitted.  Embedded in the asterisk
strings are the names of the variables to be filled in by the user.
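
As an illustration only -- the actual mechanism is a syntax-driven
editor, not shown in code form in the paper -- a frame variable of this
kind could be located and replaced roughly as follows.  The function
name fill_variable() and the lack of bounds checking are our own
simplifications:

        #include <stdio.h>
        #include <string.h>

        /* Hypothetical sketch: replace one frame variable such as
           "***(LANG)****" with user-supplied text. */
        void fill_variable(char *line, const char *name,
                           const char *value)
        {
            char marker[64];
            char *start, *end;

            sprintf(marker, "(%s)", name);
            start = strstr(line, marker);
            if (start == NULL)
                return;               /* variable not on this line */

            end = start + strlen(marker);
            while (start > line && start[-1] == '*')
                start--;              /* absorb leading asterisks  */
            while (*end == '*')
                end++;                /* absorb trailing asterisks */

            /* splice the replacement text into the line */
            memmove(start + strlen(value), end, strlen(end) + 1);
            memcpy(start, value, strlen(value));
        }

        int main(void)
        {
            char line[128] = "--> Instrumenter for ***(LANG)****";

            fill_variable(line, "LANG", "COBOL");
            puts(line);   /* prints: --> Instrumenter for COBOL */
            return 0;
        }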

The user who wants to create a test plan copies the test plan frame into
his library and uses a syntax-driven editor to edit it.  Positioning the
cursor on a variable text brings up a prompt window giving the user
sample inputs.  In the case of functional features these could be:

               SAMPLE FUNCTIONAL FEATURES
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
           --> Account opening
           --> Deposit with minus Balance
           --> Deposit with 0 Balance
           --> Deposit with plus Balance
           --> Withdrawal with minus Balance
           --> Withdrawal with plus Balance
           --> Withdrawal which exceeds Credit Limit

In the case of nonfunctional features the prompt window might appear as
follows:

                SAMPLE NON-FUNCTIONAL FEATURES
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            --> Response time  < 2 seconds
            --> Program Conformity Rate > 90 %
            --> C1 Branch Coverage Rate > 90 %
            --> Data Output Conformity Rate > 99.9%
            --> Error Rate < 0.009

The company quality assurance group or test support center is responsi-
ble for creating the sample prompt members and loading them into the
prompt library. The editor recognizes the prompt variable name in the
asterisk string and displays the appropriate member in the prompt win-
dow. The user may select one of the items in the sample list, using cut
and paste to move it to the text frame, or he may be inspired by the
samples to find a similar feature, item, requirement, etc.

The prompt texts may of course themselves contain variable phrases
marked by asterisk strings which refer to a further list of samples, as
shown below:

               ENVIRONMENTAL NEEDS TEST TOOLS
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
           --> Instrumenter for    ***(LANG)****
           --> File Generator for  ***(DB-SYSTEM)***
           --> Test Frame for      ***(SYSTEM)****
           --> File Auditor for    ***(DB-SYSTEM)***
           --> Program Auditor for ***(LANG)****

In this way the prompting samples can be nested in one another to allow
the user to go from the general to the specific in enhancing the origi-
nal document frame. At any time the document can be generated with what-
ever level of detail the user has reached by that time.
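
Again as a hypothetical sketch, this nesting can be realized by re-
scanning the text after each substitution until no variable markers
remain.  The fragment below links against the fill_variable() routine
sketched earlier; lookup() is our own stand-in for a prompt-library
query:

        #include <stdio.h>
        #include <string.h>

        void fill_variable(char *line, const char *name,
                           const char *value);   /* sketched above */

        /* Stand-in for a prompt-library query.  Note the value for
           DB-TOOL itself contains a nested (LANG) variable. */
        static const char *lookup(const char *name)
        {
            if (strcmp(name, "LANG") == 0)
                return "COBOL";
            if (strcmp(name, "DB-TOOL") == 0)
                return "File Auditor for ***(LANG)***";
            return "";
        }

        /* Expand nested frame variables: re-scan until a full pass
           makes no substitution. */
        void expand_nested(char *line, const char *names[], int n)
        {
            char marker[64];
            int i, changed = 1;

            while (changed) {
                changed = 0;
                for (i = 0; i < n; i++) {
                    sprintf(marker, "(%s)", names[i]);
                    if (strstr(line, marker) != NULL) {
                        fill_variable(line, names[i],
                                      lookup(names[i]));
                        changed = 1;
                    }
                }
            }
        }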

This method of test document production is similar to the NETRON pro-
gram frame method introduced by Prof. Bassett in Toronto (5).  Used
correctly, it can reduce the document production effort by at least 50%
while at the same time producing better quality documents in terms of
uniformity, consistency, completeness and reproducibility.

                         o   o   o   o   o   o

                             End of Part 2
                 To Be Continued in May 1995 TTN/Online

========================================================================

                WHY EVALUATIONS ARE CONSIDERED ESSENTIAL

Software testing is ALWAYS a difficult process -- one for which the
positive payoffs are very great, but for which the negative payoffs are
very painful.

Some software sellers would have you believe that their product, based
on a demo that they deliver to you, is the ``end all, be all, do all,
final solution'' to all of your needs.  Very often, this is what is
pitched at an on-site demo, normally attended by some very high-powered,
well-practiced sales people.

The reality of software test technology is that perhaps 1% of the appli-
cations fit the norm of process success that can be illustrated with a
prepared sales demo.  The other 99% require some kind of local
`testware' programming in order to succeed.

One can understand why companies would avoid evaluations: they KNOW that
their products are too limited in scope, and/or they don't have the
technical staff needed to support you anyway.

This may be due to the fact that software testing is an immature market,
and thus is subject to a lot of sales hype...

An example of how confusing things are is how some vendors toss terms
around...  Terms that every vendor uses differently...  To their own
advantage...

The bottom line is: try it before you buy it.  The only software testing
solution that works for you is the one that you really SEE working for
you...

EDITOR'S NOTE:  This article is based on a lengthier Applications Note
that is available on request from SR.  Send e-mail to "info@soft.com"
and ask for the AN entitled "Why Product Evaluations are Essential".

========================================================================

                       APPLICATION NOTE AVAILABLE

Editor's note: In the February 1995 issue of the TTN/Online Edition, an
article appeared called "True Time and Widget Modes Both Necessary!".
A more detailed version is now available in either emailable or hard
copy form from Software Research.  You can request it by emailing us at
``info@soft.com''; ask for the Application Note of the same title.

========================================================================

                             KRAZY KONTEST

Krazy Kontest is a technical challenge for testers! Each Krazy Kontest
focuses on a specific technical question or situation, and invites
responses.  While serious answers are expected they are not required.
What's Krazy is that we don't necessarily know if there *IS* a correct
answer to each Krazy Kontest situation.

We promise in each Krazy Kontest, scheduled to run a couple of months,
to read and analyze everyone's response, to summarize all responses,
and to include the summary (including the best, most quotable quotes!)
in a future issue.

Be sure to identify your answer with the correct Krazy Kontest Serial
Number.  E-mail your responses to: ``ttn@soft.com'' making sure that the
Subject: line says ``Krazy Kontest''.

                          Krazy Kontest No. 1

Suppose that your C program contains the function "foo_1":

        foo_1(a,b,c,d,e)
        int a, b;
        float c, d;
        long e;
        {
        /* This is the expression to test... */
        e = (long) ((a + 1) / (exp(b, c) - d));
        }

where exp(..) is a function that returns a float, and the rest of the
program takes care of setting a, b, c and d to values you select.

What sets of initial values of a, b, c and d are the "best" test values
to use to make sure this expression is thoroughly tested?  Why?
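
For readers who want to experiment before answering, here is one way to
set up a probe.  The exp_stub() behavior and the candidate values below
are our own arbitrary choices -- not the official answer (remember, we
don't promise there is one!) -- and foo_1 is restructured to return its
result rather than assign to the by-value parameter e:

        #include <stdio.h>

        /* Stand-in for the two-argument exp(..) the puzzle assumes;
           any function returning a float will do for driving the
           expression. */
        float exp_stub(int b, float c)
        {
            return (float)b * c;      /* arbitrary behavior */
        }

        long foo_1(int a, int b, float c, float d)
        {
            /* the expression under test, from the passage above */
            return (long)((a + 1) / (exp_stub(b, c) - d));
        }

        int main(void)
        {
            /* Candidate probes.  Note that any (b, c, d) making
               exp_stub(b, c) equal d divides by zero -- one hazard
               the "best" test values ought to expose. */
            printf("%ld\n", foo_1(0, 1, 1.0f, 0.5f));   /* small a */
            printf("%ld\n", foo_1(-1, 1, 1.0f, 0.5f));  /* zero numerator */
            printf("%ld\n", foo_1(5, 2, 1.5f, 2.9f));   /* tiny divisor */
            return 0;
        }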

========================================================================
---------------------->>>  CALENDAR OF EVENTS  <<<----------------------
========================================================================

The following is a partial list of upcoming events of interest.  ("o"
indicates Software Research will participate in these events.)

   +  April 24 - 26:
      IEEE Int'l Computer Performance and Dependability Symposium
      (IPDS'95)
      Erlangen, Germany
      Contact: Ravishankar K. Iyer
      tel: 217-333-9732
      fax: 217-244-5686
      email: iyer@crhc.uiuc.edu

   +  April 24 - 28: International Conference on Software Engineering
      (ICSE'95)
      Westin Hotel, Seattle, Washington, USA
      Contact: Dr. Dewayne Perry
      tel: 908-582-2529
      fax: 908-582-7550
      email: dep@research.att.com

   +  May 22 - 24:
      2nd Int'l Workshop on Automated and Algorithmic Debugging
      (AADEBUG '95)
      St Malo, France
      Contact: Mireille Ducasse
      fax: 33-99-28-64-58
      email: ducasse@irisa.fr

   +  May 22 - 25: Software Engineering Process Group Conference
      Boston, MA
      Contact: Rhonda Green
      tel: 412-268-6467
      fax: 412-268-5758
      email: rrg@sei.cmu.edu

   o  May 30 - June 2: Eighth International Software Quality Week (QW95)
      Sheraton Palace Hotel, San Francisco, CA, USA
      Contact: Rita Bral
      tel: [+1] (415) 550-3020
      fax: [+1] (415) 550-3030
      email: qw@soft.com


========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is forwarded on the 15th of each month to sub-
scribers via InterNet.  To have your event listed in an upcoming issue,
please e-mail a description of your event or Call for Papers or Partici-
pation to "ttn@soft.com".  The TTN On-Line submittal policy is as fol-
lows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted items should not exceed 68 lines (one page).
o  Publication of submitted items is determined by Software Research,
   Inc., and may be edited as necessary.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To request a FREE subscription or submit articles, please send E-mail to
"ttn@soft.com".  For subscriptions, please use the keywords "Request-
TTN" or "subscribe" in the Subject line of your E-mail header.  To have
your name added to the subscription list for the quarterly hard-copy
version of the TTN -- which contains additional information beyond the
monthly electronic version -- include your name, company, and postal
address.

To cancel your subscription, include the phrase "unsubscribe" or
"UNrequest-TTN" in the Subject line.

Note:  To order back copies of the TTN On-Line (August 1993 onward),
please specify the month and year when E-mailing requests to
"ttn@soft.com".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                          901 Minnesota Street
                      San Francisco, CA 94107 USA

                         Phone: (415) 550-3020
                       Toll Free: (800) 942-SOFT
                          FAX: (415) 550-3030
                          E-mail: ttn@soft.com

                               ## End ##