                    [ Software Research (SR) logo ]

         +===================================================+
         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======             August 1994             =======+
         +===================================================+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-mailed
monthly to support the Software Research, Inc. (SR) user community and
to provide information of general use to the worldwide software
testing community.

(c) Copyright 1994 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.

TRADEMARKS:  Software TestWorks, STW, STW/Regression, STW/Coverage,
STW/Advisor, X11 Virtual Display System, X11virtual and the SR logo are
trademarks of Software Research, Inc.  All other systems are either
trademarks or registered trademarks of their respective companies.

========================================================================

INSIDE THIS ISSUE:

   o  Featured Conference: ISSRE '94
   o  Preliminary Call for Papers:  AQuIS '96
   o  STW Product Seminars Scheduled
   o  X11virtual Now Part of CAPBAK/X
   o  Regression Testing: A Basic Overview
   o  Some Recommended Reading
   o  Call for Papers: 8th INTERNATIONAL SOFTWARE QUALITY WEEK
   o  Call for Papers: CASE '95
   o  Call for Participation: UniForum 1995
   o  Call for Participation: OOPSLA '94
   o  A Very Interesting Technical Question
   o  Upcoming Event: Dynamic Testing of Ada Programs
                      in the LIS environment
   o  Calendar of Events
   o  TTN Submittal Policy
   o  TTN Subscription Information



========================================================================

              F E A T U R E D   C O N F E R E N C E . . .

   Fifth International Symposium on Software Reliability Engineering
                               ISSRE '94

The Fifth International Symposium on Software Reliability Engineering
(ISSRE'94) will be held at the Doubletree Hotel, Monterey, California,
USA, November 6-9, 1994.  ISSRE is a major symposium in the emerging
field of software reliability engineering.

The conference theme of ISSRE'94 is "Achieving Reliable Software Through
Testing, Verification and Validation."  The technical sessions include
Formal Methods, Safety, Modeling, Measurement, Testing, Industry
Reports, Tools, as well as several panel discussions.  ISSRE'94 also
features tutorials and tool fairs.

The tutorials include: A Guide to ISSRE'94; Software Reliability
Engineering Practice; Software Reliability Techniques and Tools;
Orthogonal Defect Classification; Methodology for Software Quality
Metrics; and Formal Techniques for Safety-Critical Software
Development.

For the advance program and registration material, contact the IEEE
Computer Society, 1730 Massachusetts Ave., N.W., Washington, DC
20036-1992, Phone: (202)371-1013, Fax: (202)728-0884.  For hotel
reservations, contact the Doubletree Hotel, 2 Portola Plaza, Monterey,
CA 93940, USA, Phone: (408)649-4511, Fax: (408)649-4115.

========================================================================

                 Preliminary CALL FOR PAPERS: AQuIS'96

    Third International Conference on Achieving Quality in Software
                    Florence, January 24 - 26, 1996

General Chair: Giacomo Bucci - University of Florence - Italy

Program Chair: Sandro Bologna - ENEA - Italy

Program Co-Chairs: Motoei Azuma - Waseda University - Japan
                   Edward Miller - Software Research - USA

The objective of the AQuIS conference series is to provide a platform
for technology and knowledge transfer between academia, industry, and
research institutions in the software quality field.

The final program will include presentations of research papers and
experience reports, invited talks, and state-of-the-art reports in the
field of software quality.  In addition, AQuIS'96 will try for the
first time to address specifically the topics of knowledge-based
system (KBS) quality and the impact of object-oriented technology on
quality.  The development of KBS technology has created a strong
interest in applying this technology to critical applications.  There
is a growing interest in the problem of assuring KBS quality, and some
of the good ideas from conventional software quality could be
transferred to KBS quality.  Object-oriented technology has pervaded
the entire field of software engineering, changing the way in which
programs are specified and developed.  While much research has been
performed on object-oriented programming, little has been said about
its impact on quality.

Submissions are invited in, among others, the following areas:

   -- Why so little progress in software quality engineering?
   -- Why so little progress in quantitative evaluation of quality?
   -- Industrial interest in software quality.
   -- Conventional software systems verification, validation, testing.
   -- Impact of object-oriented programming on software quality.
   -- Quality of object-oriented systems.
   -- Design for quality.
   -- Quality management.
   -- Quality standards.
   -- Tools for managing quality.

Four copies (in English) of original work, limited to 12 pages (5000
words), must reach the Conference Secretariat before April 3rd, 1995.
The best papers may be selected for publication in a special issue of
*The Journal of Systems and Software* (Elsevier Science Inc., New
York).

One-page abstracts are due January 2nd, 1995.

Contact: Rossella Cortesi, CESVIT, Palazzina Lorenese, Viale Strozzi 1,
50129 Firenze, Italy, Tel: +39-55-485333, Fax: +39-55-485345, e-mail:
AQUIS96@AGUIRRE.ING.UNIFI.IT

========================================================================

                     STW PRODUCT SEMINARS SCHEDULED

Anxious to learn about state-of-the-art solutions for software testing?

SR's next set of free product seminars will be held at locations around
the USA, as follows:

   19 September 1994: Valley Forge, PA
   23 September 1994: Tysons Corners, VA

   14 October 1994: San Diego, CA
   19 October 1994: Seattle, WA
   20 October 1994: Denver, CO
   21 October 1994: Santa Clara, CA

YOU WILL GAIN . . .

   o  An understanding of how to evaluate testing tools.
   o  Knowledge of the major testing strategies and how to implement
      them.
   o  A realistic approach to automated testing to improve quality and
      reduce cycle times.
   o  Confidence that yours can be the highest quality software
      available.

All seminars start at 8:30 AM, run until 12:00 noon, and include a
half-hour break at 10:00 AM.  Hotel locations will be included in the
registration package.  For complete information contact SR at
"info@soft.com".

========================================================================

                    X11virtual Now Part of CAPBAK/X

X11virtual, having been distributed to hundreds of sites worldwide as
the result of our ``Virtually Free Software'' promotion, has now been
bundled with the CAPBAK/X product set.

The X11 Virtual Display System/tm (X11virtual/tm) is an exciting new
technology from Software Research (SR) that permits you to run GUI
applications and scripts in the background, just as if each one were
connected to a separate screen.

This technology was developed for load testing from a single worksta-
tion.  X11virtual can be used with our testing tools, or any other ven-
dors' testing tools. It can also be used by anyone wishing to put a
screen intensive application into the background in order to free the
screen for other uses. X11virtual is based on X11R5 and runs on the fol-
lowing platforms: SPARC X11, Silicon Graphics under IRIX 5.x, RS/6000
under AIX 3.2.x, and HP-9000/7XX under HP/UX 9.x.

========================================================================

                 REGRESSION TESTING:  A BASIC OVERVIEW

(NOTE:  This short piece on test terminology was prompted by recent
discussions in "comp.software.testing" about the meaning of the term
"regression testing". -ed.)

The notion of regression testing has been around for some time:  you
make sure, by independent means, that a change you've just made to the
system you're working on doesn't introduce NEW behaviors.

Prior to the early 80's this work was mostly done manually.  You have
seen it, or done it, or heard of someone who still does this:  a
tester or team of testers spends hours or days or weeks putting a
program through its paces.  They try to think of every combination of
every operation, and every possible way to run the program, and try to
gauge the results just by "eyeballing" them.  (Believe it or not, some
early software testers used VCR's to record what testers did so they
could analyze the results.)

For the most part, this process works. If an operation fails, it is usu-
ally very obvious that it failed.  But if a code change meant a very
slight alteration in behavior, a tester may not catch it, particularly
if it is not visually evident.  In addition, many testers are familiar
enough with computers and the software in question to know instinctively
what *not* to do.  But in the field, you can't always trust your users
to know that.  Software testers often say: if there is a way to make the
product crash, the users will surely find it!

Right after the introduction of capture-playback tools, testing confer-
ences in the mid-80's set the stage for discussions of automating
regression testing.  Capture-playback tools let a test engineer capture
his work and play it back later.  Automating the testing process helped
sort out the unpredicted changes that so often take place: does the
overall application still do what it was designed to do?

Typically the capture/playback engine is used on a test-by-test basis.
It is used to test individual parts; those individual tests can then
be organized into groups, based on your understanding of the
application, and then, once you are satisfied with the groups of
tests, they can be put together to test the entire product or suite.
Automating the process meant that testers could fire off a test or
series of tests and then do other things: analyze previous results,
take a coffee break, etc.  It also meant that, rather than writing a
script to isolate one module, and possibly missing how that module
interacted with others, the module could be tested individually and
*also* tested as part of the whole.
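
The test-by-test, then group, then full-suite organization described
above can be sketched schematically.  The test functions and group
names below are hypothetical stand-ins for recorded capture/playback
sessions, not part of any SR product:

```python
# Hypothetical stand-ins for individually recorded capture/playback
# tests; each returns True (pass) or False (fail).
def test_open_file():
    return True

def test_save_file():
    return True

def test_print_page():
    return True

# Individual tests organized into groups, based on an understanding
# of the application under test.
GROUPS = {
    "file_menu":  [test_open_file, test_save_file],
    "print_menu": [test_print_page],
}

def run_group(name):
    """Run every test in one group; the group passes only if all do."""
    return all(test() for test in GROUPS[name])

def run_suite():
    """Fire off every group unattended and collect the results,
    leaving the tester free to do other things in the meantime."""
    return {name: run_group(name) for name in GROUPS}

print(run_suite())
```

Each test still exercises its own piece, but running the whole suite
also exercises the interactions between pieces, which is the point of
the group-and-suite structure.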

Changing even two lines of code can have far-reaching effects; you may
change a function call, thinking of the one or two obvious places
where it is used, but what about the third or fourth instance?
Automating the testing of the entire product increases your chances of
finding errors in those less-obvious places.

With the use of differencing as part of the testing process, it is also
possible to look for intentional changes:  you want diff to *fail* in
cases where the original baseline was wrong (there was a defect, or you
want to change the behavior for other reasons).
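
The baseline-and-difference mechanism can be sketched in a few lines
of Python.  Everything here (the toy system under test, the file
naming, the directory layout) is hypothetical, meant only to
illustrate the idea:

```python
import difflib
from pathlib import Path

def system_under_test(input_text):
    # Hypothetical application: in real regression testing this would
    # be a capture/playback run of the actual product.
    return input_text.upper() + "\n"

def check_against_baseline(test_name, input_text, baseline_dir):
    """Compare the current output with the saved baseline.

    Returns True on a match (or on a first run, which records the
    baseline); returns False and prints a diff when the output has
    changed -- the condition that flags either a regression or an
    intentional change that calls for re-baselining.
    """
    baseline_file = Path(baseline_dir) / (test_name + ".base")
    actual = system_under_test(input_text)
    if not baseline_file.exists():
        baseline_file.write_text(actual)   # first run: record baseline
        return True
    expected = baseline_file.read_text()
    if actual == expected:
        return True
    # Show exactly what changed, in the style of 'diff'.
    for line in difflib.unified_diff(
            expected.splitlines(), actual.splitlines(),
            fromfile="baseline", tofile="actual", lineterm=""):
        print(line)
    return False
```

When the baseline itself was wrong, the failing difference is the
desired outcome: you inspect the diff, confirm the new behavior is
correct, and record a fresh baseline.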

So while capture/playback facilities can be useful during development
to check small pieces of code, they are of greatest value when used
with an automated test manager to test the entire product.

========================================================================

                      SOME RECOMMENDED READING...

Recently, Brian Marick posted a very useful FAQ to the InterNet
(message id marick.776312866@hal.cs.uiuc.edu): a testing-tools
supplier list.

The list includes tools for test design, implementation, evaluation, and
static analysis, as well as some miscellaneous tools.  Several of the
tools mentioned are from Software Research's Software TestWorks (STW).

                           *       *       *

In the June 1994 issue of CrossTalk, the Journal of Defense Software
Engineering, there is an article called "Testing for the Future",
written by members of the Software Research staff.  It covers some of
the reasons for specification-based testing, and the benefits (as well
as the costs) involved.

You will find the article useful and informative.


========================================================================

            INTERNATIONAL SOFTWARE QUALITY WEEK `95 (QW '95)

             Conference Theme: The Client-Server Revolution
            San Francisco, California  30 May - 2 June 1995

QW `95 is the eighth in a continuing series of conferences focusing on
advances in software test technology, quality control, risk
management, software safety, and test automation.  Software analysis
methodologies, supported by advanced automated software test methods,
promise major advances in system quality and reliability, assuring
continued competitiveness.

The QW `95 program consists of four days of mini-tutorials, panels,
technical papers and workshops that focus on software test automation
and new technology.  QW `95 provides the Software Testing and QA/QC
community with:

   Quality Assurance and Test involvement in the development process.
   Exchange of information among technologists.
   State-of-the-art information on software test methods.
   Analysis of effectiveness through case studies.
   Vendor technical presentations.
   A two-day vendor show.


We are soliciting 45- and 90-minute presentations or participation in
a panel discussion on any area of testing and automation, including:
New and Novel Test Methods, Automated Inspection, CASE/CAST
Technology, Client-Server Computing, Cost/Schedule Estimation, and
many other topics (call Software Research for a more complete list).

SUBMISSION INFORMATION: Abstracts should be 2 - 4 pages long, with
enough detail to give reviewers an understanding of the final paper,
including a rough outline of its contents. Indicate if the most likely
audience is technical, managerial or application-oriented.

In addition, please include:
   + A cover page with the paper title, complete mailing and e-mail
     address(es), and telephone and FAX number(s) of each author.
   + A list of keywords describing the paper.
   + A brief biographical sketch of each author.

Send abstracts to: Ms. Rita Bral, Software Research Institute, 625 Third
Street, San Francisco, CA 94107 USA.  For information on the confer-
ence, E-mail your request to qw@soft.com, phone SR/Institute at (415)
550-3020, or FAX SR/Institute at (415) 550-3030.

========================================================================

                        CALL FOR PAPERS CASE '95
                   SEVENTH INTERNATIONAL WORKSHOP ON
                  COMPUTER-AIDED SOFTWARE ENGINEERING

                        Toronto, Ontario, Canada
                            July 9-14, 1995

Sponsored by: International Workshop on CASE, Inc.; the IEEE Computer
Society's Technical Committee on Software Engineering; and The
Institute of Electrical and Electronics Engineers, Inc.

            Evolutionary Engineering of Systems and Software

Software and systems development, management, and evolution will
undergo dramatic change in the last half of the 1990s. The growing
reliance on client/server architectures, the growth of
industrial-strength object-oriented techniques, advances in systems
engineering automation, the emergence of powerful repository
capabilities, multimedia opportunities, and other factors are already
having a profound impact on the way systems are designed and
implemented. CASE methods, techniques, tools, and environments are
evolving to meet these many new needs. Indeed, the CASE of 1995 is far
different from the CASE of the late 1980s. Now that CASE
is past its initial marketplace boom and bust cycle, attention is turn-
ing to the value of CASE technology and the CASE approach for the long
term in systems and software engineering.

CASE '95 is the premier conference and working meeting of the CASE
field. It is the crossroads of practitioners and researchers from indus-
try, academia, and government. Workshops, refereed papers, presenta-
tions, tutorials, and working meetings explore and report on the most
important practical, applied, experimental, and theoretical work
currently conducted in the CASE field for systems and software evolu-
tion.

For more information, contact one of the Program Co-Chairs:

Elliot Chikofsky, DMR Group, 404 Wyman Street, Suite 450, Waltham, MA
02154, USA. Voice: 617-487-9026 Fax: 617-487-5752 E-mail:
e.chikofsky@computer.org

Hausi A. Muller, Department of Computer Science, University of Victoria,
Engineering Office Wing (EOW 337), P.O. Box 3055, Victoria, B.C. V8W 3P6
Canada.  Voice: 604-721-7630 Fax: 604-721-7292 E-mail: hausi@csr.uvic.ca

========================================================================

                             UniForum 1995
                         Call for Participation

UniForum is the world-class leader in conferences and trade shows for
the UNIX and Open Systems environment.  The 1994 UniForum Conference
and Exposition drew 40,000 users and leaders.  UniForum 1995 will be
the premier Open Systems event of the year.

You are cordially invited to participate in this prestigious event by
submitting your session and speaker proposals for the UniForum 1995
Conference.

The UniForum 1995 Conference currently consists of, but is not limited
to, the following tracks:

        *       PCs in an Open Systems Environment
        *       Mainframes in an Open Systems Environment
        *       Internet/Information Highway
        *       Mission-Critical Open Systems
        *       Large, High-Volume Solutions
        *       Futures in Open Systems
        *       Industry Trends & Issues


For a complete list of topics, the proposal format, and a
questionnaire, please contact Deborah Bonnin, Conference and Seminar
Manager, UniForum Association, 2901 Tasman Drive, Suite 205, Santa
Clara, CA  95054-1100.

========================================================================

                 A VERY INTERESTING TECHNICAL QUESTION!

(NOTE: Here is an interesting query posted to "comp.software.testing"
recently.  We found it thought-provoking, to say the least.  Responses
can go to "ttn@soft.com" or to Dr. Berlejung directly.)

   From: ae59@iamk4525.mathematik.uni-karlsruhe.de (Heiner Berlejung)
   Newsgroups: comp.software-eng,comp.software.testing
   Subject: RFI: How to audit numerics in SRS
   Date: 4 Aug 94 09:41:12 GMT
   Organization: University of Karlsruhe, Germany

   Hello world,

   I wonder how numerical algorithms and programs are assessed by audi-
   tors or persons who are responsible for quality of software in the
   application of safety-related systems (SRS), i.e. military, power
   industries, traffic, medicine a.s.o..

   NOTE : x:= 1e20; y := x;
             ok1 := x+1-y = 1; { false }
             ok2 := x-y+1 = 1; { true for IEEE 754-85 double}

   (1) Assessment of numerical programs/results (algorithms a.s.o.)

   (2) I define numerical programs as programs (or part of software sys-
   tems) which contain low or/and high mathematical objects usually
   using floating-point data formats in order to achieve high perfor-
   mance calculation.  Have you any other definition for numerical pro-
   grams ?

   (3) There is a standard proposal 65A(Secretariat)123/122 that defines
   some quality requirements. Has this standard any significance ?  Are
   other standards suitable ?

   (4) Are there strategies to avoid floating-point numbers ?  (i.e.
   using rational arithmetic in lisp ?)

   (5) Are there any test strategies prescribed?

   (6) Are there any attempts of verification/validation?

   If there are enough answers, I will post a summary.

   Thanks in advance.

---------------------------------------------------------------------------
Heiner Berlejung                           Institute of Applied Mathematics
Email: ae59@rz.uni-karlsruhe.de            University of Karlsruhe
Phone:+49 721 377934 / Fax:+49 721 385979  P.O. Box 6980,D-76128 Karlsruhe
---------------------------------------------------------------------------
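
Dr. Berlejung's NOTE is easy to reproduce in any language whose
floating-point type is an IEEE 754-1985 double.  A quick sketch in
Python (whose built-in float is exactly that):

```python
x = 1e20
y = x

# One unit in the last place of 1e20 is 2**14 = 16384, so adding 1
# to x is absorbed by round-to-nearest before the subtraction:
ok1 = (x + 1 - y == 1)   # (x + 1) rounds back to 1e20, so ok1 is False

# Subtracting first gives exactly 0, and 0 + 1 == 1:
ok2 = (x - y + 1 == 1)   # ok2 is True

print(ok1, ok2)   # False True
```

The two expressions are algebraically identical but numerically
different, which is precisely why auditing numerical code in
safety-related systems requires more than inspection of the formulas.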

========================================================================

         DYNAMIC TESTING OF ADA PROGRAMS IN THE LIS ENVIRONMENT

    ISTITUTO ELABORAZIONE INFORMAZIONE - CENTRO NAZIONALE RICERCHE
                              Pisa, Italy

                             Presented by:
                 Ing. Mario Fusani & Ing. Carlo Carlesi

An experimental seminar session will be conducted in which some
features of the newly created LIS (Software Engineering Laboratory)
will be shown.  LIS is a network-based interactive environment
(located in Pisa and Naples) that enables remote users to learn about,
train with, and use contemporary software engineering tools.  During
the session, a set of (commercially available) tools will be used to
instrument and test sample Ada programs.  Relevant aspects of the
automatic testing of Ada programs will also be presented.

For details, including dates and local information, please contact:
Ing. Mario Fusani (fusani@vm.iei.pi.cnr.IT)

========================================================================
---------------------->>>  CALENDAR OF EVENTS  <<<----------------------
========================================================================

The following is a partial list of upcoming events of interest.  ("o"
indicates Software Research will participate in these events.)

   +  August 17-19: International Symposium on Software Testing and
      Analysis (ISSTA)
      Seattle, Washington
      Contact: Thomas Ostrand
      Siemens Corporate Research
      755 College Road East
      Princeton, NJ  08540
      Phone: [+1] 609-734-6569

   +  August 22-25: 1994 Software Engineering Symposium
      Pittsburgh, PA
      Contact: Wendy A. Rossi
      Software Engineering Institute
      Carnegie Mellon University
      Pittsburgh, PA  15213-3890
      Phone: [+1] 412-268-7388 (registration)
             [+1] 412-268-5800 (customer relations)

   o  September 21-22: Software World/USA
      Washington, D.C.
      Contact: Mark Longo
      Phone: [+1] 508-470-3870

   +  October 3-5: ASQC 4th International Conference on Software Quality
      Washington, D.C.
      Contact: Karen Snow
      Exhibit Coordinator
      PO Box 463
      Tiburon, CA  94920
      Phone: [+1] 415-388-1963

   o  October 4-6: UNIX Expo
      New York, NY
      Contact: Sheila Lillien or Claire Behrle
      Phone: [+1] 800-829-3976 (USA) or [+1] 201-346-1400

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line edition is forwarded on the 15th of each month to
subscribers via InterNet.  To have your event listed in an upcoming
issue, please E-mail a description of your event or Call for Papers or
Participation to "ttn@soft.com".

The TTN On-Line submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted items should not exceed 68 lines (one page).
o  Publication of submitted items is determined by Software Research,
   Inc., and may be edited as necessary.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To request a FREE subscription or submit articles, please E-mail
"ttn@soft.com".  For subscriptions, please use the keywords
"Request-TTN" or "subscribe" in the Subject line of your E-mail
header.  To have your name added to the subscription list for the
quarterly hard-copy version of the TTN -- which contains additional
information beyond the monthly electronic version -- include your
name, company, and postal address.

To cancel your subscription, include the phrase "unsubscribe" or
"UNrequest-TTN" in the Subject line.

Note:  To order back copies of the TTN On-Line (August 1993 onward),
please specify the month and year when E-mailing requests to
"ttn@soft.com".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                            901 Minnesota Street
                      San Francisco, CA 94107 USA

                         Phone: (415) 550-3020
                       Toll Free: (800) 942-SOFT
                          FAX: (415) 550-3030
                          E-mail: ttn@soft.com

                               ## End ##