sss ssss      rrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr


         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======            January 1995             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-Mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the world software testing commun-
ity.

(c) Copyright 1995 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.

TRADEMARKS:  Software TestWorks, STW, STW/Regression, STW/Coverage,
STW/Advisor, X11 Virtual Display System, X11virtual and the SR logo are
trademarks of Software Research, Inc.  All other systems are either
trademarks or registered trademarks of their respective companies.

========================================================================

INSIDE THIS ISSUE:

   o  Call for Papers: Pacific Northwest Software Quality Conference
      (PNSQC '95)
   o  STW Product Seminars Scheduled
   o  What We Fail To Do In Our Current Testing Culture, by Tom Gilb
   o  Electronic STW Brochure Available
   o  A Not-So-Serious Press Release
   o  Testing Multi-Media Systems?
   o  Calendar of Events
   o  Correction
   o  TTN Submittal Policy
   o  TTN Subscription Information

========================================================================

             PACIFIC NORTHWEST SOFTWARE QUALITY CONFERENCE
                            CALL FOR PAPERS

PAPERS are solicited on all aspects of software quality. Abstracts
should be 2-4 pages long, with enough detail to give reviewers an under-
standing of the final paper, including a rough outline of its contents.
Indicate if the most likely audience is technical or managerial.
Relevance of the topic to software quality should be especially clear.

Submit 14 copies by February 15 to:  Peter Martin, Program Co-Chair
                                     Taligent, Inc.
                                     10201 N. De Anza Blvd.
                                     Cupertino, CA 95014

Questions:                           peter_martin@taligent.com
                              or     Margie Davis 503/294-4200
                                     mrd@plaza.ds.adp.com

========================================================================

                     STW PRODUCT SEMINARS SCHEDULED

         Come Learn the Most Efficient, Cost-Effective Ways to
                     Deliver High Quality Software.

If you're committed to producing the highest quality software achiev-
able, and ready to learn about state-of-the-art solutions for software
testing, mark your calendar for the Software Quality Through Test Auto-
mation seminars.  Open to software developers, QA managers, MIS direc-
tors, and software test engineers, this FREE seminar teaches you the
most efficient, cost-effective ways to clean up code and get your pro-
duct to market quickly.

Draw From the Knowledge of Industry Leaders.

The seminar will be presented by industry experts who will discuss each
step in the testing process.  They will cover techniques and tools for:

   -  Automating the test process.
   -  Establishing criteria which differentiate the characteristics of
      each testing tool.
   -  Developing test solutions for client/server environments, embedded
      environments, load generation and test management.
   -  Creating a strategy for cost/benefit analysis of automating your
      software testing process.

SR's next set of free product seminars will be held at locations around
the USA, as follows:

January 23-27:

      23 January 1995:        Orlando, FL
      24 January 1995:        Atlanta, GA
      25 January 1995:        Detroit, MI
      26 January 1995:        Chicago, IL
      27 January 1995:        Dallas, TX

February 21-24:

      21 February 1995:       Portland, OR
      22 February 1995:       San Francisco, CA
      23 February 1995 (AM):  El Segundo, CA
      23 February 1995 (PM):  Irvine, CA
      24 February 1995:       Phoenix, AZ

February 28 - March 3:

      28 February 1995:       Boston, MA
      1 March 1995:           Saddlebrook, NJ
      2 March 1995:           Baltimore, MD
      3 March 1995:           Tyson's Corner, VA

March 9:

      9 March 1995:           Santa Clara, CA


Software Developers, Engineering & QA Managers, and Software Test
Engineers Will Acquire...

   -  An understanding of how to evaluate testing tools.
   -  Insight into how to determine when enough testing has been done.
   -  Knowledge of the major testing strategies and how to implement
      them.
   -  A realistic approach to automated testing to improve quality and
      reduce cycle times.
   -  Confidence that yours can be the highest quality software
      available.
   -  A White Paper on implementation techniques.

Registration for all seminars (except the PM seminar in Irvine, CA)
starts at 7:30 AM.  Seminars start at 8:30 AM sharp and run until
12:00 noon, with a half-hour break at 9:45 AM.  (Registration for the
PM seminar in Irvine starts at 1:30 PM, with no lunch; that seminar
runs from 2:00 PM until 5:30 PM with a 30-minute break at 3:15 PM.)

Hotel locations will be included in the registration package.

AGENDA
7:30-8:30       Registration and Complimentary Continental Breakfast
8:30-9:15       The Range of Automated Testing Tools:
                        features and benefits
9:15-9:45       Using tools in specific environments
9:45-10:15      Break
10:15-10:45     Strategy for Cost/Benefit Analysis
10:45-12:00     Live Demonstration
                SR's latest offering of solutions for:
                - Regression testing, including test management,
                  capture/playback and load generation
                - Coverage analysis and Static analysis

Limited Space; Register Today!

With all of this and much more at your disposal - at no charge - there's
every reason to attend our seminar.  Come learn how to control the soft-
ware quality process, and start producing the highest quality software
possible.  Call, or send your registration through e-mail today!  For
complete information contact SR at 415-550-3020, or e-mail us at
"info@soft.com".

========================================================================

           WHAT WE FAIL TO DO IN OUR CURRENT TESTING CULTURE

                              by Tom Gilb

                         What we fail to do...

Here are some of the things I see that we fail to do in practice, which
reduce both test effectiveness and quality of software.

1. We totally fail to systematically improve our testing processes,
for example by using Deming's Statistical Process Control or IBM's
Defect Prevention Process.

2. We fail to measure and qualify our test planning inputs, such as
requirements and design, by using formal inspection to determine
whether the defect level of those inputs is 30 Major defects per 300
non-commentary words (the norm if not measured and improved), or 3, or
0.3 (which is closer to what it should be before we plan).
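
As a purely illustrative sketch of the arithmetic (the document sizes
and defect counts below are invented, not Gilb's data), the "per 300
non-commentary words" normalization works like this:

```python
# Illustrative only: normalizing a Major-defect count to the
# "per 300 non-commentary words" unit used above.  Counts are invented.

def defects_per_300_words(major_defects, non_commentary_words):
    """Scale a raw defect count to defects per 300 words."""
    return major_defects * 300 / non_commentary_words

# Typical unmeasured, unimproved planning document:
print(defects_per_300_words(150, 1500))   # 30.0 -- the unmeasured norm
# A well-controlled document, closer to what it should be:
print(defects_per_300_words(1.5, 1500))   # 0.3
```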

3. We fail to sample our test planning inputs, using inspection, to
determine if they meet our requirements for defect-freeness.

4. We fail to continuously improve our test planning rules with improve-
ments from the grass roots planning work.

5. We fail to quantify our most critical software product quality attri-
butes (portability, usability, availability, maintainability, adaptabil-
ity for example) and to test that they are as required and planned.

6. We fail to demand quantified experience data from our testing gurus
regarding the advice they give. They rarely tell us of the multiple
quality impacts of their recommended methods or the resources required.
This means that we are unable to compare and select appropriate test
strategies for our purposes. This is our own fault, since we do not have
the courage to demand this information from our test gurus and confer-
ence speakers. Are we allowing the blind to lead the blind?

7. We still continue with our 'bug per function point' fixation when
our real customers and systems demand a more realistic approach to
software quality, such as mean time to failure, mean time to repair and
availability measures. We also fail to consider testing the entire
spectrum of quality attributes, retaining instead a narrow fixation on
reliability.

8. We continue with our 'waterfall model' fixation when we should be
looking more closely at evolutionary testing strategies, which give
early and frequent feedback on the total effect of system design and
work process selection on the total objectives of our project. We are
too late with too little.

                         What we need to do...

Here are some of the things we need to do to improve our current soft-
ware testing culture.

1. QUANTIFIED PRODUCT REQUIREMENTS. Define our software project
products in terms of multiple quantified, testable objectives: things
like portability, maintainability and usability.

2. QUANTIFIED TEST OBJECTIVES. Define our test objectives in
quantified, measurable, trackable terms: things such as test
effectiveness, coverage (we are good at the theory of this, but too
many do not deal with the concept in practice), mean time to failure,
cost to find and fix defects, remaining defect density or frequency of
occurrence, and test work-hours per regression test.

3. MULTIPLE MEASURES AND FACTS ABOUT TEST TECHNOLOGY. Learn to
evaluate any test strategy or technology in books, articles,
conferences and consultant advice in quantified, experiential terms.
What does it give us in terms of meeting our quantified multiple
objectives (above), and at what resource cost (people, time, money)?

4. DON'T TOLERATE DOGMATIC EXPERTS WITHOUT FACTS. Learn to tell "test
experts" who cannot or will not cite any benefit or cost experiences of
the methods they profess, not to bother us until they have some evidence
that they know what they are doing.

5. PROFILE SAMPLE TESTING. Learn to test using user-profile
representative samples rather than exhaustive bug finding.
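
One way to read "profile sample testing" is sampling test cases in
proportion to observed usage.  A hypothetical sketch (the operations
and profile figures are invented):

```python
# Hypothetical sketch: draw test cases weighted by an operational
# profile (how often users exercise each operation), rather than
# attempting exhaustive coverage.  Profile figures are invented.
import random

profile = {"open_file": 0.50, "save_file": 0.30,
           "print": 0.15, "export": 0.05}

def sample_tests(profile, n, seed=1995):
    """Pick n operations to test, weighted by usage frequency."""
    rng = random.Random(seed)
    ops = list(profile)
    weights = [profile[op] for op in ops]
    return rng.choices(ops, weights=weights, k=n)

batch = sample_tests(profile, 20)
print(len(batch))   # 20 operations, common ones dominating the sample
```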

6. RAPID CYCLE TESTING PROCESS CONTROL. Learn to test in rapid cycles
(2% of project each) of customer-useful, or at least field-trialable,
increments of functionality and/or quality improvement. Use the
feedback to control quality levels and the corresponding test
strategies and software work processes (such as formal inspections and
the Defect Prevention Process).

7. ROBUST SOFTWARE ENGINEERING. Learn to build robust software using
redundant techniques and diagnostic code (like distinct software or
N-version programming) capable of detecting bugs and automating testing
and regression testing. This is useful both for a more advanced testing
process and for operational control over remaining defects.
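
A toy sketch of the N-version idea: run independently written versions
of the same function, majority-vote the results, and flag disagreement
as a detected fault.  The versions and the seeded bug are invented:

```python
# Hypothetical sketch of N-version programming: three independently
# written "versions" of squaring, one with a deliberately seeded bug.
from collections import Counter

def version_a(x): return x * x
def version_b(x): return x ** 2
def version_c(x): return x * x if x >= 0 else -(x * x)  # seeded bug

def n_version_vote(x, versions):
    """Return (majority_answer, disagreement_detected)."""
    results = [v(x) for v in versions]
    answer, votes = Counter(results).most_common(1)[0]
    return answer, votes < len(results)

print(n_version_vote(3, [version_a, version_b, version_c]))   # (9, False)
print(n_version_vote(-3, [version_a, version_b, version_c]))  # (9, True)
```

Disagreement at x = -3 localizes the bug without a pre-written oracle,
which is what makes such redundancy useful for automated testing.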

8. AVOID BAD INSPECTION APPLICATION.  Learn to do software Inspections
properly. Most institutions seem to screw up the use of inspections on a
large scale for a long time. They do not understand many fundamental
things like optimum checking rates, entry and exit control using meas-
ures of incoming and exiting document quality, and the proper use of
standards, checklists and source documentation.

9. INSPECT TEST AND ITS INPUTS. These inspections need to be done on all
forms of test planning and test case generation, as well as all inputs
to the test planning process (which include practically all software
documents including code, requirements, design, interface specification
and user documentation). Test planning is as weak as the poor quality
documents we use to do it, and without proper inspections they seem to
contain dozens of major defects per page! We don't even realize that
they are so polluted since we do not measure well enough to know.

10. CONTINUOUS IMPROVEMENT OF TEST. We need to learn proper continuous
improvement of test methods using Deming's statistical process control
or software variants such as IBM's Defect Prevention Process.

11. TOTAL BALANCE IN QUALITY INVESTMENT. We need to look at the software
quality problem from the total systems perspective so as to find the
right balance of investment in defect prevention, early defect detection
using inspection, various levels of testing, and engineering robust
systems, in order to meet both our short-term objectives (initial
hand-over to users) and long-term objectives (portability, adaptability
and maintainability).
The testers of the world are not organizationally positioned to solve
this problem. It needs to be solved at a quality director level or
higher. We test far too much, and get far too little quality for it,
because we don't have this balance.

I realize that I have not taken the space to fully explain or justify my
assertions here. The intent is to provoke a refreshing discussion in
the test area. I'd sure love the chance to explain my position
in greater detail with facts and figures when we meet at the next
conference!

EDITOR'S NOTE:  This article was kindly contributed by Tom Gilb for use
in TTN.  He is a world-renowned lecturer and author specializing in
advanced methods for software engineering.  You can reach Tom Gilb at
Result Planning Ltd., 56 Fitzwilliam Square, Dublin-2 IRELAND, +47-280-
1697.  E-Mail: TomGilb@eworld.com

========================================================================

                   ELECTRONIC STW BROCHURE AVAILABLE

For readers interested in the latest details on Software TestWorks
(STW(tm)), send email to "info@soft.com" and ask to be sent a copy of
the new STW online brochure.  It is an e-mailable PostScript file, based
on our typeset company brochure.  When uudecoded on the receiving end
and printed out, it provides a look at the Software Research suite of
testing tools and the benefits of using STW.  It also contains a
complete list of Software Research's international distributors and
information on how to reach them.

The file is just under 407KB and can be sent in one piece or in five
parts, each less than 100KB.  When inquiring, please state your
preference.
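
For the curious, the splitting is simple arithmetic; a hypothetical
sketch (the stand-in data and part size below are invented, not SR's
actual procedure):

```python
# Hypothetical sketch: chopping a ~407KB file into e-mailable parts,
# each under 100KB.  The stand-in data and part size are invented.

def split_bytes(data, part_size=99_000):
    """Chop a byte string into chunks no larger than part_size."""
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

brochure = b"%" * 407_000      # stand-in for the real PostScript file
parts = split_bytes(brochure)
print(len(parts))                              # 5
print(all(len(p) < 100_000 for p in parts))    # True
```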

========================================================================

                           ARE BUGS FOR REAL?

   IMPORTANT NOTE: THE BELOW-GIVEN PRESS RELEASE MAY NOT BE SERIOUS!

New York, NJ, Dec. 8 -- People for the Ethical Treatment of Software
(PETS) announced today that seven more software companies have been
added to the group's "watch list" of companies that regularly practice
software testing.

"There is no need for software to be mistreated in this way so that com-
panies like these can market new products," said Ken Granola, spokesper-
son for PETS.  "Alternative methods of testing these products are avail-
able."

According to PETS, these companies force software to undergo lengthy
and arduous tests, often without rest for hours or days at a time.
Employees are assigned to "break" the software by any means necessary,
and inside sources report that they often joke about "torturing" the
software.

"It's no joke," said Granola. "Innocent programs, from the day they are
compiled, are cooped up in tiny rooms and dirty, ill-maintained comput-
ers, and are unceremoniously deleted when they're not needed anymore."

Granola said the software is kept in unsanitary conditions and is
infested with bugs.

"We know alternatives to this horror exist," he said, citing industry
giant Microsoft Corp. as a company that has become extremely successful
without resorting to software testing.

PETS is a nonprofit organization dedicated to improving the lives of
software programs and promoting alternatives to software testing.

========================================================================

                  ARE YOU TESTING MULTI-MEDIA SYSTEMS?

Is someone out there interested in testing Multi-Media (MM) systems?  To
us, testing an MM system appears to be a multi-dimensional task,
potentially involving such things as capture/playback technology, voice
recognition, speech synthesis, image capture and analysis, and all of
the very-complex related disciplines.

Right now, the MM systems one can get are not so complex that a great
deal of systematic, automated testing is needed.  As the complexity of
MM grows, however, the need for automation will increase.

If you are working in this area, or if you *think* you'll be working in
this area, please contact "miller@soft.com" for details of our thinking.

-EFM

========================================================================
---------------------->>>  CALENDAR OF EVENTS  <<<----------------------
========================================================================

The following is a partial list of upcoming events of interest.  ("o"
indicates Software Research will participate in these events.)

   o  23-27 January:  FREE STW Product Seminars
   o  21-24 February:  FREE STW Product Seminars
   o  28 February - 3 March:  FREE STW Product Seminars
   o  9 March:  FREE STW Product Seminar
      (various US cities)
      contact: Software Research, Inc.
      tel: 415-550-3020
           800-942-SOFT (USA only)
      fax: 415-550-3030

   +  14-17 Feb: Client/Server
      Conference & Exhibition
      San Jose Convention Center
      San Jose, CA
      contact: Peter Brunold
      tel: 800-808-3976
      fax: 800-858-0412, 516-733-6753
      email: MJEAVONS@CMP.COM (Attn: Peter Brunold)

   o  14-16 Feb: Software Development '95
      Moscone Convention Center
      San Francisco, CA
      tel: 800-441-8826
           415-905-2784
      fax: 415-905-2222

   o  February 22-25: Uniforum 1995
      Dallas Convention Center
      Dallas, TX
      Contact: [+1] 800-545-EXPO

   +  March 1-2: Software World USA
      Chicago, IL
      Contact: Loretta Taylor
      tel: 508-470-3870

   +  March 7-10: Achieving Quality Software
      Sheraton Premiere
      Tyson's Corner, VA
      contact: Sherry Paquin
      email: sap0215@sperry.mhs.compuserve.com
      tel: 804-974-2078
      fax: 804-974-2480

========================================================================

                               CORRECTION

Last month, we published a list of our international distributors with
one error:  the E-mail address for PVI Precision Software B.V. in the
Netherlands (our Benelux distributor) is ``100334.315@compuserve.com''.
Please make a note of the corrected address; sorry for the
inconvenience.

-Editor.
========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is forwarded on the 15th of each month to sub-
scribers via InterNet.  To have your event listed in an upcoming issue,
please e-mail a description of your event or Call for Papers or Partici-
pation to "ttn@soft.com".

The TTN On-Line submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted items should not exceed 68 lines (one page).
o  Publication of submitted items is determined by Software Research,
   Inc., and may be edited as necessary.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To request a FREE subscription or submit articles, please E-mail
"ttn@soft.com".  For subscriptions, please use the keywords "Request-
TTN" or "subscribe" in the Subject line of your E-mail header.  To have
your name added to the subscription list for the quarterly hard-copy
version of the TTN -- which contains additional information beyond the
monthly electronic version -- include your name, company, and postal
address.

To cancel your subscription, include the phrase "unsubscribe" or
"UNrequest-TTN" in the Subject line.

Note:  To order back copies of the TTN On-Line (August 1993 onward),
please specify the month and year when E-mailing requests to
"ttn@soft.com".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                            901 Minnesota Street
                      San Francisco, CA 94107 USA

                         Phone: (415) 550-3020
                       Toll Free: (800) 942-SOFT
                          FAX: (415) 550-3030
                          E-mail: ttn@soft.com

                               ## End ##