sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======            December 1996            =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the worldwide software testing
community.

(c) Copyright 1996 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition
provided that the entire document/file is kept intact and this copyright
notice appears with it.

========================================================================

INSIDE THIS ISSUE:


   o  National Software Council, Press Release, December 1996.

   o  Call for Papers: Third International Workshop on Automated
      Debugging, Linköping, Sweden, May 1997.

   o  Emphasizing Software Test Process Improvement, by Gregory T. Daich
      (Part 2 of 2)

   o  Has ISO 9000 Contributed to Quality?  Announcement of an
      Electronic Conference.

   o  CALL FOR PARTICIPATION -- 10th International Software Quality Week
      (QW'97)

   o  Twas the Night Before Implementation, a Holiday Saga.

   o  TTN SUBSCRIPTION INFORMATION


========================================================================

                       NATIONAL SOFTWARE COUNCIL

                         Press Release: 12/3/96

The National Software Council is an independent organization designed to
keep the US software industry strong and to help increase the industry's
contribution to national wellbeing.  The organization was formed in 1993
and has held workshops on licensing software professionals and US
software competitiveness.

At a meeting of its board of directors on November 14, 1996, the
National Software Council set its agenda for 1997 to focus on these
areas:  the value added of software; trustworthy software systems; and
software education.

The NSC is in a unique position to span the interests of industry, the
public, universities, the software profession and government.  Thinking
globally, sharing nationally and acting locally is the way the NSC
approaches these issues.

Lawrence Bernstein, a technologist with Price Waterhouse and a retired
Bell Laboratories executive, was elected NSC President.  Other officers
include:  Alan Salisbury, Executive Vice President of Learning Tree
International, who was elected NSC Executive Vice President; Rick
Linger, Visiting Scientist of the Software Engineering Institute, who
was elected NSC 1st Vice President; Andrew Chruscicki, of the Air
Force Rome Laboratory, who was elected NSC 2nd Vice President; Elliot
Chikofsky, of TRECOM, who was elected Treasurer; and Walter Ellis, an
expert in Software Process & Metrics, who was elected Secretary.

To find out about NSC activities, please consult the NSC WWW pages at:

                         http://www.CNSoftware.org

========================================================================

                            Call for Papers
          Third International Workshop on Automated Debugging
                    Linköping, Sweden, May 26-28, 1997

Over the past decade, automated debugging has seen major achievements.
However, as debugging is by necessity tied to particular programming
paradigms, the results are scattered. The aims of the workshop are to
gather common themes and solutions across programming communities, and
to cross-fertilize ideas. Original research as well as practical
experience reports are welcome.

Typical topics of the workshop include (but are not limited to):

   * automated debugging,
   * declarative debugging,
   * knowledge-based debugging,
   * algorithmic debugging,
   * assertion-based debugging,
   * software testing,
   * program slicing,
   * monitoring,
   * performance debugging,
   * trace analysis,
   * parallel and distributed debugging, and
   * debugging by simulation

To encourage discussion and exchange of ideas, the workshop will be
limited to at most 60 people. Contributors should be aware that the
prospective audience will not necessarily be familiar with the addressed
programming paradigms, which should, therefore, be briefly introduced.
Accepted papers will be included in the workshop proceedings. In 1995
the authors of the best papers were invited to submit journal versions
of their papers to a special issue of the Journal of Automated Software
Engineering on automated debugging. If the contributions to AADEBUG'97
are of sufficiently high quality, we will try to make similar
arrangements this time.

Submission of Papers

Papers should be 2000 to 5000 words in length, including a 150 to 200
word abstract. Submit papers by sending a Ghostview-readable PostScript
file containing the paper to aadebug97@ida.liu.se. Alternatively, send
five copies of the paper by postal mail to the program chair;
concurrently send an e-mail to aadebug97@ida.liu.se containing the title
of the paper, names of the authors, full address of the correspondent
and a 150 to 200 word abstract of the paper.

Important Dates

   * Papers should be submitted before December 15, 1996
   * Notification of acceptance by February 15, 1997
   * Final version of paper before April 15, 1997

Program Chair

Mariam Kamkar
Department of Computer and Information Science
Linköping University
S-581 83 Linköping, Sweden
Phone: +46 13 281949
Fax: +46 13 284499
Email: marka@ida.liu.se

Program Committee Members (Preliminary)

Michel Bergère, University of Orléans, France
Bernd Bruegge, CMU, USA
Wlodzimierz Drabent, Polish Academy of Sciences, Poland
Mireille Ducassé, IRISA/INSA, France
Peter Fritzson, Linköping University, Sweden
Keith Brian Gallagher, Loyola College, USA
Claude Jard, IRISA/CNRS, France
Thomas Johnsson, Chalmers University of Technology, Sweden
Lewis Johnson, USC ISI, USA
Mariam Kamkar, Linköping University, Sweden
Bogdan Korel, Illinois Institute of Technology, USA
Jukka Paakki, University of Jyväskylä, Finland
Boris Magnusson, Lund University, Sweden
Nahid Shahmehri, Linköping University, Sweden
Mary Lou Soffa, University of Pittsburgh, USA
Elaine Weyuker, AT&T Research Labs, N.J., USA
Roland Wismüller, Technische Universität München, Germany

Registration and Local Arrangements Chair

David Byers
Department of Computer and Information Science
Linköping University
S-581 83 Linköping, Sweden
Phone: +46 13 284496
Fax: +46 13 284499
Email: davby@ida.liu.se

For further information see the WWW pages at:

        http://www.ida.liu.se/activities/conferences/AADEBUG/

The workshop is sponsored by Linköping University.

========================================================================

      Emphasizing Software Test Process Improvement (Part 2 of 2)

                           Gregory T. Daich

                   Software Technology Support Center

REQUIREMENTS-BASED TESTING

Have you experienced that uneasy feeling of not knowing how thorough
the testing on your software projects really is? Has the program logic
been adequately tested? Will the system crash if the user enters an
incorrect response? Would you like tools that give insight into the
level of functional coverage you are actually achieving during testing?
Have you met your software requirements, and will the customer be
pleased with the product? These questions plague many organizations. I
am always bothered by the common assumption that anyone who knows how
to program must also know how to test. University professors often do
not spend adequate time educating students in effective testing
practices, yet there is a substantial body of knowledge on effective
testing practices that deserves far more attention in our universities.
Software testing has matured as a career path in the last 10 to 15
years and requires training and experience to become proficient. Many
testers perform "inspired" testing and become relatively effective
without ever considering why their techniques find bugs, beyond the
fact that experience shows they do. However, inspired testing is
usually inadequate. Though often effective at finding some defects,
inspired testing must be accompanied by a systematic approach to
achieve the required levels of functional coverage. Furthermore,
inspired testing is not repeatable.

The road map outlines an approach for adopting effective functional
testing techniques that can achieve the required levels of coverage.
The bottom line of this effort is the ability to investigate the most
likely and most costly types of defects on a risk-priority basis,
increasing our confidence that the system is ready for use.

CODE-BASED TESTING

The CMM advocates appropriate standards for developing and testing
software.  One of the most widely used standards for testing software
is Institute of Electrical and Electronics Engineers (IEEE) Standard
1008-1987, Software Unit Testing. Organizations that profess to follow
this standard must perform some level of code coverage analysis to
assure that every statement is executed. Companies are being sued (and
are losing) for software development malpractice when they profess to
exercise responsible software engineering practices and cannot prove it
[10].

Although the technology to perform code coverage analysis has been with
us for many years, few organizations have adopted practices to measure
code coverage [11]. A lack of management commitment (have we heard this
before?), rather than a lack of available technology, has proved the
most significant obstacle to implementing code coverage practices.
Measuring branch coverage (determining whether the true and false sides
of each decision statement have been executed) has been identified as
one of the most effective software testing practices for improving the
entire testing effort [12].
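
To make the branch coverage idea concrete, here is a minimal sketch in
Python of the bookkeeping that a coverage analyzer automates: recording
whether the true and the false side of each decision has executed.  A
real tool instruments the code for you; the hand-written probe() calls
and decision labels below are assumptions made purely for illustration.

# Hand-rolled branch-coverage bookkeeping (a real tool automates this).
taken = {}  # decision label -> set of outcomes observed ("T", "F")

def probe(decision, outcome):
    """Record which side of a decision executed, then pass it through."""
    taken.setdefault(decision, set()).add("T" if outcome else "F")
    return outcome

def classify(n):
    if probe("D1", n < 0):   # decision D1: is n negative?
        return "negative"
    if probe("D2", n == 0):  # decision D2: is n zero?
        return "zero"
    return "positive"

def report(decisions):
    sides = 2 * len(decisions)
    hit = sum(len(taken.get(d, set())) for d in decisions)
    print(f"branch coverage: {hit} of {sides} sides")
    for d in decisions:
        for side in sorted({"T", "F"} - taken.get(d, set())):
            print(f"  {d}: the {side} side was never executed")

classify(5)    # D1 false, D2 false
classify(-3)   # D1 true; D2 is never reached
report(["D1", "D2"])  # 3 of 4 sides; D2's true side needs classify(0)

The report pinpoints the untested side of each decision, which is the
kind of insight that makes branch coverage measurement so effective.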

Our road map outlines an approach to adopt appropriate levels of code
coverage practices at the unit, integration, and system testing levels.
This technology, like other testing technologies, is tool intensive
and requires automation support to achieve the desired goals
effectively.
Note that the road map addresses effective tool evaluation, selection,
and adoption activities.

REGRESSION TESTING

For those new and legacy systems that do have a suite of regression
tests, there still remain the difficult questions of how much testing
should be performed following a change and what level of functional or
code coverage our regression tests actually achieve. We must also
attack these problems from the design and code development side by
making components as independent as possible. Building independent
modules and code segments can reduce the amount of regression testing
that follows changes to software systems to a more reasonable level.

Languages such as Ada have significantly reduced the level of
regression testing required for some legacy systems that have been
reengineered in Ada.  However, parameter and message passing can still
cause downstream problems in routines that use modified data. Ripple
analysis, though difficult, can increase our confidence when, for
certain types of changes, we take the risk of not performing all
regression tests prior to delivery.
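
A sketch of how component independence and ripple analysis can bound
regression effort: if we know, even conservatively, which modules each
test exercises and which modules depend on which, a change can be
mapped to the subset of tests worth rerunning.  The module names, test
names, and dependency table below are invented for illustration; real
ripple analysis must also trace parameter and message flows.

# Illustrative regression test selection; all names are hypothetical.
# deps maps each module to the modules it uses, so a change ripples
# upward to everything that depends on the changed module.
deps = {
    "billing": {"db", "rates"},
    "reports": {"db"},
    "ui":      {"billing", "reports"},
    "db":      set(),
    "rates":   set(),
}

# tests_touch maps each regression test to the modules it exercises.
tests_touch = {
    "test_invoice_total":  {"billing"},
    "test_monthly_report": {"reports"},
    "test_ui_smoke":       {"ui"},
    "test_rate_lookup":    {"rates"},
}

def affected_modules(changed):
    """Changed modules plus everything that transitively depends on them."""
    affected = set(changed)
    grew = True
    while grew:
        grew = False
        for module, uses in deps.items():
            if module not in affected and uses & affected:
                affected.add(module)
                grew = True
    return affected

def select_tests(changed):
    hit = affected_modules(changed)
    return sorted(t for t, mods in tests_touch.items() if mods & hit)

# A change to "rates" ripples into "billing" and "ui", so the report
# test can safely be skipped.
print(select_tests({"rates"}))
# -> ['test_invoice_total', 'test_rate_lookup', 'test_ui_smoke']

The more independent the modules, the smaller the affected set, and the
smaller the regression suite that must be rerun after each change.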

Our road map addresses both the testing side and the development side
of performing adequate regression testing.

SPECIAL TESTING REQUIREMENTS

Although the road map does not currently address special testing
activities, e.g., testing object-oriented or client-server systems, we
can point readers to knowledgeable sources of information.

However, we recommend that organizations first consider improving their
testing disciplines outlined in the road map before spending a lot of
time addressing special, and often complicated, testing issues. Again,
we advocate a proper foundation prior to adopting specific practices
that require prerequisite practices for effective implementation. More
important, there are many common issues between traditional software
development projects and special testing requirements of object-oriented
or client-server systems. Note that highly trained specialists may be
required to meet the unique testing needs of some projects, e.g., a
significant amount of training and experience is required to test the
network aspects of client-server systems. Furthermore, some testing
technologies are relatively immature but are evolving, as evidenced,
for example, by the recent publication of the first book dedicated to
testing object-oriented systems [13].

                    Tables of Improvement Activities

The details for each stage of Figure 2 for adopting improved testing
practices are provided in Tables 1 and 2. If you recall, we likened our
test technology adoption guide to a travel guide for visiting a country.
Both the travel guide and the road map are required for a successful
venture. Although every organization will have unique needs, the tables
provide the essentials of a detailed itinerary for a successful journey
toward software test process improvement.

Table 1 lists common activities to improve all testing disciplines.
(When visiting a country, there usually are common sites that we may
all want to visit--capital, urban, and rural sites, culture-related
performances, etc.) Table 2 augments Table 1 with the specific
activities to address the unique issues of each discipline. (Obviously,
there will be unique sites that we simply must see to more fully
appreciate a particular country and culture.)

       ---------------------------------------------------------
       Table 1: Common Itinerary to Improve Testing Disciplines.
       ---------------------------------------------------------

* Review organization's goals in Strategic and Tactical Plans and
Mission and Vision Statements.

* Review risk management practices.

* Review document quality control practices.

* Determine resource assets and constraints (hardware, software,
personnel, budget, etc.).

* Inspect relevant software project documentation.

* Coordinate with other process improvement initiatives.

* Document stimulus for improvement.

* Plan next phase.

* Obtain or renew sponsorship.

Examine Practices

* Receive appropriate training or orientation to examine practices in
depth.

* Assess test process improvement readiness.

* Assess test automation readiness.

* Assess test techniques and methods.

* Evaluate product quality.

* Define quantitative quality goals.

* Develop Technology Evaluation Plan.

* Present plan to sponsor.

Evaluate Alternatives

* Receive appropriate training or orientation to evaluate alternatives.

* Identify candidate improvement technologies.

* Evaluate candidates.

* Select technology enhancing solution(s).

* Develop Test Practices Improvement Plan.

* Present plan to sponsor.

Enable Practices

* Receive appropriate technical and change management training or
orientation to enable improvements.

* Pilot improvement to test practices.

* Measure quality and productivity changes.

* Develop or update training programs to support improvements.

* Roll out improvements to entire organization.

Evolve Technologies

* Receive appropriate training or orientation to conduct technology
evolution.

* Document lessons learned from prior activities.

* Update organization's goals, Strategic and Tactical Plans, and Mission
and Vision Statements.

* Conduct defect causal analysis.

* Identify potential improvement opportunities for next improvement
cycle.

     -------------------------------------------------------------
     Table 2: Specific Itineraries to Improve Testing Disciplines
     -------------------------------------------------------------

(Abbreviations: STM = software test management; RBT = requirements-
based testing; CBT = code-based testing; RT = regression testing.)

* Review adequacy of test strategies, goals, and objectives.

* Review overall adequacy of test plans and test planning practices.

* Review overall adequacy of test status reports and test tracking
practices.

* Identify potential STM concerns.

* Review adequacy of plans for RBT.

* Review adequacy of function testing coverage.

* Review adequacy of functional testing techniques.

* Identify potential RBT concerns.

* Review adequacy of plans for CBT.

* Review adequacy of code coverage.

* Review adequacy of unit and integration testing techniques.

* Identify potential CBT concerns.

* Review adequacy of plans for RT.

* Review adequacy of RT at all testing levels.

* Identify potential RT concerns.

Examine Practices

* Assess test requirements management practices.

* Assess test planning practices.

* Assess test tracking practices.

* Identify major STM concerns.

* Assess functional testing practices.

* Assess black-box testing techniques.

* Identify candidate RBT concerns.

* Assess code-based testing practices.

* Assess white-box testing techniques.

* Identify candidate CBT concerns.

* Assess RT practices at all testing levels.

* Assess test work product configuration control practices.

* Assess code structures and modules for independence.

* Identify candidate RT concerns.

Evaluate Alternatives

* Evaluate STM techniques for project application--test planning and
other candidate STM techniques.

* Evaluate relevant tools--test planning tools, test management tools,
defect-tracking tools, source code static analyzers (code measurement
tools), etc.

* Evaluate black-box testing techniques for project application--
boundary value analysis and equivalence class partitioning,
cause-effect graphing, decision tables, specification-based path
analysis, etc.

* Evaluate relevant tools--requirements-based test case generators,
coverage analyzers, test planning tools, test management tools,
capture-playback tools, etc.

* Evaluate white-box testing techniques for project application--basis
path analysis, branch path analysis, multicondition decision analysis,
etc.

* Evaluate relevant tools--coverage analyzers, test planning tools,
test management tools, capture-playback tools, etc.

* Evaluate RT techniques for project application--unit retesting,
regional impact testing, full regression testing, etc.

* Evaluate regression testing reduction techniques and relevant
tools--test planning tools, test management tools, capture-playback
tools, etc.

Evolve Technologies

* Receive in-depth technology-specific training in test requirements
management, test planning (management aspects), and tracking progress
(effort, defects, rework).

* Receive in-depth RBT technology-specific training.

* Develop test plans (functional testing aspects).

* Receive in-depth technology-specific training.

* Develop test plans (code-based testing aspects).

* Develop test plans (regression testing aspects).

Enable Improvements

* Monitor STM practices.

* Monitor RBT practices.

* Monitor CBT practices.

* Monitor RT practices.

REFERENCES

1.  Collard, Ross, "Systems Testing and Quality Assurance Seminar,"
seminar course notes, February 1996.

2.  Royer, Thomas C., "Software Test Management: Life on the Critical
Path," Prentice-Hall, 1993.

3.  Bender, Richard, "Requirements-Based Testing," seminar course notes,
Bender & Associates, 1994.

4.  Lee, Earl, "Software Inspections," seminar course notes, Loral Space
Information Systems, August 1995.

5.  Boehm, Barry, "Software Engineering Economics," Prentice-Hall, 1981.

6.  Beizer, Boris, "Software Testing Techniques," Van Nostrand
Reinhold, 1990.

7.  Gelperin, David, "The Power of Integrated Methods During the
Development of Software Components," CrossTalk, STSC, Hill Air Force
Base, Utah, July 7, 1994, pp. 20-23.

8.  Marick, Brian, "The Craft of Software Testing," Prentice-Hall, 1995.

9.  Gilb, Tom, "The Results Method," draft manuscript from author, 1996.

10.  Bender, Richard, "Software Liability, Is a Lawsuit in Your Client's
Future?" presentation to American Bar Association, 1993.

11.  Software Quality Engineering, "Software Measures and Practices
Benchmark," SingleSource, 1991.

12.  Beizer, Boris, "Testing Computer Software Conference," Panel
Discussion, June 1994.

13.  Siegel, Shel, "Object-Oriented Testing: A Hierarchical Approach,"
John Wiley and Sons, 1996.

14.  Daich, Gregory T., et al., "Software Test Technologies Report,"
Software Technology Support Center, August 1994.

15.  IEEE-1348, "Guidelines for CASE Tool Adoption," 1995.

16.  Software Engineering Institute, "IDEAL Model," 1995.

17.  IEEE-1209, "Standard for CASE Tool Evaluation and Selection," 1992.

========================================================================

                  Has ISO 9000 Contributed to Quality?

                  Announcement: Electronic Conference

    URL: http://www.mcb.co.uk/services/conferen/nov96/tqm/conhome.htm


ISO 9000 (previously BS 5750) has been promulgated as a quality standard
in the UK since 1979, but has it contributed to quality and performance
improvement in those organisations which have become registered?

The standard has been subject to two reviews and a further review is
promised in the year 2000. The purpose of this electronic conference is
to discuss the contribution ISO 9000 has made to business performance
and thus contribute to the review.

The starting point for debate is "ISO 9000 -- Three Case Studies,"
published by John Seddon (available at URL:
http://www.vanguardconsult.co.uk).  This is a critical view of what
happened to three organisations when they registered to ISO 9000;
Seddon argues that registration to ISO 9000 is a guarantee of
sub-optimising performance.

Interested parties can contact Mr. Seddon at
"john@vanguardconsult.co.uk".

========================================================================

         TENTH INTERNATIONAL SOFTWARE QUALITY WEEK 1997 (QW'97)

              Conference Theme: Quality in the Marketplace

            San Francisco, California USA -- 27-30 May 1997

QW'97 is the tenth in a continuing series of International Software
Quality Week Conferences focusing on advances in software test
technology, quality control, risk management, software safety, and test
automation.  Software analysis methodologies, supported by advanced
automated software test methods, promise major advances in system
quality and reliability, assuring continued competitiveness.

The mission of the QW'97 Conference is to increase awareness of the
importance of software quality and methods used to achieve it.  It seeks
to promote software quality by providing technological and educational
opportunities for information exchange within the software development
and testing community.

The QW'97 program consists of four days of mini-tutorials, panels,
technical papers, and workshops that focus on software test automation
and new technology.  QW'97 provides the Software Testing and QA/QC
community with:

   o  Analysis of method and process effectiveness through case studies.
   o  Two-Day Vendor Show
   o  Quick-Start, Mini-Tutorial Sessions
   o  Vendor Technical Presentations
   o  Quality Assurance and Test involvement in the development process
   o  Exchange of critical information among technologists
   o  State-of-the-art information on software test methods

QW'97 is soliciting 45- and 90-minute presentations, half-day standard
seminar/tutorial proposals, 90-minute mini-tutorial proposals, and
proposals for participation in panel and "hot topic" discussions on any
area of testing and automation, including:

      Cost / Schedule Estimation
      ISO-9000 Application and Methods
      Test Automation
      CASE/CAST Technology
      Test Data Generation
      Test Documentation Standards
      Data Flow Testing
      Load Generation and Analysis
      SEI CMM Process Assessment
      Risk Management
      Test Management Automation
      Test Planning Methods
      Test Policies and Standards
      Real-Time Software
      Real-World Experience
      Software Metrics in Test Planning
      Automated Inspection
      Reliability Studies
      Productivity and Quality Issues
      GUI Test Technology
      Function Point Testing
      New and Novel Test Methods
      Testing Multi-Threaded Code
      Integrated Environments
      Software Re-Use
      Process Assessment/Improvement
      Object Oriented Testing
      Defect Tracking / Monitoring
      Client-Server Computing

IMPORTANT DATES:

      Abstracts and Proposals Due:            15 December 1996
      Notification of Participation:          1 March 1997
      Camera Ready Materials Due:             15 April 1997

FINAL PAPER LENGTH:

      Papers should be limited to 10 - 20 pages, including Text, Slides
      and/or ViewGraphs.

SUBMISSION INFORMATION:

      Abstracts should be 2-4 pages long, with enough detail to give
      reviewers an understanding of the final paper, including a rough
      outline of its contents. Indicate if the most likely audience is
      technical, managerial or application-oriented.

      In addition, please include:
         o  A cover page with the paper title, complete mailing and
            Email address(es), and telephone and FAX number(s) of each
            author.
         o  A list of keywords describing the paper.
         o  A brief biographical sketch of each author.

      Send abstracts and proposals including complete contact
      information to:

      Ms. Rita Bral
      Quality Week '97 Director
      Software Research Institute
      901 Minnesota Street
      San Francisco, CA  94107 USA

      For complete information on the QW'97 Conference, send Email to
      qw@soft.com, phone SR Institute at +1 (415) 550-3020, or, send a
      FAX to SR/Institute at +1 (415) 550-3030.

========================================================================

                    THE NIGHT BEFORE IMPLEMENTATION
                            (Author unknown)

                  Twas the night before implementation
                       and all through the house,
                       not a system was working,
                           not even a mouse.

            The programmers hung by their tubes in despair,
              in hopes that a miracle soon would be there.

             The users were nestled all snug in their beds,
          while visions of transactions danced in their heads.

              When out of the Monitor came such a clatter,
           I sprang from my desk to see what was the matter.

              And what to my wondering eyes should appear,
            but a guru programmer (with a sixpack of beer).

               His resume glowed with experience so rare,
        and he turned out great code with a bit-pusher's flair.

            More rapid than eagles, his programs they came,
          as he whistled and shouted and called them by name:

               On Update! On Inquiry! On OOP! On Delete!
              On Sequel! On TimeOut! On Methods Complete!

          His eyes were glazed-over; fingers nimble and lean,
             from weekends and nights in front of a screen.

              A wink of his eye, and a twist of his head,
              soon gave me to know I had nothing to dread.

          He spoke not a word, but went straight to his work,
           Turning Specs into code; Then turned with a jerk;

               And laying his finger upon the "RUN" key,
             The system came up and it worked *perfectly*.

              The Updates updated; Deletes, they deleted;
             The Inquiries inquired, the Closing completed.

              He tested each whistle, and tested each bell
               with nary an edit, for all had gone well.

           The system was finished, the tests were concluded.
             The client's last changes were even included.

          And the client exclaimed with a snarl and a taunt.
           "IT'S JUST WHAT I ASKED FOR, BUT NOT WHAT I WANT!"

Thanks to Marshall D. Abrams (mabrams@capaccess.org) for forwarding
this item.

========================================================================

          TTN Online Edition -- Mailing List Policy Statement

Some subscribers have asked us to prepare a short statement outlining
our policy on use of E-mail addresses of TTN-Online subscribers.  This
issue, and several other related issues about TTN-Online, are available
in our "Mailing List Policy" statement.  For a copy, send E-mail to
ttn@soft.com and include the word "policy" in the body of the E-mail.

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is E-mailed on the 15th of each month to
subscribers worldwide.  To have your event listed in an upcoming issue,
E-mail a complete description and full details of your Call for Papers
or Call for Participation to "ttn@soft.com".

TTN On-Line's submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted non-calendar items should not exceed 350 lines
   (about four pages).  Longer articles are OK and may be serialized.
o  Length of submitted calendar items should not exceed 60 lines (one
   page).
o  Publication of submitted items is determined by Software Research,
   Inc., and items may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters and TTN-Online disclaims any responsibility for their
content.

TRADEMARKS:  STW, TestWorks, CAPBAK, SMARTS, EXDIFF, Xdemo, Xvirtual,
Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, TCAT-PATH, T-
SCOPE and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To SUBSCRIBE to TTN-ONLINE, to CANCEL a current subscription, to CHANGE
an address (a CANCEL and a SUBSCRIBE combined) or to submit or propose
an article, send E-mail to "ttn@soft.com".

TO SUBSCRIBE: Include in the body of your letter the phrase "subscribe
<your-E-mail-address>".

TO UNSUBSCRIBE: Include in the body of your letter the phrase
"unsubscribe <your-E-mail-address>".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                            901 Minnesota Street
                      San Francisco, CA  94107 USA

                   Phone:          +1 (415) 550-3020
                   Toll Free:      +1 (800) 942-SOFT (USA Only)
                   FAX:            +1 (415) 550-3030
                   E-mail:         ttn@soft.com
                   WWW URL:        http://www.soft.com


                               ## End ##