sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======            January  1997            =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the worldwide software testing
community.

(c) Copyright 1997 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition
provided that the entire document/file is kept intact and this copyright
notice appears with it.

========================================================================

INSIDE THIS ISSUE:

   o  Certified Software Quality Engineer (Exam Requirements)

   o  Quality Week 1997: Mark Your Calendars Now!

   o  Call for Participation: Metrics97

   o  Software Quality HotList Available

   o  Information Warfare Protection Is Urged by Defense Department

   o  TCAT for Java (tm) -- Beta Version Available

   o  A Testplan for Drinking

   o  TTN-Online Archive

   o  CORRECTION: Missing Figures for G. Daich's Article

   o  Three Engineers' Approach to the World

   o  Evaluating TTN-Online

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

               CERTIFIED SOFTWARE QUALITY ENGINEER (CSQE)

     Editor's Note: This E-mail appeared on a newsgroup, and Nick
     Stewart responded not only with a pointer to the CSQE website
     but also with the body of requirements.  (Makes you wonder if
     you know enough to do software quality at all?)

The "best" way to find out `all you ever wanted to know' about the CSQE
is to write or call the ASQC and request brochure B0110. Or, log on to
the Software Division Web page at

http://www.asqc.org/membinfo/divisions/softdiv/swqweb.html

I have enclosed some information to get you started: 1) exam
requirements (partial list), 2) the Body of Knowledge, and 3) a selected
bibliography (reading list).

I know that this will get reformatted in some totally repulsive way by
the time it gets to you. I have also attached the files in case you have
a MIME-compliant mail application, or you can uudecode or whatever.
(If there is an option I left out that you prefer, let me know.)

If there is anything else I can do for you, let me know.

                              Nick Stewart
                      CSQE Certification Chairman
               Senior Past Chairman - Software Division
                       nstewart@ccmail.dsccc.com

       o       o       o       o       o       o       o       o

                    REQUIREMENTS TO SIT FOR THE EXAM

Candidates must have 8 years' work experience in at least one area of
the body of knowledge for the Certified Software Quality Engineer.  A
portion of the work experience may be offset by education: a bachelor's
degree equals 4 years, a master's degree equals 5 years.  The remaining
3 years must have been in a decision-making position (engineer, manager,
etc.).  ASQC membership is not a requirement, but non-members will be
charged a non-member fee.  This fee also entitles the participant to a
year's membership if they choose.

       o       o       o       o       o       o       o       o

         CERTIFIED SOFTWARE QUALITY ENGINEER: BODY OF KNOWLEDGE

I. General Knowledge, Conduct and Ethics (24 questions)

A. Standards

1. Domestic and international standards and specifications (e.g., ISO
9000, IEEE, Human Factors and Ergonomics Society, graphical user
interface guidelines)
2. Software quality and process initiatives, ventures, and consortia
(e.g., SEI, SPICE, Bootstrap, ESPRIT)

B. Quality Philosophies and Principles

1. Benefits of software quality
2. Quality philosophies (e.g., Juran, Deming, Crosby)
3. Prevention vs. Detection philosophies
4. Software Total Quality Management principles and applications
5. Organization and process benchmarking (i.e., identifying, analyzing,
and modeling best practices)

C. Organizational and Interpersonal Techniques

1. Verbal communication and presentation
2. Written communication
3. Effective listening
4. Interviewing
5. Facilitation (e.g., team management, customer-supplier relationships)
6. Principles of team leadership and facilitation
7. Meeting management
8. Conflict resolution
9. Organization and implementation of various types of quality teams

D. Problem-Solving Tools and Processes

1. Root cause analysis
2. Tools (e.g., affinity diagram, tree diagram, matrix diagram,
interrelationship digraph, prioritization matrix, activity network
diagram)
3. Risk management (e.g., project, product, process)
4. Problem-solving processes

E. Professional Conduct and Ethics

1. ASQC Code of Ethics
2. Conflict of interest issues for a software quality engineer
3. Ethical issues involving software product licensing
4. Legal issues involving software product liability and safety (e.g.,
negligence, customer notification, recall, regulations)

II. Software Quality Management (16 questions)

A. Planning

1. Product and project software quality goals and objectives
2. Customer requirements for quality
3. Quality and customer support activities
4. Issues related to software security, safety, and hazard analysis

B. Tracking

1. Scope and objectives of quality information systems
2. Categories of quality data and their uses
3. Problem reporting and corrective action procedures (e.g., software
defects, process nonconformances)
4. Techniques for implementing information systems to track quality-
related data
5. Records and data collection, storage, maintenance, and retention

C. Organizational and Professional Software Quality Training

1. Quality training subject areas (e.g., inspection, testing,
configuration management, project management)
2. Available training resources, materials, and providers
3. Professional societies, technical associations, and organizations for
software quality engineers

III. Software Processes (24 questions)

A. Development and Maintenance Methods

1. Software development procedures
2. Life cycle or process models (e.g., waterfall, spiral, rapid
prototyping)
3. Defect prevention, detection, and removal methods
4. Requirement analysis and specification methods (e.g., data flow
diagram, entity-relationship diagram)
5. Requirements extraction methods and techniques (e.g., Quality
Function Deployment, Joint Application Development, context-free
questioning, needs analysis, focus groups)
6. Software design methods (e.g., structured analysis and design,
Jackson Design method, Warnier-Orr method, object-oriented)
7. Issues related to reuse, re-engineering, and reverse engineering
8. Maintenance processes (e.g., re-engineering, reverse engineering,
change management, retirement)

B. Process and Technology Change Management

1. Software process and technology change management theory and methods
2. Process maturity models
3. Software process assessment and evaluation techniques
4. Software process modeling (e.g., entry and exit criteria, task
definition, feedback loops)
5. Software environments (e.g., development methodologies, tools, data,
infrastructure)
6. Barriers to the implementation or success of quality improvement
efforts and quality systems

IV. Software Project Management (16 questions)

A. Planning

1. Project planning factors (e.g., quality, costs, resources,
deliverables, schedules)
2. Project planning methods and tools (e.g., work breakdown structures,
documentation, forecasting, estimation)
3. Goal-setting and deployment methodologies
4. Maintenance types (e.g., corrective, adaptive, perfective)
5. Software maintenance and adaptability program planning
6. Supplier management methodologies

B. Tracking

1. Phase transitioning control techniques (e.g., reviews and audits,
Gantt Charts, PERT, budgets)
2. Methods of collecting Cost of Quality data
3. Cost of Quality categories (e.g., prevention, appraisal, internal
failure, external failure)
4. Cost, progress, and deliverable tracking (e.g., status reports, life
cycle phase reports)

C. Implementation

1. Project management tools (e.g., planning, tracking, cost estimating,
reporting)
2. Methods of reporting Cost of Quality data
3. Trade-off involved in product release decisions (e.g., cost, quality,
schedule, customer, test sufficiency, stability)

V. Software Metrics, Measurement and Analytical Methods (24 questions)

A. Measurement Theory

1. Goal, Question, Metric paradigm for selecting metrics
2. Basic measurement theory and techniques
3. Definitions of metrics and measures
4. Designing measures
5. Psychology of metrics (e.g., how metrics affect people and how people
affect metrics)

B. Analytical Techniques

1. Issues involving data integrity, completeness, accuracy and
timeliness
2. Basic statistical concepts and graphical techniques for analysis and
presentation of software data (e.g., distributions, confidence
intervals, statistical inference)
3. Quality analysis tools (Pareto Chart, Flowcharts, Control Charts,
Check Sheets, Scatter Diagrams, Histograms)
4. Sampling theory and techniques as applied to audits, testing, and
product acceptance

C. Software Measurement

1. Prediction techniques of future maintainability
2. Applications of measurements to process, product, and resources
3. Commonly used metrics (e.g., complexity, reliability, defect
density, phase containment, size)
4. Software quality attributes (e.g., reliability, maintainability,
usability, testability)
5. Defect detection effectiveness (e.g., cost yield, escapes, customer
impact)

VI. Software Inspection, Testing, Verification and Validation (24
questions)

A. Inspection

1. Inspection types (e.g., peer reviews, inspections, walk-throughs)
2. Inspection process (e.g., objectives, criteria, techniques and
methods, participant roles)
3. Inspection data collection, reports, and summaries
4. Methods for reviewing inspection efforts (e.g., technical
accomplishments, resource utilization, future planning)

B. Testing

1. Types of tests (e.g., functional, performance, usability, stress,
regression, real-time response)
2. Test Levels (e.g., unit, integration, system, field)
3. Test strategies (e.g., top down, bottom up, automated testing, I/O
first, beta testing, black box, white box)
4. Test design (e.g., test cases, fault insertion and error handling,
equivalence class partitioning, usage scenarios, customer defect
reports)
5. Test coverage of code (e.g., branch-to-branch, path, individual
predicate, data)
6. Test coverage of specifications (e.g., functions, states, data and
time domains, localization, internationalization)
7. Test environments (e.g., tools and methodologies, test libraries,
drivers/stubs, equipment compatibility test laboratories)
8. Test documentation (e.g., test plans, logs, test designs, defect
recording, test reports)
9. Test management (e.g., scheduling, freezing, resources, dependencies,
analysis of test results)
10. Methods for reviewing testing efforts (e.g., technical
accomplishments, resource utilization, future planning, risk management)
11. Methods for testing supplier components and products
12. Methods for testing the accuracy of customer deliverables including
user documentation, marketing and training materials
13. Traceability mechanisms (e.g., system verification diagrams)

C. Verification and Validation (V & V)

1. V & V planning procedures
2. Methods for reviewing V & V program (e.g., technical accomplishments,
resource utilization, future planning, risk management, impact analysis
of proposed changes)
3. Methods for evaluating software life cycle products and processes
(e.g., physical traces, documentation, source code, plans, test and
audit results) to determine if user needs and project objectives are
satisfied
4. Methods for performing requirements traceability (e.g., requirements
to design, design to code)
5. Methods for evaluating requirements for correctness, consistency,
completeness, and testability
6. Methods for evaluating interfaces with hardware, user, operator, and
other software applications
7. Methods for evaluating test plans (e.g., system acceptance,
validation) to determine if software satisfies software and system
objectives
8. Methods for evaluating the severity of anomalies in software
operation
9. Methods for assessing all proposed modifications, enhancements, or
additions to determine the effect each change will have on the system
10. Methods for determining which V&V tasks should be iterated based
upon proposed modifications and enhancements

VII. Software Audits (16 questions)

A. Audit Types

1. Performing internal audits (e.g., quality system, product, process,
project, customer)
2. Performing external audits (e.g., supplier qualifications,
certification of supplier systems, auditing testing done by independent
agencies)
3. Functional and physical configuration audits

B. Audit Methodology

1. Purpose, objectives, frequency, and criteria of the overall audit
program and individual software audits
2. Procedures, tools, and issues related to conducting audits in
specific areas (e.g., software development, project management,
configuration management)
3. Audit steps (planning, preparation, execution, reporting, corrective
action, verification, follow-up)
4. Audit process (e.g., objectives, criteria, techniques and methods,
participant roles)

C. Audit Planning

1. Audit team member responsibilities
2. Management (auditee and auditor) responsibilities concerning audits
3. Hosting external audits
4. Audit program development and administration
5. Auditing requirements (e.g., industry and government standards)

VIII. Software Configuration Management (16 questions)

A. Planning and Configuration Identification

1. Technical and managerial factors that guide software product
partitioning into configuration items and components
2. Release process issues (e.g., supporting multiple versions, feature
vs. corrective releases, hardware and software dependencies)
3. Library control procedures
4. Configuration identification methods (e.g., schemes,
reidentification, naming conventions, versions and serialization,
baselines)
5. Configuration management tools

B. Configuration Control, Status Accounting, and Reporting

1. Documentation control (e.g., issuing, approval, storage, retrieval,
revision)
2. Patching issues (e.g., testing, traceability, source updating)
3. Trade-offs between cost, cycle time, and integrity of software
product and rigor and formality of change control
4. Source and object code control procedures
5. Software configuration/change control board processes
6. Techniques for assessing impacts of proposed software changes

========================================================================

            10th Annual International Software Quality Week
                         Sheraton Palace Hotel
                     San Francisco, California  USA

                             27-30 May 1997

Mark your calendars now!  This 10th anniversary Quality Week Conference
promises to be the strongest one yet -- paper and tutorial submissions
and proposals are 50% above last year's level.  It's sure to be a
sellout!

Details about Quality Week are available at:

        http://www.soft.com/QualWeek

You can also send E-mail to qw@soft.com for details.

========================================================================

           Fourth International Symposium on Software Metrics
                             (Metrics '97)

                      Albuquerque, New Mexico USA
                     November 5 - November 7, 1997

                 THEME: Evaluating Software Technology

                         CALL FOR PARTICIPATION

The goal of Metrics'97 is to place measurement in the larger context of
experimentation and empirical evaluation. Topics of particular interest
are those that relate to the application of quantitative methods to
problem-solving and evaluating the impact of software engineering
technology. This symposium will be co-located with the International
Symposium on Software Reliability (ISSRE '97) to facilitate joint
participation and cross-fertilization of ideas.

Submission of research papers, experience reports, and panel or tutorial
proposals is encouraged. The primary focus of any submission should be
the application of quantitative methods to evaluating software
technology, particularly software engineering processes, tools, and
paradigms. In your submission, please make clear the problem that you
are addressing with quantitative methods. For example, you may indicate
that your work tries to answer the question, "Does the use of a
particular testing process shorten the time-to-market and/or increase
software reliability?" Papers on related topics are also invited. These
areas may include (but are not limited to):

- software attributes for evaluating software reliability
- foundations of evaluation and measurement
- development of new measures
- metric validation
- metrics for object-oriented development
- measurement-based process definition
- process evaluation
- practical applications of measurement
- quality measurements
- support for quality standards
- experiences with metrics programs

Full papers should not exceed 12 single-spaced pages and should include
an abstract and list of keywords. Research papers will be evaluated for
originality, significance, soundness and clarity. Experience reports
should be 4-6 pages. Experience reports will be evaluated for
significance of lessons learned and can describe either positive or
negative results.

Panel proposals should include the title, proposed chair, panelists and
a one-page description of the subject for the panel. Panelists must have
agreed to participate prior to the submission of the panel proposal.

Tutorial proposals should include a detailed outline of the material,
indicate the length of the tutorial and describe the background and
experience of the instructor. Please note if the tutorial has been
offered elsewhere, and how the tutorial will be tailored to the audience
at Metrics'97.

GENERAL CHAIR:

     Jim Bieman
     Colorado State University
     bieman@cs.colostate.edu
     phone: 970.491.7096
     fax:   970.491.2466

PROGRAM CO-CHAIRS:

     Shari Lawrence Pfleeger
     Systems/Software, Inc.
     s.pfleeger@ieee.org
     phone:  202.244.3740

     Linda Ott
     Michigan Technological University
     linda@cs.mtu.edu
     phone: 906.487.2209
     fax:   906.487.2283


INFORMATION FOR AUTHORS:

     Abstracts               February 1, 1997
     Submission deadline     March 1, 1997
     Acceptance Notification June 15, 1997
     Camera-ready papers due August 1, 1997

Abstracts should be submitted via email by February 1, 1997 as an ASCII
text message with the words "Metrics Abstract" in the subject line to

     linda@cs.mtu.edu

Papers and proposals may be submitted either via email or regular postal
mail.

Email submissions should be sent to:

     linda@cs.mtu.edu

For email submissions of research and experience papers, submit a
PostScript file appended to the Metrics '97 submission form (to request
the form, email a message to linda@cs.mtu.edu with the words "Metrics
Form").  Panel and tutorial proposals may be emailed as an ASCII text
message.  All emailed submissions should include the words "Metrics
Submission" in the subject line of the email.

Postal submissions should include five (5) copies and be sent to:

     Linda M. Ott
     Department of Computer Science
     Michigan Technological University
     1400 Townsend Drive
     Houghton, MI 49931
     USA

SPONSORED BY (pending):

     IEEE Technical Council on Software Engineering
     IEEE-CS TCSE Committee on Quantitative Methods

     in cooperation with:
     The International Symposium on Software Reliability
     (ISSRE '97)

WEB SITE:

     http://www.cs.pdx.edu/conferences/metrics97/

PROGRAM COMMITTEE:

Victor Basili, University of Maryland
Lionel Briand, Fraunhofer Institute for Experimental Software Engineering
Ross Jeffery, University of New South Wales, Australia
Barbara Kitchenham, Keele University, UK
Anneliese von Mayrhauser, Colorado State University
Walcelio Melo, Centre pour Recherche de l'Informatique de Montreal, Canada
Paul Oman, University of Idaho
Shingo Takada, Nara Institute of Technology, Japan
June Verner, City University of Hong Kong
Marvin Zelkowitz, University of Maryland

========================================================================

               NEW SOFTWARE QUALITY HOTLIST(tm) AVAILABLE

Workers in the software quality area are invited to take advantage of
the Software Quality HotList(tm), with over 425 links to the world's
main technical resources in software quality and software testing.
Feedback has been extremely positive, and many are getting good use of
this compendium.

The URL for the Software Quality HotList is:

        http://www.soft.com/Institute/HotList

Many new links have been added in the past three weeks, largely as the
result of readers' suggestions.  Word of additional hotlinks, plus any
corrections or modifications to hotlinks already on the lists, is
welcome at "info@soft.com".

There is a particular need for more links in the "Software Quality and
Test Technology Topics" section!  All suggestions are welcome.

========================================================================

       INFORMATION WARFARE PROTECTION URGED BY DEFENSE DEPARTMENT

                           By Thomas E. Ricks
               Staff Reporter of THE WALL STREET JOURNAL

WASHINGTON -- A Defense Department panel, in an unusually strident
report, recommended $3 billion of additional spending over the next five
years to improve the security of the nation's telecommunications and
computing infrastructure. Warning of a possible "electronic Pearl
Harbor," the task force appointed by the Defense Science Board also said
the Pentagon should seek the legal authority to launch counterattacks
against computer hackers "There is a need for extraordinary action," the
board's task force on Information Warfare-Defense" stated in a report
that was quietly released on Friday.  Current practices and assumptions,
it said, "are ingredients in a recipe for a national security disaster."
The report also predicts that by the year 2005, attacks on U.S.
information systems by terrorist groups, organized criminals and foreign
espionage agencies are likely to be "widespread."

Overall, the report amounts to a manifesto for warfare in the
information age. It notes that in the agricultural age, military
campaigns sought to gain control of land, while in the industrial age,
campaigns focused on an adversary's means of production. In today's
information age, it continues, "Military campaigns will be organized to
cripple the capacity of an information-based society to carry out its
information-dependent enterprises" -- which, it said, now engage 60% of
the U.S. work force.

Where the U.S. Can Be Hit

The U.S. military has 2.1 million computers and 10,000 local-area
networks, the report said.  Moreover, the American telecommunications,
electric power, banking and transportation industries now are vulnerable
to attack by anyone seeking to confront the U.S. without confronting its
military, the report warned.  "We have built our economy and our
military on a technology foundation that we do not control and, at least
at the fine detail level, we do not understand," it said.

The study, which was conducted by a panel of 20 computer and
information-systems experts drawn from the military, industry and
academia, sharply criticizes current Pentagon efforts as inadequate.
Among its 13 recommendations are calls for a new "information-warfare"
czar in the Defense Department and an "information-warfare" center
within the U.S.  intelligence agencies. It also says the military should
begin using a new five-level warning system under which, as attacks are
detected, the information systems of the national-security establishment
would first be monitored more closely and ultimately be disconnected
from outside information systems. It also calls for spending $580
million in the coming years on research and development, mainly in the
private sector, to develop new software and hardware to provide
security, such as a system for automatically tracing hacker attacks back
to their origins. The report largely sidesteps as irrelevant the
continuing controversy about the export of encryption codes.

Most strikingly, the task force recommends that the Pentagon be given
the legal ability to repel and pursue those who try to hack into its
computer systems.  Defense Department systems, it said, should carry
warnings to make clear the department's "presumption that intruders have
hostile intent and warn that the department will take the appropriate
response." The task force's chairman, Duane Andrews, a former assistant
defense secretary for command, control, communications and intelligence,
said in an interview that current law doesn't permit counterattacks
against computer hackers. He said he would like to see the law changed
to allow the Pentagon to respond by injecting the attackers' computers
with "a polymorphic virus that wipes out the system, takes it down for
weeks."

Over the past two years, Mr. Andrews added, the Defense Department's
information systems have been subjected to attacks that have been "very
sophisticated and massive." Under current law, the Pentagon hasn't been
able to investigate whether these attacks may have been coordinated, he
said.  Mr. Andrews currently is executive vice president, corporate
development, Science Applications International Corp., a defense and
technology company based in San Diego.

Asked about the study's criticisms of the Pentagon's current security
efforts, spokesman Kenneth Bacon said, "We share these concerns, and we
have a number of programs under way to reduce the threat to our
information-warfare infrastructure." Among other things, he noted that
the Defense Department's Advanced Research Projects Agency is attempting
to develop an "electronic immune system," a computer program that acts
like the human body's immune system to detect invaders and mobilize
against them. The Pentagon already spends about $1.6 billion a year to
protect the security of its information systems.

"Most of the stuff in there is a message to industry, too," Mr. Andrews,
the task force chairman, said of the report. "A large international bank
has exactly the same problems and challenges as the Defense Department."

========================================================================

              TCAT for Java (tm) -- Beta Version Available

We will soon have TCAT for Java (tm) on UNIX available in Beta release.
This product is a variant of our standard TCAT coverage analyzer, and
includes Java method hierarchy display, Java method digraph display,
combined branch and call-pair coverage measurement, and full direct
connection to coverage-annotated Java source code.
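
To make the terminology concrete, here is a minimal, hand-instrumented
sketch in Java of what branch and call-pair coverage record.  It is
purely illustrative: TCAT instruments code automatically, and its actual
runtime, identifiers, and report format differ.  All names below
(CoverageSketch, mark, classify, reportNegative) are invented.

    // Conceptual sketch only -- NOT TCAT's actual instrumentation.
    import java.util.HashMap;
    import java.util.Map;

    public class CoverageSketch {
        // Hit counts keyed by branch id ("method:branch") or by
        // call pair ("caller->callee").
        static final Map<String, Integer> hits = new HashMap<>();

        static void mark(String id) {
            hits.merge(id, 1, Integer::sum);
        }

        // A method under test, hand-instrumented the way a coverage
        // analyzer would instrument it automatically.
        static int classify(int x) {
            if (x < 0) {
                mark("classify:if-true");          // branch outcome
                mark("classify->reportNegative");  // call pair
                return reportNegative();
            }
            mark("classify:if-false");             // other branch outcome
            return 1;
        }

        static int reportNegative() { return -1; }

        public static void main(String[] args) {
            classify(-5);  // hits the true branch and the call pair
            classify(7);   // hits the false branch
            // A coverage report is essentially the set of instrumented
            // ids compared against the ids actually hit.
            hits.forEach((id, n) ->
                System.out.println(id + " hit " + n + "x"));
        }
    }

Branch coverage is then the fraction of branch ids hit at least once,
and call-pair coverage the fraction of caller->callee pairs exercised.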

An important new feature of this product is the ability to (a) collect
coverage data locally for your test suite (using the local Java applet
viewer or a local copy of your browser), or (b) collect coverage data
REMOTELY on copies of a candidate applet as actually used by clients on
the WWW.
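
Option (b) implies the instrumented applet ships its coverage counters
back to the development site.  The following is a hypothetical sketch of
that idea, assuming a plain HTTP POST to an invented collector URL; it
is not the actual TCAT for Java mechanism or report format.

    // Hypothetical remote-collection sketch -- the endpoint URL and
    // report format are invented for illustration.
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RemoteCoverageSender {

        // POST a plain-text coverage report to the assumed collector.
        static void send(String report) throws Exception {
            URL url = new URL(
                "http://coverage-collector.example.com/collect");
            HttpURLConnection conn =
                (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            OutputStream out = conn.getOutputStream();
            out.write(report.getBytes("US-ASCII"));
            out.close();
            System.out.println("collector replied: "
                + conn.getResponseCode());
        }

        public static void main(String[] args) throws Exception {
            // e.g., hit counts gathered by the instrumented applet
            send("classify:if-true 1\nclassify->reportNegative 1\n");
        }
    }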

If you are doing Java applet development and want your Java code to be
bulletproof, you may wish to consider using TCAT for Java to make sure
your testing is complete enough.

If you would like to receive an early copy of TCAT for Java, please send
E-mail ASAP to sales@soft.com and be sure to indicate the UNIX platform
and OS version you are running.  Please include your CID number if you
are already a TestWorks user.

========================================================================

                        A TESTPLAN FOR DRINKING

               From: John Favaro 
                       Subject: Drinking Standard

Our quality manager had this posted.  At least he has a sense of humor.

Drinking a glass of water standard (Issue 2.1)

1. Purpose of the action

2. Applicable documents

      - Drinking Liquids Guidelines
      - IEEE Drinking Standard

3. Reference documents

      - Drinking a Coke standard
      - The Art of Drinking, Sam De Marco

4. Drinking steps

4.1   Reach for the glass with either your left or right hand, according
      to a predefined standard.

4.2   Take the glass.

4.3   Liquid Inspection Review (LI/R)
      (for safe drinking this activity shall be carried out by an
      independent observer).

4.4   Actual drinking implementation.

5. Management

5.1   Make a drink plan.

5.2   Drinking risk assessment.

========================================================================

                           TTN-ONLINE ARCHIVE

Readers may wish to take advantage of the fact that TTN-Online is now
archived on our WWW site.

We have put the prior three years' issues of TTN-Online, virtually all
of the issues since the Online Version of the Newsletter began, at
"http://www.soft.com/TTN-Online".

========================================================================

                 MISSING FIGURES FOR G. DAICH'S ARTICLE

Our apologies.  We slipped up and forgot to include the two figures that
go along with Greg Daich's article (November and December 1996).  Here
they are:


                            SOFTWARE TEST MANAGEMENT
                                      |
                                      |
     +--------------------------------+-----------------------+
     |                                                        |
     |                                                        |
REQUIREMENTS-BASED -------------------+------------------ CODE-BASED
   TESTING                            |                     TESTING
     |                                |                       |
Special Testing Requirements:         |            Special Testing Requirements:
Object-Oriented Systems               |               Data Flow Testing
Client/Server Systems                 |               Mutation Testing
Real-Time Systems                     |
Usability Testing                     |
Performance Testing, etc.             |
                                      |
                                      |
                                      v
                                   REGRESSION
                                    TESTING


                  FIGURE 1:  TEST IMPROVEMENT ROAD MAP



             Evolve                                        Enable
       Technologies  +<---------------------------------+  Improvements
                     |                                  |
                     |          ENSURE SUCCESS          |
                     v                                  |
                     +--------------------------------->+
          Establish               Examine                  Evaluate
              Needs               Practices                Alternatives



       FIGURE 2:  TRAVEL GUIDE FOR THE TEST IMPROVEMENT ROAD MAP

========================================================================

                 THREE ENGINEERS' APPROACH TO THE WORLD

          A mechanical engineer, an electrical engineer, and a
software engineer are traveling in an old Fiat 500 (Bambino) when all of
a sudden the car backfires and comes to an abrupt halt.

o  The Mechanical Engineer says, "Aha! It's probably a problem with the
   valves or the pistons!"

o  The Electrical Engineer says, "Nonsense! It's most probably a problem
   with the spark plugs or the battery!"

o  The Software Engineer says, "Hey, it's simple!  How about we all get
   out of the car and get back in again?"

========================================================================

             EVALUATING TTN-ONLINE:  GIVE US YOUR COMMENTS

TTN-Online is free and aims to be of service to the larger software
quality and testing community.  To better our efforts we need YOUR
FEEDBACK!

Please take a minute and E-mail us your thoughts about TTN-Online.

Is there enough technical content?

Are there too many or too few paper calls and conference announcements?

Is there not enough current-events information? Too much?

What changes to TTN-Online would you like to see?

We thrive on feedback and appreciate any comments you have.  Simply
address your remarks by E-mail to "qw@soft.com".

========================================================================

          TTN Online Edition -- Mailing List Policy Statement

Some subscribers have asked us to prepare a short statement outlining
our policy on use of E-mail addresses of TTN-Online subscribers.  This
issue, and several other related issues about TTN-Online, are available
in our "Mailing List Policy" statement.  For a copy, send E-mail to
ttn@soft.com and include the word "policy" in the body of the E-mail.

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is E-mailed on the 15th of each month to
subscribers worldwide.  To have your event listed in an upcoming issue,
E-mail a complete description and full details of your Call for Papers
or Call for Participation to "ttn@soft.com".

TTN On-Line's submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted non-calendar items should not exceed 350 lines
   (about four pages).  Longer articles are OK and may be serialized.
o  Length of submitted calendar items should not exceed 60 lines (one
   page).
o  Publication of submitted items is determined by Software Research,
   Inc.; submissions may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters and TTN-Online disclaims any responsibility for their
content.

TRADEMARKS:  STW, TestWorks, CAPBAK, SMARTS, EXDIFF, Xdemo, Xvirtual,
Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, TCAT-PATH,
T-SCOPE and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To SUBSCRIBE to TTN-ONLINE, to CANCEL a current subscription, to CHANGE
an address (a CANCEL and a SUBSCRIBE combined) or to submit or propose
an article, send E-mail to "ttn@soft.com".

TO SUBSCRIBE: Include in the body of your letter the phrase "subscribe
".

TO UNSUBSCRIBE: Include in the body of your letter the phrase
"unsubscribe ".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                          901 Minnesota Street
                     San Francisco, CA  94107 USA

                   Phone:          +1 (415) 550-3020
                   Toll Free:      +1 (800) 942-SOFT (USA Only)
                   FAX:            +1 (415) 550-3030
                   E-mail:         ttn@soft.com
                   WWW URL:        http://www.soft.com


                               ## End ##