                      sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======             March  1997             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the worldwide software testing
community.

(c) Copyright 1997 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition
provided that the entire document/file is kept intact and this copyright
notice appears with it.

========================================================================

INSIDE THIS ISSUE:

   o  Ten Arguments Against ISO-9000, by John Seddon

   o  A Guided Tour of Quality Week '97

   o  Toward a Software Product Assessment: An Attack on the Broader
      Software Maintenance Problem, by Alan B. Salisbury, President,
      Learning Tree International

   o  Tri-Ada'97 Details Available: St. Louis, November 1997.

   o  WWW Item about Ariane 5 Explosion Available

   o  ISSRE'97 Reminder: 2-5 November 1997, Albuquerque, New Mexico

   o  Challenge by Elliot Chikofsky: A Technical Challenge to the
      Software Re-Engineering and Reverse Engineering Community

   o  COMPASS'97: Program Announcement

   o  Advance Program: Symposium on Software Reusability (SSR'97)

   o  Evaluating TTN-Online

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

                     Ten Arguments Against ISO-9000!

                             by John Seddon

                   E-mail: john@vanguardconsult.co.uk

Note: John Seddon is a well-known speaker, writer, and researcher, and
he will be giving one of the keynote talks at QW'97 (see the program
below).  The issues surrounding ISO-9000 are *SO* sensitive that we
thought it would be a good idea to give you a "sneak preview" of what
John's keynote talk at QW'97 will be all about.

       o       o       o       o       o       o       o       o

John Seddon has ten arguments in his case against ISO 9000:

 1. ISO 9000 encourages organizations to act in ways which make things
    worse for their customers.

 2. Quality by inspection is not quality.

 3. ISO 9000 starts from the flawed presumption that work is best
    controlled by specifying and controlling procedures.

 4. The typical method of implementation is bound to cause sub-
    optimization of performance.

 5. The Standard relies too much on people's and particularly assessors'
    interpretation of quality.

 6. The Standard promotes, encourages, and explicitly demands actions
    which cause sub-optimization.

 7. When people are subjected to external controls, they will be
    inclined to pay attention to only those things which are affected by
    the controls.

 8. ISO 9000 has discouraged managers from learning about the theory of
    variation.

 9. ISO 9000 has failed to foster good customer-supplier relations.

10. As an intervention, ISO 9000 has not encouraged managers to think
    differently.

John maintains that if UK plc were a 'quality organization' and
understood the impact of this Standard on performance, we would 'stop
production' now.  ISO 9000 is not 'fit for purpose'.  It is, on the
contrary, getting in the way!

According to John, ISO 9000 is only a manifestation of a wider problem:
this century has seen the development of a way of thinking about running
organizations which we have come to call 'command and control'
management thinking.  It is, he says, the very thing which has prevented
quality from getting on the agenda, and yet it lurks within something
claiming to be a quality standard.

========================================================================

                  A GUIDED TOUR OF QUALITY WEEK '97...

             CONFERENCE THEME: "Quality in the Marketplace"

                         Sheraton Palace Hotel
                             27-30 May 1997
                     San Francisco, California  USA

      Complete information about QW'97 is available on the WWW at:

                      http://www.soft.com/QualWeek

As the '90s draw to a close we see more and more that software quality
issues are among the essential driving forces for selection, use, and
world-wide software market expansion.  Companies large and small,
government units, in short everyone, are concerned with making sure that
the software products they deliver provide good value and effective use.

QW'97 is a multi-threaded conference aimed at all levels of software
quality people, from those just beginning new projects to those with
many years' experience.  People come to Quality Week to get started, to
hone their skills, to share results with their colleagues ... a thousand
reasons.  QW'97 has something for everyone and is something that
everyone in the software quality field will want to attend.

This short writeup, a companion to the printed brochure and the on-line
conference description on the WWW <http://www.soft.com/QualWeek>, gives
an informal summary of the entire conference.

Note that the QW'97 session numbers are given with each paper; for
example, "(TA)" means Tutorial A.

HALF-DAY TUTORIALS

The conference opens with 10 carefully selected speakers, with topics
intended to address the issues of today as well as the issues of
tomorrow.

Basics of testing are presented by the world-renowned test expert:
Boris Beizer (Independent Consultant), "An Overview of Testing (TA)",
likely to be, as it has been in the past, one of the best general
introductions to the software quality area you'll find.

In keeping with the growing interest in process oriented approaches to
quality there are tutorials by Matthew Cooke (European Software
Institute), "Introduction to SPICE (ISO 15504 Software Process
Assessment) (TH)" and by William J. Deibler & Bob Bamford (Software
Systems Quality Consulting), "Software Engineering Models for Quality:
Comparing the SEI Capability Maturity Model (CMM) to ISO 9001 (TC)".

Along the way, software designers need to know the intrinsic value of
inspection methods, so we have arranged to bring you:  Tom Gilb
(Independent Consultant), "Optimizing Software Inspections (TE)".

Technically oriented testing is key to good quality software, as
illustrated by Robert V. Binder (RBSC Corporation), "Test Automation for
Object oriented Systems (TB)".  And, continuing the theme of using the
best available technology, there is a generalized method described by
Michael Deck (Cleanroom Software Engineering, Inc.), "Cleanroom
Development: Formal Methods, Judiciously Applied (TJ)".

Real-world results have to be taken into account, too.  Good
specifications beget good software, says Bob Poston (Aonix), "10X
Testing: Automating Specification-Based Testing (TG)", and optimized
field testing using the concept of operational profiles that he invented
is described by John Musa (Consultant), "Applying Operational Profiles
in Testing (TF)".

We all know that the Year 2000 problem may be a real watershed for
software quality, and this topic is the focus of the tutorial by
Nicholas Zvegintzov (Software Management Network), "Testing for Year
2000 (TI)".

Finally, judging from the recent activity and concern, in particular the
wave of concern brought about by the WWW and the Internet and all of its
issues, the legal aspects have to be important to us all.  Hence we have
a tutorial by Cem Kaner & Brian Lawrence & Tom Arnold (Falk & Nguyen),
"Software Quality-Related Law (TD)".

KEYNOTE TALKS

Special events like QW'97 give us an opportunity to address critical
issues, controversial issues and issues that are likely to be important
in the future.

In view of the interest in ISO-9000 you'll be pleased to know that not
everyone believes ISO-9000 to be a panacea.  This keynote by John Seddon
(Consultant), "In Pursuit of Quality: The Case Against ISO 9000 (1P1)"
aims to clarify the ISO-9000 effectiveness question.

Software *IS* software and we don't really know enough about it to
manage it well.  Two keynotes take different looks at the question of
where and how software should be evolved:  Lawrence Bernstein (National
Software Council), "Software Dynamics: Planning for the Next Century
(5P1)", and, Dick Hamlet (Portland State University), "Keeping The
"Engineering" In Software Engineering (5P2)".

The stability and security of the Internet is a main concern for us
all.  Two highly respected keynoters describe the problem
and suggest ways to gain confidence:  Dorothy Denning "Cyberspace
Attacks and Countermeasures: How Secure IS the Internet? (1P2)" and,
Lori A. Clarke (University of Massachusetts), "Gaining Confidence in
Distributed Systems (10P1)".

Finally, employing his well-known style of truth-telling, there will be
another critical look at the Year 2000 question:  Boris Beizer
(Independent Consultant), "Y2K Testing: The Truth of the Matter (10P2)".

QUICK-START MINI-TUTORIAL TRACK

The QuickStart Mini-Tutorials are designed to give the attendee an in-
depth look at an important area of software quality, and also to assist
in passing on the advice of experienced software quality people to those
new in the field.

The first two Mini-Tutorials in our QuickStart track focus on how best
to organize your activities to get the most results for the least work.
Tom Drake (NSA Software Engineering Center), "Managing Software Quality
-- How to Avoid Disaster and Achieve Success (2Q)" explains how it's
done in a large scale Government laboratory, and, Craig Kaplan (I.Q.
Company), "Secrets of Software Quality (9Q)" addresses the issues from
the point of view of the smaller programming shop.

The key to success in testing is to be sharp, quick, detail oriented ...
all attributes that describe the speaker in this tutorial:  James Bach
(STL), "How to be an Expert Tester (4Q)".

The best testing teams are the best managed ones, and Brian Marick
(Testing Foundations), "The Test Manager at The Project Status Meeting
(6Q)" focuses attention on the role that the test manager plays.

As testing technology matures, there is increased emphasis on the use of
technical methods of test completeness checking, the topic of Robert M.
Schultz (Motorola), "Test Coverage Analysis (7Q)".

Lastly, we have the issue of getting it right the first time and making
sure that business issues are handled well.  For this we have:  Tom Gilb
(Independent Consultant), "Making Contracts Testable (3Q)".

SPECIAL PANEL SESSION

Is Java really going to take over the world?  Has anyone *NOT* heard
that Java is our future common programming language?  This panel
session, "How Does Java Address Software Quality: A Special Panel
Session (8Q)" brings industry experts together in a forum where you'll
hear the good news and the bad news, and maybe even some news that is
thought provoking as well!

TECHNOLOGY TRACK

The long term growth of software quality is founded in developing
technologies that address current and even future problems.  The
Technology Track has some 15 papers from around the world that show off
the latest thinking from labs and from the field and which give you a
good cross-section of what people are doing today that can lead to
exciting results in the future.

Real time systems are extremely difficult to test, and these two papers
take a good hard look at the issues.  Victor Braberman & Martina Marre &
Miguel Felder (Universidad de Buenos Aires), "Testing Timing Behaviors
of Real Time Software (4T1)" and Joachim Wegener & Matthias Grochtmann
(Daimler-Benz AG), "Testing Temporal Correctness of Real-Time Systems by
Means of Genetic Algorithms (4T2)" give an additional international
perspective to testing these critical types of software.

Detailed program analysis is a key technology, and Antonia Bertolino &
Martina Marre (IEI-CNR), "A General Path Generation Algorithm for
Coverage Testing (2T1)", James R. Lyle & Dolores R. Wallace (National
Institute of Standards and Technology), "Using the Unravel Program
Slicing Tool to Evaluate High Integrity Software (3T1)", William Howden
(University of San Diego), "Partial Statistical Test Coverage and
Abstract Testing (7T2)", and Hugh McGuire (University of California,
Santa Barbara), "Generating Trace Checkers for Test Oracles (3T2)" look
into four kinds of detailed in-the-source-code analysis.

Object oriented approaches have high payoffs, provided that you "do
things right".  The paper by Daniel Jackson (Carnegie Mellon
University), "Automatic Analysis of Object Models (6T2)" examines how
best to automated object oriented testing, and the paper by Lee J. White
& Khalil Abdullah (Case Western Reserve University), "A Firewall
Approach for the Regression Testing of Object-Oriented Software (6T1)"
tries out a new concept.

Test support systems are often key, as Huey Der-Chu & John E. Dobson
(University of Newcastle upon Tyne), "An Integrated Test Environment for
Distributed Applications (8T1)" and Dolores R. Wallace & Herbert Hecht
(National Institute Of Standards & Technology), "Error Fault and Failure
Data Collection and Analysis (7T1)" make clear.

A hot new area -- mutation testing and, more recently, genetic
algorithms -- has generated a lot of excitement.  Istvan Forgacs
& Eva Takacs (Computer and Automation Institute), "Mutation-based
regression testing (9T2)", and Marc Roper (Strathclyde University),
"Computer Aided Software Testing Using Genetic Algorithms (9T1)" give
two different views of what can be expected.
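
For readers new to the idea, here is a minimal sketch of the core notion
(our illustration, not code from either paper): a "mutant" is a copy of
the program with one small seeded fault, and a test case whose output
differs between the original and the mutant is said to "kill" that
mutant.  The fraction of mutants killed is one measure of test-suite
strength.

    /* Illustrative sketch only -- not code from either paper. */
    #include <stdio.h>

    static int sum_original(int a, int b) { return a + b; }

    /* Mutant: the '+' operator has been seeded as '-'. */
    static int sum_mutant(int a, int b)   { return a - b; }

    int main(void)
    {
        /* A weak test (b == 0) does NOT kill the mutant: outputs agree. */
        printf("%d vs %d\n", sum_original(5, 0), sum_mutant(5, 0)); /* 5 vs 5 */

        /* A stronger test kills the mutant, exposing the seeded fault. */
        printf("%d vs %d\n", sum_original(5, 3), sum_mutant(5, 3)); /* 8 vs 2 */
        return 0;
    }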

A novel approach to code inspection, using machine assistance (the
prototype product is called "ASSIST"), is described in the paper by
Fraser Macdonald & James Miller (University of Strathclyde Department of
Computer Science), "Automated Generic Support for Software Inspection
(8T2)"

Finally, the world's expert on operation profiles describes practical
methods for influencing your testing in a reliability enhancing way:
John Musa (Consultant), "Applying Operational Profiles in Software-
Reliability-Engineered Testing (2T2)".
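
In rough outline (our illustration; the operations and numbers are
invented, not taken from the paper), an operational profile assigns each
system operation its expected share of field usage, and the test budget
is then allocated in proportion, so that testing effort mirrors the way
the software will actually be used:

    /* Illustrative sketch only; the operations and probabilities
       below are invented for the example. */
    #include <stdio.h>

    struct operation {
        const char *name;
        double      probability;   /* expected share of field usage */
    };

    int main(void)
    {
        struct operation profile[] = {
            { "query balance", 0.60 },
            { "post payment",  0.30 },
            { "close account", 0.10 },
        };
        int total_tests = 500;      /* overall test budget */
        int i;

        /* Allocate test cases in proportion to expected usage. */
        for (i = 0; i < 3; i++)
            printf("%-14s -> %3.0f test cases\n",
                   profile[i].name,
                   profile[i].probability * total_tests);
        return 0;
    }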

APPLICATIONS TRACK

How modern methods are applied makes a big difference in how successful
a software quality enhancement project is going to be.

To start with, your approach has to be systematic and thorough.  These
two papers speak to the question of how to be "more systematic" and
"more rigorous":  Larry Apfelbaum (Teradyne Software & Systems Test),
"Model Based Testing (8A2)", and, Albrecht Zeh & Paul Bininda (Sekas
GmbH), "Quality Improvement by Automatic Generation and Management of
Test Cases (9A2)".

Quality on the WWW is critical these days, and the quality issues
affecting, and affected by, all manner of WWW-based systems are becoming
increasingly important.  The papers by Peter Middleton & Colm Dougan
(Queens University Belfast), "Grey Box Testing C++ via the Internet
(7A1)", and, Shankar L. Chakrabarti & Harry Robinson (Hewlett Packard
Company), "Catching Bugs in the Web: Using the World Wide Web to Detect
Software Localization Defects (7A2)", deal specifically with WWW based
issues.  Meanwhile, taking their cue from other aspects of systematic
test automation, the papers by Deborah MacCallum (Bay Networks), "Test
Automation Solutions for Complex Internetworking Products (9A1)", Ram
Chillarege (IBM), "Orthogonal Defect Classification (4A2)", and, Taghi
Khoshgoftaar (Florida Atlantic University), "Identifying Fault-Prone
Modules: A Case Study (4A1)" look in detail into the question of how
best to define, track, and contend with software trouble reports (i.e.
defects).

Quality issues arise in the world's space programs as well, as the papers
by Francesco Piazza (Alenia Spazio), "A Requirement Traceability
Application for Space Systems ", and, Jennifer Davis & Daniel Ziskin &
Bryan Zhou (NASA Goddard Space Flight Center), "The Challenge of Testing
Innovative Science Software on a Rapidly Evolving Production Platform
(2A1)" make clear.

Putting things into practice, sometimes an experience that differs a lot
from the expectations, is addressed by two speakers:  Otto Vinter (Bruel
& Kjaer), "How to Apply Static and Dynamic Analysis in Practice (6A1)",
and, Danny R. Faught (Hewlett Packard Company), "Experiences with OS
Reliability Testing on the Exemplar System (3A2)".

Lastly, how tool environments affect delivering software quality is
addressed by:  Joe Maybee (Tektronix Inc.), "Mother2 and MOSS: Automated
Test Generation from Real-Time Requirements (6A2)", Rob Oshana (Texas
Instruments), "Improving a System Regression Test With a Statistical
Approach Implementing Usage Models Developed Using Field Collected Data
(8A1)", and, Michael E. Peters (Digital Equipment Corporation),
"Managing Test Automation: Reigning in the Chaos of a Multiple Platform
Test Environment (3A1)".

MANAGEMENT TRACK

Process issues are sometimes as important as the product itself -- after
all, if the process is good then the product OUGHT to be good too!  But
process involves more than just complying with a standard.  Ana Andres
(European Software Institute), "ISO-9000 Certification as a Business
Driver: The SPICE (4M2)", Marilyn Bush (Xerox Corporation), "What is an
SEPG's Role in the CMM? (7M2)", Paul Taylor (Fujitsu Software
Corporation), "Workware & ISO 9000 (4M1)", and Johanna Rothman (Rothman
Consulting Group), "Is your Investment in Quality and Process
Improvement Paying Off? (7M1)" all address this theme.

Allied to this is the issue of process measurement, addressed well in
the paper by Don O'Neill (Independent Consultant), "National Software
Quality Experiment, A Lesson in Measurement (3M2)".

Getting the most out of your test group is a key area, too.  Here are
four papers that give you good tips on how to maximize the effectiveness
of your test group:  Nick Borelli (Microsoft), "Tuning your Test Group:
Some Tips & Tricks (9M1)", Dave Duchesneau (The Boeing Company),
"Guerrilla SQA (9M2)", Jerry E. Durant (Certifiable Technologies, Ltd.),
"Implementing a Successful Automated Test Tool Selection Process (2M1)",
and, Andrew A. Gerb (Space Telescope Science Institute), "Delivering
Quality Software in Twenty-Four Hours (6M1)".

Many quality engineers have to deal with so-called "legacy systems", and
the concepts that apply in this case may be subtly different from those
that apply to "new code".  The papers by John Hedstrom & Dennis J.
Frailey (Texas Instruments), "Mastering the Hidden Cost of Software
Rework (2M2)", and, Lech Krzanik & Jouni Simila (CCC Software
Professionals Oy), "Incremental Software Process Improvement Under
"Smallest Useful Deliverable" (3M1)" take a fresh, forward-looking peek
at what really is involved in this critical area.

We learn best from experience, and the conference would not be complete
without some case studies.  The situations presented here give a cross-
section of the issues:  Brian G. Hermann & Amritt Goel (U.S. Air Force),
"Software Maturity Evaluation: Can We Predict When Software Will be
Ready for Fielding? (6M2)", G Thomas (Vienna University of Technology),
"On Quality Improvement in "Heroic" Projects (8M1)", and, Ian Wells
(Hewlett Packard), "Hewlett Packard Fortran 90 compiler project case
study (8M2)".

                       REGISTRATION INFORMATION

Complete registration with full information about the conference is
available on the WWW at

                     <http://www.soft.com/QualWeek>

where you can register on-line.

We will be pleased to send you a QW'97 registration package by E-mail,
postal mail or FAX on request.  Send your E-mail requests to:

                              qw@soft.com

or FAX or phone your request to SR/Institute at the numbers below.

         QW'97: 27-30 May 1997, San Francisco, California  USA

+-----------------------------------+----------------------------------+
| Quality Week '97 Registration     | Phone:       [+1] (415) 550-3020 |
| SR/Institute, Inc.                | Toll Free:        1-800-942-SOFT |
| 901 Minnesota Street              | FAX:         [+1] (415) 550-3030 |
| San Francisco, CA 94107 USA       | E-Mail:              qw@soft.com |
|                                   | WWW:         http://www.soft.com |
+-----------------------------------+----------------------------------+

========================================================================

                 Toward a Software Product Assessment:
         An Attack on the Broader Software Maintenance Problem

                           Alan B. Salisbury,
                      Learning Tree International

ABSTRACT:  Although a great deal of attention has been focused on
assessments, tools, and methods to improve the software development
process, software maintenance (where upward of 70 percent of lifecycle
costs are typically incurred) has received relatively little attention.
This article proposes a Software Product Assessment, loosely modeled
after the Software Process Assessment, as a front-end tool to help
managers allocate resources targeted at reducing future maintenance
costs.

It is a well-accepted fact that the majority of the costs of a software
system fall into the maintenance portion of the software product
lifecycle.  Estimates have ranged as high as 80 percent of total
lifecycle costs consumed in maintenance, with total annual software
maintenance costs in the United States reaching the $30 billion range.
Numbers like these merit the attention of managers at all levels, from
information systems organizations to chief executive officers, and
policy makers in government and industry.

In recent years, a great deal of focus and attention has continued to be
given to the "software crisis," especially within the government and
most particularly within the Department of Defense (DoD).  The
establishment (by DoD) of the Software Engineering Institute (SEI) at
Carnegie Mellon University and (by aerospace and defense industry) the
Software Productivity Consortium (SPC) are but two notable examples of
actions taken in response to the problem.

Both the SEI and the SPC have largely concentrated their efforts, with
some significant success, on the development portion of the lifecycle.
As the "P" in SPC indicates, much of the effort focuses on productivity
to reduce the time (and hence cost) required to develop new software.
Related efforts have contributed directly and indirectly to software
quality as well.  Producing a better product at a lower cost and on a
shorter schedule promises to yield tremendous benefit to government and
industry alike.

Important as they are, these efforts are aimed primarily at improving
the software development process, which historically accounts for only
20 percent to 30 percent of the lifecycle costs of a software product.
Improvements in development cost, schedule, and quality will not in and
of themselves eliminate the software crisis.  Not coincidentally,
however, it is the development of new systems that captures management's
(and critics') attention in the highly visible world of procurements,
budgets, and program management.

The Software Maintenance "Bow Wave" and "Stern Wave"

While both government and industry continue to develop software
applications at a high rate, they are creating a "bow wave" of new
systems that, when fielded, will join an already crowded competition for
maintenance resources.  Moreover, large enterprises, especially those
that have been using automated systems for 20 years or more, typically
have an enormous "stern wave" of existing systems, the maintenance of
which consumes an equally enormous quantity of resources while remaining
absolutely critical to the day-to-day functioning of the enterprise.

Chief information officers today, whether in industry or government, are
under pressure to develop (or acquire) vitally needed new systems to
drive the enterprise.  At the same time, they must maintain a vast array
of "legacy" systems, which are equally vital to the operation and
continued existence of the enterprise.  This puts even more pressure on
resources that are invariably limited.

Year 2000 Complications

The so-called year 2000 (Y2K) problem has suddenly rocketed to the top
of information technology (IT) management's list of concerns and
priorities.  The reason for this phenomenon is that this problem
threatens to bring many systems (if not entire companies and government
organizations) to a screeching halt no later than Jan. 1, 2000 as these
systems incorrectly react to "00" date fields.  Many systems are already
experiencing problems as they encounter future dates (such as credit card
expirations) not anticipated when the software was originally designed.
Such "latent deficiencies" have, in many cases, lain dormant for as many
as 30 years or more, depending on the vintage of these legacy systems.
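
The mechanics are easy to see in miniature.  The sketch below (purely
illustrative, not drawn from any particular system) shows the classic
two-digit comparison going wrong: a date in "00", meaning 2000, compares
as earlier than a date in "97".

    /* Illustrative sketch only: the classic two-digit-year failure. */
    #include <stdio.h>

    /* Flawed check: is the two-digit expiry year in the past? */
    static int expired(int current_yy, int expiry_yy)
    {
        return expiry_yy < current_yy;   /* "00" < "97" => wrongly expired */
    }

    int main(void)
    {
        /* A card valid until 2000, checked in 1997: reported as expired. */
        printf("two-digit check says expired? %d\n", expired(97, 0));

        /* The same comparison with four-digit years gets it right. */
        printf("four-digit check says expired? %d\n", 2000 < 1997);
        return 0;
    }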

Fixes to this problem are generally being addressed on an emergency (if
not panic) basis as the magnitude of the problem is being recognized and
understood.  Estimates of the cost to fix this problem alone are
typically in the dollar-plus per line-of-code range, with larger
organizations looking at bills in the tens to hundreds of millions of
dollars.  It is not hard to find total impact estimates ranging in the
tens of billions of dollars nationwide, rivaling the total annual
software maintenance figure.  Not surprisingly, an entire subindustry of
IT solution providers is rapidly emerging, specializing in Y2K
solutions.

This potentially staggering unforecasted and unfunded bill has resulted
in many ongoing projects and "routine" maintenance efforts being swept
aside and put on indefinite hold until the Y2K problem is solved and
fixes implemented.  A typical approach is to review all current systems
to determine which should be scrapped and replaced, which should be
rewritten from scratch, and which should be "repaired" through a range
of reengineering and code-patching alternatives.  This highly focused
assessment process is not only an essential front end to the Y2K remedy,
but also represents a tremendous opportunity as part of a much broader
attack on the larger software maintenance problem.

The Software Process Assessment

Perhaps the most significant accomplishment of the SEI to date has been
the development (and increasingly widespread acceptance) of the Software
Process Assessment, led by Watts Humphrey [1].  The process assessment
is keyed to Humphrey's "Process Maturity Framework," which defines five
levels of maturity for software development organizations:  Initial,
Repeatable, Defined, Managed, and Optimizing, (Levels 1 through 5,
respectively).  The process assessment, conducted by a trained team of
software professionals, includes the completion of questionnaires
regarding representative projects, extensive interviews with software
functional representatives, and follow-on discussions with project
personnel and management.  The final report of a Software Process
Assessment identifies where the organization and its process fall on
the maturity level framework and provides prioritized recommendations
that can serve as the basis for an action plan for improvement.  Thus,
the Software Process Assessment serves as a superb management tool to
not only establish current status of a software development
organization, but also provide a foundation for management planning
and resource allocation.

About the Author

Alan Salisbury is president of Learning Tree International, an
independent professional IT training organization.  As a major general,
he commanded the Army's Information Systems Engineering Command, which
included responsibility to develop and maintain all Army-wide management
information systems.

Since his retirement from the Army, he has also been president of Contel
Technology Center, the research and development arm of the former Contel
Corp., and chief operating officer of the Microelectronics and Computer
Technology Corp. (MCC), a research and development consortium.  He has a
master's degree in electrical engineering and holds a doctorate in
electrical engineering and computer science, both from Stanford
University.

Learning Tree International
1831 Michael Faraday Drive
Reston, VA 20190
Voice: 703-709-5979
Fax: 703-471-4732
E-mail: asalisbury@learningtree.com
WWW: http://www.learningtree.com

Editor's Note: This article originally appeared in CrossTalk, published
by the USAF Software Technology Center, Hill AFB, Utah.

========================================================================

                 TRI-ADA '97 CONFERENCE DATA AVAILABLE

The Tri-Ada '97 URL is now available!  Check out:

                   http://www.acm.org/sigada/tri-ada/

The Call for Participation for TA'97 is on the street as well.  Check out:

               http://apci.net/~dfh/CFP.htm (for a glimpse)

David F. Harrison       Tri-Ada'97 Conference Chair
(618)  624-0852 (R)     (618) 624-5140 (FAX)    (618) 256-1920 (W)

harrisdf@HQAMC.SAFB.af.mil  dharrison@acm.org   d.f.harrison@ieee.org
dfh@apci.net

http://www.apci.net/~dfh/ (See also http://www.acm.org/sigada/tri-ada/)

========================================================================

              WWW ITEM ABOUT ARIANE 5 EXPLOSION AVAILABLE

Note:  This fragment was included in a recent E-mailing of Peter
Neumann's excellent RISKS forum and we thought it might be of interest
to TTN-Online Readers.

Date: Tue, 11 Mar 1997 11:43:38 +0200
From: "Robert L. Baber" 
Subject: The Ariane 5 explosion: a software engineer's view...

My web page "The Ariane 5 explosion as seen by a software engineer"

               http://www.cs.wits.ac.za/~bob/ariane5.htm

shows how the software anomaly that caused the destruction of the Ariane
5 and its payload (a DM 1200 million loss) could have been avoided by a
simple application of correctness-proof techniques.  It also highlights
the importance of strict preconditions and the inadequacy of ordinary
preconditions for practical applications.
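
For background: the official inquiry traced the failure to an
unprotected conversion of a 64-bit floating-point value to a 16-bit
signed integer, which overflowed in flight.  The sketch below is our own
illustration of the kind of explicit, checked precondition being argued
for; it is not taken from Prof. Baber's page.

    /* Illustrative sketch only, not from the cited web page.  The stated
       precondition (the value must fit in a 16-bit signed integer) is
       checked explicitly instead of being assumed. */
    #include <limits.h>
    #include <stdio.h>

    /* Precondition: SHRT_MIN <= value <= SHRT_MAX.
       Returns 0 and stores the converted value on success,
       -1 if the precondition is violated (no silent overflow). */
    static int narrow_to_short(double value, short *out)
    {
        if (value < (double)SHRT_MIN || value > (double)SHRT_MAX)
            return -1;                /* precondition violated */
        *out = (short)value;
        return 0;
    }

    int main(void)
    {
        short converted;
        if (narrow_to_short(65536.0, &converted) != 0)
            printf("conversion refused: value out of range\n");
        return 0;
    }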

Prof. Robert L. Baber, Computer Science Dept, University of the
Witwatersrand, Johannesburg, 2050 Wits, South Africa
Phone: +27-11-716-3794  E-mail: bob@cs.wits.ac.za


========================================================================

  ___  ____  ____  ____  ____ \ ____  ____
  //  //__  //__  //_// //_     //_//    //     November 2 - 5, 1997
_//_ ___// ___// // \ //__    ___//    //   Albuquerque, New Mexico USA

 ISSRE'97, The 8th Int'l Symposium on Software Reliability Engineering

For more information on ISSRE'97, see <http://admin.one2one.com/issre97>
or drop a line to .  Note that Metrics97 (see
<http://www.cs.pdx.edu/conferences/metrics97/>) will immediately follow
ISSRE'97 in the same hotel.

========================================================================

                     CHALLENGE BY ELLIOT CHIKOFSKY

                 A technical challenge to the software
            reengineering and reverse engineering community.

At the 3rd Working Conference on Reverse Engineering (WCRE) last
November in Monterey, I proposed a challenge to the reverse engineering
and reengineering community:  to show what each of our tools and methods
can say about one common example.  The effort would be an organized
project in which everyone capable would demonstrate the results of their
own products and research tools, automated or manual.  The results would
be published as a series of side-by-side papers to enable us to get a
good look across the industry at the state-of-the-art, as it can be
applied to a selected problem.

I am pleased to introduce the Reverse Engineering Demonstration Project:
an international cooperative study among commercial and non-commercial
research groups and project organizations.  All manner of reengineering
and reverse engineering methods and tools, both automated and manual,
are welcome.

The subject software of this demonstration project will be a system of
programs used for election tabulation that originated in the 70s and has
been enhanced/maintained/etc. across multiple languages and machines,
now in C under MSDOS on PC.

Why do I call it a "demonstration project"?  The goal is to demonstrate
what our tools and methods are capable of.  This is not intended as a
product comparison, and the project will not be publishing comparative
analysis nor judgement about approaches against one another.  Each tool
or method is being applied by its own advocate to the common software
example to show the breadth and depth of what can be learned about the
subject software.

A detailed description of the Reverse Engineering Demonstration Project
can be found at:

                  http://www.worldpath.com/reproject/

Questions or requests for a copy of the project brochure (same info.  as
on the web site) should be directed to:

                        reproject@worldpath.com

I hope that you will consider joining us in this joint project to
advance software reengineering and reverse engineering.  Please pass
word of this project along to others who may be interested in
participating.

Elliot Chikofsky
Burlington, MA, USA
12 March 1997
E-Mail: e.chikofsky@computer.org

========================================================================

                              COMPASS '97
              12th Annual Conference on Computer Assurance
                            June 16-19, 1997
                           Gaithersburg,  MD

                                WEB SITE
                  http://hissa.ncsl.nist.gov/compass/

                              Sponsored by
                       IEEE National Capital Area
                  IEEE Aerospace & Electronic Society

COMPASS (COMPuter ASSurance) is an annual conference held in the
Washington, D.C. area with the purpose of bringing together researchers,
developers, integrators, and evaluators interested in problems related
to specifying, building, and certifying high-assurance systems.  What
distinguishes COMPASS is its emphasis on bridging the gap between theory
and practice.  The theme of COMPASS focus discussion on whether the
approaches developed and reported during the past twenty-five years have
any hope for solving today's assurance problems.  In addition to
exploring technical strengths and weaknesses in the state-of-the-art and
state-of-the-practice, conference goals include: identifying barriers to
applying existing assurance technologies in industry, understanding what
properties new technologies must have to meet industrial needs, and
identifying advanced technologies that are effective in attacking the
key problem areas of safety, security, fault-tolerance, and real-time
control.

For researchers, COMPASS '97 provides an opportunity to present new
theories, techniques, methods, or results of case studies to other
researchers and practitioners who can put them to use.  COMPASS '97 also
provides a unique opportunity for participants to learn from
practitioners about issues and problems encountered in constructing
practical systems.  This mix of cutting-edge research and practical
real-world experience is unique among software conferences.

========================================================================

                            Advance Program

           1997  SYMPOSIUM ON SOFTWARE REUSABILITY  (SSR'97)

                        Sponsored by ACM SIGSOFT
                   Back Bay Hilton, 40 Dalton Street
              Boston, Massachusetts, USA --  17-20 May 1997
                   URL: http://www.owego.com/~ssr97/

Co-Located with the 1997 International Conference on Software
Engineering, ICSE-97, 18-23 May 1997.

The Symposium on Software Reusability is ACM's biennial forum, held in
conjunction with the International Conference on Software Engineering
(ICSE), for the exchange of ideas, research and development results and
experiences in all aspects of software reusability.  SSR'97 invites you
to participate in tutorials, keynotes, panels, and all aspects of the
technical program.

TO REGISTER:  Use the ICSE registration form or register on-site at the
main ICSE registration desk at the Sheraton (across the street from the
Back Bay Hilton Hotel).

           -----        Keynote Presentations       -----

                "Theory and practice of adaptive reuse"
                           Dr. Paul Bassett
                           Cap-Netron, Inc.

                 "Patterns, Frameworks, and Components"
                           Dr. Ralph Johnson
              University of Illinois at Urbana-Champaign

           -----              TUTORIALS                 -----

SSR'97 has an outstanding tutorial program running on Saturday, May
17. Tutorial topics include some of the hottest technical and
management issues in software development and reusability.

            SSR'97 Tutorial Offerings, Saturday, May 17, 1997

                   Morning Sessions 9:00am-12:30pm
  Session
   Number        Presenter(s)                    Title

    T1      Prem Devanbu and      Generative Reuse: A Survey of Tools
            Bill Frakes           and Processes

    T2      Gregor Kiczales and   Designing High-Performance Reusable
            Chris Maeda           Code

    T3      Carma McClure         Extending the Software Process to
                                  Include Reuse

    T4      Barry Keepence and    Requirements-driven Software
            Mike Mannion          Reusability

                  Afternoon Sessions 2:00pm - 5:30pm
  Session
   Number        Presenter(s)                    Title

    T5      Don Batory            Software system generators,
                                  Architectures, and Reuse

    T6      Joe Hollingsworth     Design Dilemmas that impede
            & Bruce Weide         construction of high-quality
                                  components

    T7      Jeff Poulin           Software Reuse Metrics, Reusability
                                  Metrics, and Economic Models

    T8      Ernesto Guerrieri     Software Reuse with Java

            -----            Technical Program           -----
The technical program consists of paper presentations and panels on
current topics in software reusability. Paper sessions include the
latest developments in software architecture, domain analysis and
engineering, object-oriented reuse, reuse on the Internet, and
application generators & program transformation.

Technical papers are scheduled in five sessions 18-20 May. These fall on
Sunday, Monday, and one joint session with ICSE on Tuesday.  The panels
provide a chance for lively interaction with experts in the field. The
first panel will explore the dynamic area of internet-based reuse in a
discussion titled "The Impact of Java on Software Reusability." The
second, titled "Reuse R&D: Is it on the Right Track?," will take a look
at future directions and discuss where we go from here.  The technical
program includes:

Sunday, 18 May 1997

   * Keynote Talk I: "Theory and practice of adaptive reuse"
     (9:15-10:15 am)
   * Paper Session 1: Software Architecture, Systemic Reuse &
     Component-based Systems (10:40-12:20 pm)
   * Paper Session 2: Domain Analysis and Engineering (1:30 - 3:30 p.m.)
   * Panel: The Impact of Java on Software Reusability (4:00-5:15 pm)

Monday, 19 May 1997

   * Keynote Talk II: "Patterns, Frameworks, and Components"
     (9:00-10:15 am)
   * Paper Session 3: Object-Oriented Reuse and Reuse on the Internet
     (10:40 - 12:20 pm)
   * Paper Session 4: Application Generators & Program Transformation
     (1:30 - 3:30 pm)
   * Panel: Reuse R&D: Is it on the Right Track? (4:00-5:15 pm)

Tuesday, 20 May 1997

   * Software Reuse: Joint Session with ICSE'97 (10:30-12:00 pm)

ORGANIZING COMMITTEE

 General Chair                          Program Chair

 Guillermo Arango                       Mehdi T. Harandi
 Schlumberger WTH Information Tech.     University of Illinois
 50, Av. Jean Jaures, Bat. H            Department of Computer Science
 92541 Montrouge Cedex, France          1304 W. Springfield Ave.
                                        Urbana, IL 61801, U.S.A.
 Phone +33 (0)1 49 65 59 46             Phone 217-333-4865
 Fax   +33 (0)1 49 65 59 50             Fax   217-244-6869
 arango@montrouge.wireline.slb.com      harandi@cs.uiuc.edu

Program Committee:

  Tsuneo Ajisaka (Japan)               Don Batory (USA)
  Sanjay Bhansali (USA)                James Bieman (USA)
  Silvana Castano (Italy)              Thiel Chang (Netherlands)
  Betty Cheng (USA)                    Reidar Conradi (Norway)
  Maggie Davis (USA)                   W.R. Edwards (USA)
  Harold Gall (Austria)                M. Rosario Girardi (Uruguay)
  Ernesto Guerrieri (USA)              Sadahiro Isoda (Japan)
  Mehdi Jazayeri (Austria)             Even-Andre Karlsson (Sweden)
  Robert Kessler (USA)                 Rene Kloesch (Austria)
  Sadie Legard (UK)                    Ali Mili (Canada)
  Roland Mittermeir (Austria)          Jean-Marc Morel (France)
  Jeffrey Poulin (USA)                 Ruben Prieto-Diaz (USA)
  Guttorm Sindre (Norway)              Murali Sitaraman (USA)
  Joseph Urban (USA)                   Michael Wasmund (Germany)
  R. Alan Whitehurst (USA)

========================================================================

             EVALUATING TTN-ONLINE:  GIVE US YOUR COMMENTS

TTN-Online is free and aims to be of service to the larger software
quality and testing community.  To better our efforts we need YOUR
FEEDBACK!

Please take a minute and E-mail us your thoughts about TTN-Online.

Is there enough technical content?

Are there too many or too few paper calls and conference announcements?

Is there not enough current-events information? Too much?

What changes to TTN-Online would you like to see?

We thrive on feedback and appreciate any comments you have.  Simply
address your remarks by E-mail to "qw@soft.com".

========================================================================

CORRECTION: Yes, that's right.  We asked for comments about TTN to be
E-mailed to "qw@soft.com".  Well, it should have been "ttn@soft.com",
but be assured we got all of the comments and suggestions.  In the
future, please send comments directly to "ttn@soft.com".

========================================================================

          TTN Online Edition -- Mailing List Policy Statement

Some subscribers have asked us to prepare a short statement outlining
our policy on use of E-mail addresses of TTN-Online subscribers.  This
issue, and several other related issues about TTN-Online, are available
in our "Mailing List Policy" statement.  For a copy, send E-mail to
ttn@soft.com and include the word "policy" in the body of the E-mail.

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is E-mailed the 15th of each month to
subscribers worldwide.  To have your event listed in an upcoming issue
E-mail a complete description and full details of your Call for Papers
or Call for Participation to "ttn@soft.com".

TTN On-Line's submittal policy is as follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted non-calendar items should not exceed 350 lines
   (about four pages).  Longer articles are OK and may be serialized.
o  Length of submitted calendar items should not exceed 60 lines (one
   page).
o  Publication of submitted items is determined by Software Research,
   Inc. and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters and TTN-Online disclaims any responsibility for their
content.

TRADEMARKS:  STW, TestWorks, CAPBAK, SMARTS, EXDIFF, Xdemo, Xvirtual,
Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, TCAT-PATH, T-
SCOPE and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To SUBSCRIBE to TTN-ONLINE, to CANCEL a current subscription, to CHANGE
an address (a CANCEL and a SUBSCRIBE combined) or to submit or propose
an article, send E-mail to "ttn@soft.com".

TO SUBSCRIBE: Include in the body of your letter the phrase "subscribe
".

TO UNSUBSCRIBE: Include in the body of your letter the phrase
"unsubscribe ".

                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                          901 Minnesota Street
                   San Francisco, CA  94107 USA

                   Phone:          +1 (415) 550-3020
                   Toll Free:      +1 (800) 942-SOFT (USA Only)
                   FAX:            +1 (415) 550-3030
                   E-mail:         ttn@soft.com
                   WWW URL:        http://www.soft.com

                               ## End ##