+===================================================+
+===================================================+
+======= Testing Techniques Newsletter (TTN) =======+
+======= ON-LINE EDITION =======+
+======= July 1994 =======+
+===================================================+
+===================================================+
TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-Mailed
monthly to support the Software Research, Inc. (SR) user community and
to provide information of general use to the software testing community.
(c) Copyright 1994 by Software Research, Inc. Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.
TRADEMARKS: Software TestWorks, STW, STW/Regression, STW/Coverage,
STW/Advisor, X11 Virtual Display System, X11virtual and the SR logo are
trademarks of Software Research, Inc. All other systems are either
trademarks or registered trademarks of their respective companies.
========================================================================
INSIDE THIS ISSUE:
o Featured Conference: 4th Reengineering Forum
o Featured Conference: 11th Conference on Software Maintenance
o B. Marick: Possible Testing Procedures for OO Developed Systems
o Software Quality Control Journal -- Announcement and Call
o 3rd Conference on Software Process -- Preliminary Program
o ISSTA '94 -- Conference Summary
o Calendar of Events
o TTN Submittal Policy
o TTN Subscription Information
========================================================================
F E A T U R E D C O N F E R E N C E . . .
4TH REENGINEERING FORUM
"Reengineering in Practice"
September 19-21, 1994
Victoria Conference Centre
Victoria, B.C., Canada
The Reengineering Forum (REF) is a combined industry/research review of
the state of the art and the state of the practice. Presentations by
experienced users, leading consultants, developers and researchers
describe the utility and directions of techniques, approaches, products,
and prototypes. It is a meeting place for key people on all aspects of
reengineering:
-- reengineering of business processes,
-- IT transformation to better support the enterprise,
-- reengineering of software, systems, and data,
-- reverse engineering technology.
REF's objective is to form a broad perspective snapshot of the present
state of the field. A major focus of the Forum is to allow ample oppor-
tunity for one-on-one and group discussion among presenters and atten-
dees. Interaction is paramount. The Forum is held as much to be a meet-
ing place for the speakers to compare notes on the future of the field
as it is for attendees to learn more about the state-of-the-art and new
developments.
This year's forum will be the first major conference at which the busi-
ness process reengineering community and the software reengineering com-
munity will meet together to discuss the interaction between these driv-
ing industrial forces.
Also this year, the Reengineering Forum will join with the 11th Interna-
tional Conference on Software Maintenance (ICSM-94) to form "Reengineer-
ing and Maintenance Week", including jointly sponsored tutorials, a sin-
gle tools fair spanning both conferences, and a joint plenary session.
This brings together industry and research communities with similar
interests throughout the week of September 19-23.
September 19 - Tutorials before the Reengineering Forum
September 20 & 21 - Reengineering Forum sessions
September 20,21,22 - Tools Fair
September 22 & 23 - Intl. Conf on Software Maintenance sessions
General Chair: Elliot Chikofsky - DMR Group
Business Process Reengineering: Jon Clark - Colorado State University
Software and Systems Reengineering: James Cross - Auburn University
Exhibits Coordinator: Judith Golub - Software Management News
Tutorials Chair: Shawn Bohner - The Mitre Corporation
The Reengineering Forum is a U.S. 501(c)(6) non-profit industrial
conference. REF 94 is in cooperation with:
-- Software Engineering Institute (SEI)
-- DMR Group
-- USAF Software Technology Support Center (STSC)
-- International Workshop on CASE (IWCASE)
-- Software Management Association (SMA)
-- Software Maintenance News (SMN)
For further information on the 4th Reengineering Forum, contact:
Reengineering Forum phone +1-617-487-9070
c/o DMR Group Inc. fax +1-617-487-6752
404 Wyman Street, Suite 450
Waltham, MA 02154 USA email reengineer@computer.org
========================================================================
ISSTA '94 Program and Information
International Symposium on Software Testing and Analysis
Sponsored by ACM SIGSOFT
Seattle, Washington
August 17-19, 1994
GENERAL INFORMATION
The International Symposium on Software Testing and Analysis is the
latest in a series of leading-edge research conferences that began in
1978 and was known as Testing, Analysis, and Verification (TAV) in the
late 1980s. ISSTA provides a forum for the latest results on program
testing and analysis from academic and industrial research
organizations, and it serves as a meeting place for practitioners and
researchers working in this area.
Location: The Symposium will be held at the Edgewater Inn on the water-
front in Seattle, Washington, USA. Attendance is limited to 125.
PAPERS TO BE PRESENTED
"Automatic Verification of Requirements Implementation," Marsha Chechik,
John Gannon (University of Maryland, College Park)
"Aslantest: A Symbolic Execution Tool for Testing Aslan Formal
Specifications," Jeffrey Douglas, Richard A. Kemmerer (University of
California, Santa Barbara)
"An Automated Tool for Analyzing Completeness of Equational Specifica-
tions," Deepak Kapur (SUNY Albany)
"Generating Test Suites for Software Load Testing," Alberto Avritzer
(AT&T Bell Laboratories), Elaine Weyuker (AT&T Bell Laboratories and New
York University)
"Generating a Test Oracle from Program Documentation," Dennis Peters,
David L. Parnas (McMaster University)
"Forward Computation of Dynamic Program Slices," Bogdan Korel, Satish
Yalamanchili (Wayne State University)
"Applications of Feasible Path Analysis to Program Testing," Allen Gold-
berg, T.C. Wang, David Zimmerman (Kestrel Institute)
"Test Data Generation and Feasible Path Analysis," Robert Jasper, Mike
Brennan, Keith Williamson, Bill Currier (Boeing), David Zimmerman (Z-
Access Consulting)
PANEL: "Empirical Techniques for Assessing Testing Strategies"
Moderator: Elaine Weyuker (AT&T Bell Laboratories and New York
University) Panelists: Richard Carver (George Mason University),
Phyllis Frankl (Polytechnic University), Jeff Offutt (George Mason
University)
"Review of Methods and Relevance for Software Testing," Gregor von Boch-
mann, Alexandre Petrenko (Universite de Montreal) [INVITED PAPER]
"Visualization Using Timelines," Gerald Karam (Carleton University)
"TAOS: Testing with Analysis and Oracle Support," Debra Richardson
(University of California, Irvine)
"TOBAC: A Test Case Browser for Testing Object-Oriented Software," Ernst
Siepmann (Siemens Corporate Research), A. Richard Newton (University of
California, Berkeley)
"Selecting Tests and Identifying Test Coverage Requirements for
Modified Software," Gregg Rothermel, Mary Jean Harrold (Clemson
University)
"Efficient Mutation Analysis: A New Approach," Vladimir Fleyshgakker,
Stewart Weiss (Hunter College, CUNY)
"Confidence Oriented Software Dependability Measurement," W. E. Howden,
Yudong Huang (University of California, San Diego)
"The Incorporation of Testing into Verification: Direct, Modular, and
Hierarchical Correctness Degrees," Leo Marcus (Aerospace Corp.)
"The All Program Functions Criterion for Revealing Computation Errors,"
Istvan Forgacs (Hungarian Academy of Sciences)
"Testing A Safety-Critical Application," John Knight, Aaron Cass,
Antonio Fernandez, Kevin Wika (University of Virginia)
"An Experimental Approach to Analyzing Software Semantics Using Error
Flow Information," Branson Murrill (Virginia Commonwealth University),
Larry Morell (Hampton University)
"Debugging Optimized Code Via Tailoring," Lori Pollock (University of
Delaware), Mary Bivens (Allegheny College), Mary Lou Soffa (University
of Pittsburgh)
"A Meaningful Bound for Branch Testing," Antonia Bertolino (CNR, Pisa),
Martina Marre (Universita di Pisa and Universidad de Buenos Aires)
"State-Space Analysis as an Aid to Testing," Michal Young (Purdue
University)
"An Empirical Evaluation of Three Methods for Deadlock Analysis of Ada
Tasking Programs," James C. Corbett (University of Hawaii, Manoa)
"Testing Races in Parallel Programs with an OtOt Strategy," Suresh
Damodaran-Kamal, Joan Francioni (University of Southwestern Louisiana)
"Analysis of Real-Time Programs with Simple Time Petri Nets," Ugo Buy,
Robert Sloan (University of Illinois)
PANEL: "Views on Software Testability" Moderator: Timothy Shimeall
(Naval Postgraduate School) Panelists: Michael Friedman (Hughes
Aircraft), John Chilenski (Boeing), Jeffrey Voas (Reliable Software
Technologies Corp.)
CONTACT: Dick Hamlet, ISSTA '94
(hamlet@cs.pdx.EDU)
Department of Computer Science
Portland State University
Box 751
Portland, OR 97207 USA
========================================================================
Possible Testing Procedures for OO Developed Systems
Brian Marick
University of Illinois at Urbana
(marick@cs.uiuc.EDU)
(NOTE: This article appeared on the "comp.software.testing" newsgroup
on InterNet and is reproduced here with Dr. Marick's permission.)
Q: I have been hearing stories about how current testing techniques for
systems developed with structured methods do not work for OO developed
systems.
A: I disagree with these stories. Current testing techniques do apply
to object-oriented systems, but they need to be tailored to object-
oriented programming's characteristic features (inheritance, dynamic
binding). Further, because methods tend to be small, the proportion of
interface faults is likely to be higher; that suggests concentrating on
types of testing that target interface faults at the expense of other
types (such as path-based techniques, which are a bad idea even for con-
ventional software). But the techniques are tailored, not invented
afresh (fun though that is).
Here's an example. How do you design tests for this interface, found in
a conventional program?
result = operation(x);
if (FATAL_ERROR == result)
/* bail out */
else
/* operation successful - do next thing */
We know from numerous studies that a common type of fault is the fault
of omission. The above code contains a fault of omission. It should
read
result = operation(x);
if (FATAL_ERROR == result)
/* bail out */
else if (RECOVERABLE_ERROR == result) // oops - forgot this case
/* recover from error */
else
/* operation successful - do next thing */
You will find this fault if you know that operation() has three distinct
return values: fatal error, recoverable error, and success. You'll know
that from its specification (manpage, whatever). With a little prac-
tice, it's usually easy to pick out distinct classes of behavior that
should be tested at any call to a function. (Not all behaviors are
return values, note.) You can check whether the caller handles those
behaviors either by testing or inspections - take your pick. Of course,
everyone's life is simpler if those behaviors are determined once and
documented forever, so everyone doesn't have to think of them afresh
every time. I call that documentation a "test requirement catalog".
It's useful to both testers and programmers (who use it to avoid making
the omissions in the first place).
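The idea can be made concrete with a small sketch. The behavior split
below (by the sign of the argument) and the names are purely
illustrative assumptions, not part of Marick's example; the point is
only that each of operation()'s three cataloged behaviors gets a test
that exercises the caller:

```cpp
#include <cassert>
#include <string>

// Hypothetical operation(): its test requirement catalog lists three
// distinct behaviors (the three return values below).
enum Result { FATAL_ERROR, RECOVERABLE_ERROR, SUCCESS };

Result operation(int x) {
    if (x < 0)  return FATAL_ERROR;        // cataloged behavior 1
    if (x == 0) return RECOVERABLE_ERROR;  // cataloged behavior 2
    return SUCCESS;                        // cataloged behavior 3
}

// The caller under test: a complete test suite drives it through every
// behavior in the catalog, which is what exposes a forgotten case.
std::string caller(int x) {
    switch (operation(x)) {
        case FATAL_ERROR:       return "bail out";
        case RECOVERABLE_ERROR: return "recover from error";
        case SUCCESS:           return "do next thing";
    }
    return "unreachable";
}
```

One test per cataloged behavior (callers with negative, zero, and
positive inputs) would have caught the omitted RECOVERABLE_ERROR branch
in the earlier snippet.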
What are the implications of object-oriented programming? The code
looks like this:
result = x->operation();
More importantly, the actual class of x may not be known until runtime.
Suppose it could be one of class1 or class2. You now need to worry
about the behaviors of two functions: class1::operation() and
class2::operation().
Note that if they have identical behaviors, you don't care which func-
tion is really called; the difference cannot affect the caller. But
that's not usually the case. More typically, the behaviors differ. You
would want to test that the calling code can handle any behavior from
either function. Moreover, when the superclass and subclass have dif-
ferent behaviors for the same set of inputs (bad design, usually), you
need to test if code written expecting the superclass can handle one
behavior where the other was expected. (And so on.)
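A minimal sketch of the situation, with a hypothetical hierarchy (the
class names and the success/error split are assumptions made up for
illustration): the caller is written against the declared class, so its
tests must cover the union of behaviors of every possible actual class.

```cpp
#include <string>

// Hypothetical hierarchy: code is written against the declared class
// Base, but at run time x may actually be a Class1 or a Class2.
struct Base {
    virtual ~Base() {}
    virtual int operation() = 0;
};
struct Class1 : public Base {
    int operation() { return 0; }    // only ever succeeds
};
struct Class2 : public Base {
    int operation() { return -1; }   // can also report an error
};

// The caller must handle any behavior from either operation(), so its
// test requirements are indexed by the possible actual classes of x.
std::string caller(Base &x) {
    return x.operation() < 0 ? "handle error" : "do next thing";
}
```

Testing caller() only with a Class1 would never exercise the error
path; the per-actual-class catalog is what forces a Class2 test in as
well.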
This leads me to create a more complicated test requirements catalog for
a function, one that's indexed by declared class and provides test
requirements for all possible actual classes. (There are actually two
catalogs, one for dynamic binding, but I don't want to go into that.)
The major implications that object-oriented programming has for testing
are, I think, in test management and planning, not in test design tech-
niques. (At least, those are the things that I don't understand at all
well.) Examples:
1. A prototyping style of development is more common / more respectable
in OO systems (which I believe is largely a good thing). How can test-
ing fit in so that it finds bugs as soon as possible, doesn't get in the
way of development, and doesn't turn into a maintenance nightmare down
the road?
2. What I talked about above is (roughly) inheriting testing information
along with code. How is all this information to be managed, kept up to
date, made easy to use, etc? There are lots of reasons, technical and
social, why people don't reuse code; they surely all apply to reuse of
test information - probably more so.
-BEM
========================================================================
SOFTWARE QUALITY JOURNAL
An International Journal of Software Quality Research & Practice
--- CALL FOR PAPERS ---
Starting its third year, the Software Quality Journal, published by
Chapman & Hall, invites contributions from practitioners, academics, and
international policy- and standards-making organizations. The Journal
will consider research, technique, case study, survey and tutorial sub-
missions that address software quality related issues. Papers should be
submitted to:
Warren Harrison Ken Croucher
PSU Center for Software Quality Research 11 Runnymede
Portland State University Giffard Park
PO Box 751 Milton Keynes
Portland, OR 97207-0751 MK14 5QL
USA UNITED KINGDOM
FAX: 503-725-3211 FAX: +44 908 61 5750
warren@cs.pdx.edu
========================================================================
3rd International Conference On The Software Process (ICSP)
10-11 October 1994, Reston, Virginia, USA
Preliminary Conference Program -- Paper Summary
ICSP is sponsored by The International Software Process Association in
cooperation with ACM SIGSOFT and IEEE TC-SE
Thomas J. Allen, Sloan School of Management, MIT (author of "Managing
the Flow of Technology"). (Invited Keynote Address)
Policies and Mechanisms to Support Process Evolution in PSEEs, Sergio
Bandinelli, Elisabetta Di Nitto, Alfonso Fuggetta
A Reflective Approach to Process Model Customization, Enactment and Evo-
lution, Philippe Jamart and Axel van Lamsweerde
Toward Metrics for Process Validation, Jonathan E. Cook and Alexander L.
Wolf
Workshop Report: 9th International Software Process Workshop, Carlo
Ghezzi
Workshop Report: European Workshop on Software Process Technology '94,
Brian Warboys
Workshop Report: Japanese Software Process Workshop
A Collaborative Spiral Software Process Model Based on Theory W, Barry
Boehm and Prasanta Bose
The Personal Process in Software Engineering, Watts S. Humphrey
Mini-Tutorial: Capability Maturity Model, ISO-9000 and Baldrige, Brian
Nejmeh, INSTEP Inc.
Rapid Iteration in Software Process Improvement: Experience Report,
Kathleen Culver-Lozo
Modeling Method for Management Process and Its Application to CMM and
ISO9000-3, Katsuro Inoue, Atsushi Watanabe, Hajimu Iida and Koji Torii
A Case Study in Modeling a Human-Intensive, Corporate Software Process,
Naser S. Barghouti and David S. Rosenblum
Elicit: An Empirically Improved Method for Eliciting Process Models,
Nazim H. Madhavji, Dirke Hoeltje, Won Kook Hong, Tilmann Bruckhaus
Organizational Congestion in Large-Scale Software Development, Karla V.
Ballman and Lawrence G. Votta
Engineering Software Design Processes to Guide Process Execution, Xiping
Song and Leon J. Osterweil
A Conceptual Schema for Process Definitions and Models James W. Armitage
and Marc I. Kellner
Classification of Meta-Processes and their Models, Minh N. Nguyen and
Reidar Conradi
Panel: Are Software Processes Business Processes Too? Chair: Mark
Dowson
For Further Information Contact: Dewayne Perry Tel: +1 908 582 2529
Email: dep@research.att.com
========================================================================
------------------->>> CALENDAR OF EVENTS <<<-------------------
========================================================================
The following is a partial list of upcoming events of interest. ("o"
indicates Software Research will participate in these events.)
+ July 26-28, 1994: Software Quality Management
Edinburgh, SCOTLAND
Contact: Sue Owen
Conference Secretariat,
Wessex Institute of Technology
Phone: 44 (0) 703 293223
========================================================================
--------->>> TTN SUBMITTAL POLICY <<<---------
========================================================================
The TTN On-Line edition is forwarded on the 15th of each month to sub-
scribers via InterNet. To have your event listed in an upcoming issue,
please E-mail a description of your event or Call for Papers or Partici-
pation to "ttn@soft.com".
The TTN On-Line submittal policy is as follows:
o Submission deadlines indicated in "Calls for Papers" should provide
at least a 1-month lead time from the TTN On-Line issue date. For
example, submission deadlines for "Calls for Papers" in the January
issue of TTN On-Line would be for February and beyond.
o Length of submitted items should not exceed 68 lines (one page).
o Publication of submitted items is determined by Software Research,
Inc., and may be edited as necessary.
========================================================================
-------------->>> TTN SUBSCRIPTION INFORMATION <<<--------------
========================================================================
To request a FREE subscription or submit articles, please E-mail
"ttn@soft.com". For subscriptions, please use the keywords "Request-
TTN" or "subscribe" in the Subject line of your E-mail header. To have
your name added to the subscription list for the quarterly hard-copy
version of the TTN -- which contains additional information beyond the
monthly electronic version -- include your name, company, and postal
address.
To cancel your subscription, include the phrase "unsubscribe" or
"UNrequest-TTN" in the Subject line.
Note: To order back copies of the TTN On-Line (August 1993 onward),
please specify the month and year when E-mailing requests to
"ttn@soft.com".
TESTING TECHNIQUES NEWSLETTER
Software Research, Inc.
901 Minnesota Street
San Francisco, CA 94107 USA
Phone: (415) 550-3020
Toll Free: (800) 942-SOFT
FAX: (415) 550-3030
E-mail: ttn@soft.com
## End ##