                      sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= Testing Techniques Newsletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======            October 1998             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), Online Edition, is E-mailed monthly
to support the Software Research, Inc. (SR)/TestWorks user community and
to provide information of general use to the worldwide software quality
and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of TTN-Online provided that the
entire document/file is kept intact and this complete copyright notice
appears with it in all copies.  (c) Copyright 1998 by Software Research,
Inc.


========================================================================

INSIDE THIS ISSUE:

   o  2nd International Quality Week Europe, QWE'98 (9-13 November 1998,
      Brussels, Belgium) -- Complete Technical Program

   o  Testing for Y2K Compliance: A Case Study, by Jim Williams

   o  A Review of the V Model, by Morton Hirschberg

   o  QW'99 Call for Papers

   o  SIG on Quality Assurance Being Formed, by Kenneth D. Shere

   o  Humor: It's About Teaching Math in 199?

   o  International Conference on Software Maintenance, 16-20 November
      1998.

   o  Correction

   o  TTN Submittal Policy

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

         Quality Week Europe (QWE'98) -- Final Technical Program

                Brussels, Belgium -- 9-13 November 1998


Here is the complete QWE'98 Tutorial Schedule and Technical Conference
lineup.  Complete information about QWE'98 is available from the
website:

                  <http://www.soft.com/QualWeek/QWE98>

or by Email from qw@soft.com.

MONDAY TUTORIAL SPEAKERS...

    Dr. Boris Beizer: An Overview of Testing -- Unit, Integration,
        System (A1).
    Dr. Boris Beizer: Testing and Y2K (A2).
    Ms. Suzanne Robertson: Making Requirements Testable (B1) (B2).
    Dr. Gualtiero Bazzana & Dr. E. Fagnoni: Testing Internet/Intranet
        Applications (C1).
    Mr. Thomas Drake: Measuring Quality in Object-Oriented Software
        (C2).
    Mr. Martin Pol: Test Process Improvement (D1) (D2).

TUESDAY TUTORIAL SPEAKERS...

    Mrs. Dorothy G. Graham: Software Inspection (E1).
    Mr. Bart Broekman & Mr. Christiaan Hoos: Test Automation, Eternal
        Struggle or Lasting Benefits? (E2).
    Mr. Robert V. Binder: Modal Testing Strategies for Object-Oriented
        Systems (F1) (F2).
    Dr. Linda Rosenberg & Mr. Ted Hammer: Metrics for Quality Assurance
        and Risk Assessment (G1).
    Dr. John D. Musa: More Reliable, Faster, Cheaper Testing with
        Software Reliability Engineering (G2).
    Mr. Ed Kit: Automating Software Testing and Reviews (H1) (H2).

KEYNOTE SPEAKERS...

    Dr. John D. Musa: Applying Operational Profiles to Testing + ISSRE
        Results Summary (K1).
    Mr. Bill Eldridge: EMU: The Impact on Firms' Global Operations (K2).
    Mrs. Dorothy G. Graham: Inspection: Myths and Misconceptions (K3).
    Mr. David Talbot: EC Commission Actions for Y2K and EURO (K4).
    Dr. Boris Beizer: Nostradamus Redux (K5).

TECHNOLOGY TRACK SPEAKERS...

    Mr. Rene Weichselbaum: Software Test Automation (1T).
    Mr. James Clarke: Automated Test Generation From a Behavioral Model
        (2T).
    Ms. Brigid Haworth: Adequacy Criteria for Object Testing (3T).
    Mr. Bill Bently & Mr. Robert V. Binder: The Dynamic Information Flow
        Testing of Objects: When Path Testing Meets Object-Oriented
        Testing (4T).
    Dr. Denise Woit & Prof. David Mason: Component Independence for
        Software System Reliability (5T).
    Dr. Linda Rosenberg, Mr. Ted Hammer & Ms. L. Hoffman: Testing
        Metrics for Requirement Quality (6T).
    Mr. Hans Buwalda: Testing with Action Words, a Quality Approach to
        (Automated) Software Testing (7T).
    Mr. Jon Huber: Software Defect Analysis: Real World Testing
        Implications & A Simple Model for Test Process Defect Analysis
        (8T).
    Prof. Antonia Bertolino & Ms. E. Marchetti: A Simple Model to
        Predict How Many More Features Will Appear in Testing (9T).
    Dr. Stacy J. Prowell: Impact of Sequence-Based Specification on
        Statistical Software Testing (10T).
    Dr. Matthias Grochtmann & Mr. Joachim Wegener: Evolutionary Testing
        of Temporal Correctness (11T).
    Ms. Martina Marre, Ms. Monica Bobrowski & Mr. Daniel Yankelevich: A
        Software Engineering View of Data Quality (12T).

SOLUTIONS TRACK SPEAKERS...

    Mr. Manuel Gonzalez: System Test Server Through the Web (1S).
    Dr. Istvan Forgacs & Mr. Akos Hajnal: Automated Test Data Generation
        to Solve the Y2K Problem (2S).
    Mr. Felix Silva: Product Quality Profiling: A Practical Model to
        Capture the Experiences of Software Users (3S).
    Mr. Otto Vinter: Improved Requirements Engineering Based On Defect
        Analysis (4S).
    Mr. Robert J. Poston: Making Test Cases from Use Cases Automatically
        (5S).
    Mr. Avi Ziv & Dr. Shmuel Ur: Off-The-Shelf vs. Custom Made Coverage
        Models, Which Is The One For You? (6S).
    Mr. Howard Chorney: A Practical Approach to Using Software Metrics
        (7S).
    Mr. Lionel Briand, Mr. Bernd G. Freimut, Mr. Oliver Laitenberger,
        Dr. Gunther Ruhe & Ms. Brigitte Klein: Quality Assurance
        Technologies for the EURO Conversion -- Industrial Experience at
        Allianz Life Assurance (Panelist) (8S).
    Mr. Jakob-Lyng Petersen: An Experience In Automatic Verification for
        Railway Interlocking Systems (9S).
    Mr. Tom Gilb: Risk Management Technology: A rich practical toolkit
        for identifying, documenting, analyzing and coping with project
        risks (10S).
    Dr. Peter Liggesmeyer, Mr. Michael Rettelbach & Mr. Michael Greiner:
        Prediction of Project Quality by applying Stochastical
        Techniques to Metrics based on Accounting Data: An Industrial
        Case Study (11S).
    Mr. John Corden: Year 2000 -- Hidden Dangers (Panelist) (12S).

MANAGEMENT TRACK SPEAKERS...

    Mr. Leslie A. Little: Requirements Management -- Simple
        Tools...Simple Processes (1M).
    Mr. Nathan Petschenik: Year 2000: Catalyst for Better Ongoing
        Testing (2M).
    Mr. Juan Jaliff, Mr. Wolfgang Eixelsberger, Mr. Arne Iversen & Mr.
        Roland Revesjf: Making Industrial Plants Y2K-ready: Concept and
        Experience at ABB (3M).
    Mr. Staale Amland: Risk Based Testing (4M).
    Mr. Graham Titterington: A Comparison of the IT Implications of the
        Y2K and the EURO Issues (Panelist) (5M).
    Mr. L. Daniel Crowley: Cost of Quality -- The Bottom Line of Quality
        (6M).
    Dr. Erik P. VanVeenendaal: Questionnaire Based Usability Testing
        (7M).
    Mr. Gorka Benguria, Ms. Luisa Escalante, Ms. Elisa Gallo, Ms.
        Elixabete Ostolaza & Mr. Mikel Vergasa: Staged Model for SPICE:
        How to Reduce Time to Market -- TTM (8M).
    Mr. Antonio Cicu, Mr. Domenico Tappero Merlo, Mr. Francesco Bonelli,
        Mr. Fabrizio Conicella & Mr. Fabio Valle: Managing Customer's
        Requirements in a SME: A Process Improvement Initiative Using a
        IT-Based Methodology and Tool (9M).
    Mr. Thomas Drake: The Euro Conversion -- Myth versus Reality!
        (Chairman, Panel Session) (10M).
    Mr. Mark Buenen: Introducing Structured Testing in a Dynamic, Low-
        Maturity Organization (11M).
    Ms. Elisa Gallo, Mr. Pablo Ferrer, Mr. Mikel Vergasa & Chema Sanz:
        SW CMM Level2: The Hidden Structure (12M).

VENDOR TECHNICAL TRACK SPEAKERS...

    Mr. Charles Crawford: Year 2000 and the euro: Compliance Testing and
        Data Management (1V).
    Dr. Edward F. Miller: Remote Testing Technology (RTT) (3V).
    Dr. Luc VanHamme: Results of the ESSI PIE Project OMP/CAST (6V).
    Dr. Boudewijn Schokker: Visions and Tools (7V).
    Dr. Edward F. Miller: WebSite Validation Technology: Assuring E-
        Commerce Quality (8V).
    Mr. Bob Bartlett: Building Re-Usable Test Environments for Y2K and
        EMU / EURO Testing (9V).
    Mr. Ido Sarig: EMU Conversion -- Test Reality Before Reality Tests
        You... (10V).

========================================================================

                Testing for Y2K Compliance: A Case Study

                                   by

                           Jim Williams, CQA
                          US Computer Services
                               CableData

   ABSTRACT:  Automated testing methods are used to confirm Y2K
   compliance for a complex, multi-module, multi-platform application
   under test (AUT).  The 2-year effort resulted in some 40,000 test
   scripts and slightly over 400,000 saved comparison images.
   The results are excellent pay-back from test mechanization and
   greatly enhanced confidence in the AUT.

                              INTRODUCTION

The application under test (AUT) is operated by an international
corporation and primarily includes software for ordering, tracking,
billing, and otherwise processing Cable TV & Telephony technical and
financial information.  The systems under test include a wide range of
applications from customer billing and associated business management to
controllers and line-feed electronics, plus some cell-phone support
systems.  Obviously, for these kinds of applications Y2K compliance is
critical -- interruptions of service and other potential problems have
very high economic impact to CableData's business.

There are about 150 applications that run under UNIX on IBM RS/6000
machines, and about 110 applications that run on a Tandem machine.  The
total number of lines of code is in excess of several million -- i.e.
something over 2500 KLOC.  Both sets of applications are accessible from
PCs -- work-stations that connect to the RS/6000 or Tandem machine
clusters via on-board terminal emulators.  Almost all of the
applications produce results that are seen as responses on the PCs.

                          Y2K TEST METHODOLOGY

The main method used for Y2K compliance certification is the
conventional data-aging method: the application is tried with time-
intervals that vary in length over a series of dates before, near to,
just after, and much later than 31 December 1999.

To get results in this way, you have to "age the database".  We
developed special software just for this purpose; it maintains a
coherent database of information, aged to the desired date against
which the Y2K tests are actually run.
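
To make the aging step concrete, here is a minimal sketch in C of the
underlying idea (illustrative only; the proprietary aging tool itself
is not shown in this article).  Every stored date is shifted by the
same offset, so relationships between records are preserved while the
whole database moves to the target date:

    #include <stdio.h>
    #include <time.h>

    /* Shift a stored date by the same offset applied to the whole
       database, so relationships between records survive while
       "today" moves to the chosen Y2K target date.               */
    static time_t age_date(time_t stored, time_t today, time_t target)
    {
        return stored + (time_t)difftime(target, today);
    }

    int main(void)
    {
        struct tm t = {0};
        time_t target, now, record, aged;

        t.tm_year = 99; t.tm_mon = 11; t.tm_mday = 31; /* 12/31/1999 */
        t.tm_isdst = -1;
        target = mktime(&t);
        now    = time(NULL);

        /* A record dated 30 days ago is still 30 days in the past
           after aging; only the frame of reference has moved.     */
        record = now - 30L * 24 * 60 * 60;
        aged   = age_date(record, now, target);
        printf("aged record date: %s", ctime(&aged));
        return 0;
    }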

SQL Relational Databases are used in both products, so a special tool
was used to verify and validate all SQL activity to a known baseline
after each suite of tests had been run.

Some of the software that is used in the system has to be Y2K compliance
checked first.  This is done to make sure that any defects or anomalies
that are found in the specific AUT tests are caused by fixable errors,
rather than by errors in vendor supplied code such as databases,
operating systems, system utilities, etc.

The date changes that were used include the following seven dates:

      09/09/99        Four nines (sometimes 9/9/99)!

      12/31/99        THE date in question!

      01/01/00        The day after THE date in question.

      02/29/00        Yes, a leap year!

      03/01/00        The day after the prior date.

      12/31/00        A year after THE date in question.

      01/01/01        All Ones!

We did two additional dates -- bringing the total to nine -- just to
see what would happen:

      12/31/37        UNIX Dates OK.

      01/01/38        UNIX Dates maybe not OK.

Note that the UNIX dates are included in the testing to attempt to
conclude that the systems are OK for the time when the 32-bit signed
UNIX integer count of seconds since 1 January 1970 "turns over".
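
The boundary condition is easy to reproduce.  The C sketch below is
ours, not part of the test suite described here; it shows that a
32-bit signed time_t actually runs out on 19 January 2038 at 03:14:07
UTC, so 01/01/38 still converts but sits close enough to the edge to
be a worthwhile probe:

    #include <limits.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        struct tm t = {0};
        time_t limit, probe;

        /* INT_MAX seconds after 1 January 1970 is where a 32-bit
           signed time_t ends: 19 Jan 2038, 03:14:07 UTC.          */
        limit = (time_t)INT_MAX;
        printf("32-bit rollover: %s", asctime(gmtime(&limit)));

        /* Probe one of the dates above: 01/01/2038 still converts,
           but it sits only about 18 days short of the rollover.   */
        t.tm_year = 138; t.tm_mon = 0; t.tm_mday = 1; t.tm_isdst = -1;
        probe = mktime(&t);
        if (probe != (time_t)-1 && probe <= (time_t)INT_MAX)
            printf("01/01/38 fits: %ld seconds\n", (long)probe);
        else
            printf("01/01/38 does not fit in 32 bits\n");
        return 0;
    }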

                            THE TEST SCRIPTS

The test scripts are many in number and very complex.  They were only
partially generated manually. Most of them were "constructed" from
pieces that were engineered to be put together.  This is a kind of
tree-oriented test script generation that we were able to use with great
success.
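
One possible reading of this construction (a guess at the mechanism;
the article does not show the generator itself): each node of the
tree holds a script fragment, and every root-to-leaf path,
concatenated, yields one complete test script.  A compact C sketch
with hypothetical fragment names:

    #include <stdio.h>

    #define MAXDEPTH 16

    /* Sketch of tree-oriented generation (not the authors' tool):
       each node carries a script fragment; each root-to-leaf path,
       written out in order, is one complete test script.          */
    struct node {
        const char  *fragment;
        struct node *child[2];        /* small fan-out for brevity */
    };

    static const char *path[MAXDEPTH];

    static void gen(const struct node *n, int depth)
    {
        int i;
        if (n == NULL || depth >= MAXDEPTH)
            return;
        path[depth] = n->fragment;
        if (n->child[0] == NULL && n->child[1] == NULL) {
            printf("--- one generated script ---\n");
            for (i = 0; i <= depth; i++)
                printf("%s\n", path[i]);
            return;
        }
        gen(n->child[0], depth + 1);
        gen(n->child[1], depth + 1);
    }

    int main(void)
    {
        struct node pay   = { "enter payment 100.00", { NULL, NULL } };
        struct node bill  = { "print customer bill",  { NULL, NULL } };
        struct node acct  = { "open account 12345",   { &pay, &bill } };
        struct node start = { "login operator QA1",   { &acct, NULL } };

        gen(&start, 0);   /* emits two scripts from four fragments */
        return 0;
    }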

All automated testing scripts were validated using a "Path Coverage"
tool, obtaining an average Path Coverage of 80%.

For the RS/6000 AIX testing we used a total of 22,000 test scripts that
resulted in a set of 280,000 different "screen shot" images that are
used as the basis for comparison.  In some cases we had to program
the differencing engine to ignore unimportant differences.  Even
though this was a lot of work at the outset, it saved hundreds of
hours of unnecessary work as we ran the tests.
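
A masked comparison of this kind might look like the following C
sketch (again illustrative; the production differencing engine is not
shown).  The mask marks "don't care" cells such as clock fields, and
only unmasked cells count as differences:

    #include <stdio.h>
    #include <string.h>

    #define ROWS 24
    #define COLS 80

    /* Compare two 24x80 terminal screens cell by cell, skipping any
       position the mask marks as "don't care" (clock fields, session
       ids, etc.).  Returns the count of unmasked cells that differ. */
    static int screen_diff(char a[ROWS][COLS], char b[ROWS][COLS],
                           unsigned char mask[ROWS][COLS])
    {
        int r, c, diffs = 0;
        for (r = 0; r < ROWS; r++)
            for (c = 0; c < COLS; c++)
                if (!mask[r][c] && a[r][c] != b[r][c])
                    diffs++;
        return diffs;
    }

    int main(void)
    {
        static char baseline[ROWS][COLS], current[ROWS][COLS];
        static unsigned char mask[ROWS][COLS];

        memset(baseline, ' ', sizeof baseline);
        memset(current,  ' ', sizeof current);
        memcpy(current[0], "10:32:07", 8);  /* a clock field differs */
        memset(mask[0], 1, 8);              /* ...but it is masked   */

        printf("unmasked differences: %d\n",
               screen_diff(baseline, current, mask));  /* prints 0 */
        return 0;
    }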

On the Tandem side we have a total of around 18,000 scripts and about
124,000 screen images.  As with the AIX testing we used some special
processing to make the job of comparison easier.

The scripts varied in length from 50-60 lines (some of the simplest
tests) to over 12,000 lines (the most complex tests).

About 40% of the tests -- mostly the smaller ones -- were generated by
hand, but the 60% that were mechanically generated probably account for
80%+ of the total volume of tests.

                           RUNNING THE TESTS

The tests were run from a cluster of 12 PCs.  Typically we would run
tests for 12 hours a day, 7 days a week.  Even though we had 12
machines to drive the tests, we sometimes used as few as 4 and only
infrequently had to use all 12.  A good estimate is that we used about
8 of the 12 PCs for each set of test runs.

The AIX side suite of tests took about 7 days to run to completion, and
the Tandem side tests took just less than 10 days to execute.

Remember however, there is significant rework time when a test uncovers
one or more Y2K deficiencies.  We estimate that this is an additional
10%-20% overall.

Each test suite has been run a total of fourteen (14) times for each
platform: 7 executions of the suite, one for each of the seven
principal dates, before the Y2K modifications, and the corresponding 7
executions after the AUTs had been modified.

We have had as many as 13 people on the test team, but the average is
about 7 for the approximately two years it has taken to build, debug,
run and re-run the tests.

                      RESULTS AND RECOMMENDATIONS

Our results have been very, very good so far.  We have had no major
problems and only a few minor problems detected due to the Y2K
conversion and certification modifications.

We did find some collateral defects -- some cases relate to dates and
times but are not critical for Y2K reasons.  And, the vast majority of
the Y2K changes we made were confirmed without difficulty.

Overall, we found and fixed around 100 of these non-Y2K "discoveries".

In conclusion, we believe we have very high confidence that our
systems will sail through the Y2K transitions without difficulty,
making it very easy for our Customers to start the next millennium
with the utmost confidence in our software.

========================================================================

                        A Review of the V Model
                                   by
                           Morton Hirschberg

Introduction

The author is the Technical Project Officer (TPO) for the Data Exchange
Agreement for Software Technology between the United States and Germany.
Alternating yearly, the respective TPO of each country hosts a technical
review where mutual generic software accomplishments are discussed and
plans for further technical exchanges and interactions are determined.

It was in this capacity that the author became aware of the German
software standards for the German Federal Armed Forces, known as the V
Model (V for "Vorgehensmodell", German for "process model").  The
standards, which are quite comprehensive, are published in three
volumes.  Each volume comprises a set of related directives and
guidelines.

The standards, which can be tailored to fit officially sponsored
work, have also been published in less formal versions suitable for
use by any German business or academic institution.  Such use is both
encouraged and encountered.  Finally, the formal standards are
published in an English version.

It is the author's intention to introduce the standards to the
readership as briefly as possible.  The intent is to give a flavor of the
standards and encourage the reader to learn more about the standards
used by our ally.  Moreover, good and usable features may be culled for
use in our standards and directives.

It is not the author's intention to contrast the V Model with U.S.
standards and directives, nor to comment about their use by the German
Federal Armed Forces.  The characterizations below, such as strengths,
are loosely quoted from summaries written by Herr Fritz Haertel, one of
the architects of the V Model.  (See Reference at end of article.)  -M.
Hirschberg, September 1998

                              The V Model

The V Model is a series of General Directives (250, 251, and 252) which
prescribe/describe the Procedures, Methods to be Applied, and the
Functional Requirements for Tools to be Used in developing software
systems for the German Federal Armed Forces.  The remainder of this
paper discusses each in turn.

General Directive 250.  August 1992.  Software Lifecycle Process Model.

The objective of this directive is to regulate the software development
process by means of its uniform and binding set of activities and
products which are required during software development and its
accompanying activities.  Use of the V Model helps to achieve: 1)
improvement and warranty of software, 2) reduction of software costs for
the entire lifecycle, and 3) improvement of communications among the
different parties as well as the reduction of the dependence of the
customer upon the contractor.  It deals with procedure, or what is to
be done; methods, or how it is to be done; and tool requirements, or
with what it is to be done.  Its main advantage is that it can be generally
applied.  The V Model fits into the international picture by fulfilling
requirements of NATO standards, ISO 9000 technical standards, and the
structure of the EURO-METHOD.  None of these is discussed here but could
be featured in subsequent articles.

The V Model organizes all activities and products and describes
activities and products at different levels of detail.  In the V Model,
products take on one of four states: 1) planned, 2) being processed, 3)
submitted, or 4) accepted.  Planned is the initial state of all
products.  Being processed is either in private development or under the
control of the developer.  Submitted means it is completed and now ready
for quality assessment.  It can return to the processing stage if
rejected or advance to accepted for release.  While seemingly
prescriptive, the V Model allows for tailoring throughout the product
lifecycle.  That is one of its strengths.
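
The four product states form a small state machine.  The C sketch
below is our reading of the transitions just described, not code from
the Directive; it makes the quality-assessment loop explicit:

    #include <stdio.h>

    /* Product states per Directive 250: submitted work that fails
       quality assessment drops back to "being processed".         */
    enum state { PLANNED, PROCESSING, SUBMITTED, ACCEPTED };

    static enum state next(enum state s, int qa_passed)
    {
        switch (s) {
        case PLANNED:    return PROCESSING;             /* work begins  */
        case PROCESSING: return SUBMITTED;              /* ready for QA */
        case SUBMITTED:  return qa_passed ? ACCEPTED    /* released     */
                                          : PROCESSING; /* rejected     */
        default:         return ACCEPTED;               /* final state  */
        }
    }

    int main(void)
    {
        enum state s = PLANNED;
        s = next(s, 0);         /* PLANNED    -> PROCESSING */
        s = next(s, 0);         /* PROCESSING -> SUBMITTED  */
        s = next(s, 0);         /* QA rejects -> PROCESSING */
        s = next(s, 0);         /* resubmit   -> SUBMITTED  */
        s = next(s, 1);         /* QA passes  -> ACCEPTED   */
        printf("final state: %d (ACCEPTED)\n", (int)s);
        return 0;
    }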

The V Model is composed of four submodels: 1) Software Development, 2)
Quality Assurance, 3) Configuration Management, and 4) Project
Management.  The submodels are closely interconnected and mutually
influence one another by exchange of products and results.  Software
Development develops the system or software.  Quality Assurance submits
requirements to the other submodels and test cases and criteria to
assure the products and the compliance of standards.  Configuration
Management administers the generated products.  Project Management
plans, monitors, and informs the other submodels.  Each of these
submodels can be further decomposed.  For instance, the Software
Development submodel can be broken down as follows:  system requirements
analysis and design, data processing requirements analysis and design,
software requirements analysis, preliminary design, detailed design,
implementation, software integration, data processing integration, and
system integration.

The Directive contains very detailed information (rules) on the
activities of each submodel, showing the product flow, handling, and,
if warranted, recommendations.  As an example, if one were in the planning
stage, the product flow is from an external specification to a submitted
state.  Handling consists of several steps: organization planning, cost
and scheduling, resource planning, and risk considerations.  There may
also be peculiarities that need to be addressed.  Recommendations that
might be considered are:  use of matured resources and an experienced
staff, correct membership participation in cost and planning scheduling,
consideration of alternative problem solutions, how to handle unexpected
events, and costs for management activities and coordination activities.
Occasionally, the Directive also includes further explanations.

It should be noted that prototyping can be (is) used to verify and
detail requirements.  Prototyping allows for early completion, as an aid
in refining requirements, feasibility, and testing.

Each Directive has a set of appendices containing definitions, a list of
abbreviations, a list of illustrations, a bibliography, a
characterization of the roles of the products, a list of activities to
be performed, a list of products, an index and annexes.

                     Directive 250 has Two Annexes.

Annex 1.  The purpose of Annex 1 is to provide explanations of the
application of the V Model.  It is to support the user and is not of a
binding nature.  The objective of the V Model is to submit the products
created during software development, maintenance, and modification to
certain standards.  This is to guarantee a minimum quality of the
results and to make it easier to control the product stages from
requirements definition to the final product itself.  The V Model offers
support as a guide, as a checklist, and for the quality goal definition.
The V Model allows for tailoring, defines required products, and
establishes criteria for assessment.

Two kinds of applications have basically been intended for the V Model
-- as a basis for contracts and as guidance for development.  The V
Model makes provisions for the use of non-developmental items and
commercial off-the-shelf (COTS) software.  It also provides for
Information Technology projects.

Annex 1 also identifies which elements may be deleted from the model.

Annex 2 is an Explanation of the Products.  It deals with reports and
software to be produced.  This includes requirements, architectures, and
design.  It covers user's, diagnostic, and operator's manuals.

It too is broken down by submodel: Software Development, Quality
Assurance, Configuration Management, and Project Management.

       General Directive 251.  September 1993.  Methods Standard.

The objective of this standard is to set down all the tasks and results
of the software development process.  Standardization is done on three
levels:  procedure, methods to be applied, and functional requirements
on tools to be used.  While the V Model answers "What is to be done",
the Methods Standard answers, "How it is to be done."

Over 30 basic methods or categories of methods are listed in the
standard.  These are, for example, bar chart, tree diagram, decision
table techniques, E(ntity)/R(elationship) modeling, normalization,
object design technique, simulation models, and structured design.

The Methods Standard includes allocation tables listing those basic
methods that are best suited to realize certain activities or to develop
products according to the latest state of the art and by observing the
following criteria: quality improvement, economy, and maintainability.
For each method referenced in the allocation tables, the standard
describes the features an applied method must include to reach the
standard.  In many instances a complex method may be required.  This may
represent a well defined combination of several basic methods.

Basic methods really refer to procedures that describe a special,
limited aspect of a system, such as functionality, data orientation,
analysis, or preliminary design, or one of the activities -- quality
assurance, configuration management, or project management.  Complex
methods usually cover several aspects of a system.

Basic methods must be applied unless limiting conditions make the
application of the method impractical or explicit arguments are made
against the method or for an alternative method in a special case.

Each method listed includes the following information: identification/
definition, characteristics, limits, specification, interfaces, and a
list of references.  The Methods Standard is not meant to be a methods
manual.  With regard to tools, a method may be applied in different
versions depending upon the manufacturers.  For this reason, tool
independent definitions are set up.

Allocation tables exist for software development, quality assurance,
configuration management, and project management.

Modification to the Methods Standard can be made by a Change Control
Board which meets annually and is comprised of members from industry and
government.

In addition to the main part of the Methods Standard, there are two
annexes.

Annex 1 provides an explanation of the methods; the method interfaces,
including a characterization of the interface, an example of the
interface, tool support, and relevant literature; and a description of
the methods.  The explanation of methods contains information about
technical and operational software inspection and walkthroughs.

Annex 1 specifically talks about object design technique and
configuration management.  It further contains a section on estimation
models (e.g. function point method, constructive cost model or COCOMO),
simulation models (e.g. continuous and discrete {e.g. time-controlled,
event driven, activity oriented, process oriented, or transaction
oriented}), system behavior models (e.g. Petri nets, state charts,
specification and description language), and reliability models (e.g.
statistical {e.g. Okumoto, execution time, Logarithmic Poisson,
Jelinski-Moranda, and Schick and Wolverton}, and error seeding).
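
As a concrete instance of the estimation models listed, the basic
COCOMO effort equation is easy to state.  The constants below are
Boehm's published organic-mode figures from the COCOMO literature,
not values taken from the Methods Standard:

    #include <math.h>
    #include <stdio.h>

    /* Basic COCOMO, organic mode: effort in person-months from
       size in KLOC.  E = 2.4 * KLOC^1.05 (Boehm's published
       constants; the Methods Standard itself is tool-neutral). */
    static double cocomo_effort(double kloc)
    {
        return 2.4 * pow(kloc, 1.05);
    }

    int main(void)
    {
        /* e.g. a 50 KLOC organic project: about 146 person-months */
        printf("50 KLOC -> %.1f person-months\n", cocomo_effort(50.0));
        return 0;
    }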

Annex 2 is a help when applying complex methods in connection with the
software development standard.  This annex describes the most important
methods for application in German national projects.  The methods listed
are:

      (1) GRAPES - Graphical Engineering System
      (2) IEM - Information Engineering Method
      (3) ISOTEC - Integrated Software Technology
      (4) PERFORM - The quality management system of the CAP Gemini Group
      (5) SA & SA/RT - Structured Analysis and SA with Real Time Extensions
      (6) SDL - Specification and Design Language
      (7) SEtec - Software Engineering Technology
      (8) SSADM - Structured Systems Analysis and Design Method

For each of the above, a brief description, a tabular comparison with
the basic methods, a specification of the allocation, and relevant
literature are given.

 General Directive 252.  September 1993.  Functional Tool Requirements
                   (Standardized Criteria Catalogue).

The goal of this Standard is to constrain the variety of applied methods
and tools that can be employed during the software lifecycle.  While the
V Model answers, "What is to be done" and the Methods Standard answers,
"How it is to be done", the Functional Tool Requirements answers, "With
what it is to be done?"

The Standard increases the guarantee for software quality (higher
quality products, lower risk), minimizes software cost for the entire
lifecycle, improves communication among the different parties, and
reduces dependence of the customer on the contractor.  The latter is
accomplished through its recommendations, focused approach, and
required prescriptions.

The Standard introduces the Software Development Environment (SDE)
Reference Model where SDE is defined as the totality of all technical
resources utilized for professional software development.  A tool is
defined as a software product supporting the generation or maintenance
or modification of software.

The structure of the SDE Reference Model is: user interface, work flow
management, security and integrity requirements, software development,
quality assurance, configuration management, project management, and
object management.  The description of the fundamental units or
criteria, or as they are referred to in the Standard, service units, are
laid out as follows: allocation to the V Model and Methods Standards,
brief characteristics, and finally, requirements.

The Reference Model puts all the technical data processing services
offered into a basic schema.  In all, 58 service units are defined and
explained.  A service unit can cover exactly one method or several
methods.  It should be noted that a method can be covered by one or more
service units or not covered at all.  In addition, there may be other
requirements that are not based on a method.  Finally, the Methods
Standard may not suggest a method for the item under consideration.

Some examples of service units are:  from the user interface - help
functions; from software development - generating data bases, compiling,
and debugging; from quality assurance - static assessment of the code;
and from project management - cost estimation.

An example of a service unit schema, for cost estimation, is as
follows.  Allocation: planning, detailed planning, and estimation
models.  Brief characteristics: requirements on tools to support cost
estimation on the basis of already available empirical values from
earlier projects, project-specific marginal conditions, and
assumptions about future developments.  Requirements: granularity,
input and output interfaces to other service units, estimation models
for fixed and variable costs, and other requirements such as an
experience data base.

             The Standard has an Appendix and two annexes.

An important part of the Appendix is the relationship between the V
Model and the European Computer Manufacturers Association (ECMA)
Reference Model.  The services in the ECMA Reference Model are: object
management, user interface, process management, policy enforcement, and
communication.  Tools per se are not further specified in the ECMA
Reference Model.  There is no strict one to one correspondence between
the V Model and ECMA Reference Model.

Finally, Annex 1 supports the user in his work with functional tool
requirements by means of tabular overviews and application scenarios.
The latter covers the definition of the functional requirements on a
tool, the selection of tools for setting up an SDE, tool evaluation, and
the tool environment in a customer/contractor relationship.

Annex 2 is used as an introduction into the basics of SDE data
management and also to offer an overview of standards for the
integration of tools with regard to data, control information, and user
interface.

Data management is handled through the use of data models.  The real
world is first portrayed in a conceptual design from which a logical
design of relevant features is developed.

Annex 2 provides definitions of a data dictionary, repository, and
development data base.

Finally, the Annex deals with Standards.  Not all requirements can be
met by a single tool, so an SDE is only possible if tools can be
integrated into a uniform environment.  Such integration has three
aspects: data integration, control integration and uniform user
interface.  The concentration is on data integration.

Several standardization efforts in the DP industry are discussed,
including those of the Object Management Group (OMG).

                             Model Summary

In summary, the V Model, Methods Standard, and Tool Standard present
complete coverage of the functional areas (Software Development,
Quality Assurance, Configuration Management, and Project Management),
provide concrete support, are sophisticated yet flexible and balanced,
have a wide spectrum, and are publicly controlled under the
supervision of a Change Control Board.  Improvements as well as
corrective changes are handled through the Control Board.

Their advantages are improved communication among project members,
uniform procedures, the guarantee of a better product, productivity
increases, better choice of methods, adaptability, reduced risk,
lowered training costs, and decreased maintenance.

                               Conclusion

It is my hope that the Models presented can serve as both a catalyst
and a framework for discussion of Standards Methodologies for the
DoD.  It should be noted that the German Ministry of Defense, while
similar to the DoD, is much more homogeneous.  Perhaps this is a major
contributor to the success of their V Model.

                               Reference

The V Model is described (in German) at the URL:

                     <http://www.v-modell.iabg.de>

IABG, the contractor for the German Ministry of Defense, maintains
the site.

========================================================================

        TWELFTH INTERNATIONAL SOFTWARE QUALITY WEEK 1999 (QW'99)

                   CALL FOR PAPERS AND PRESENTATIONS

                  Conference Theme: Facing the Future

          San Francisco Bay Area, California -- 24-28 May 1999

QW'99 is the twelfth in the continuing series of International Software
Quality Week Conferences that focus on advances in software test
technology, reliability assessment, software quality processes, quality
control, risk management, software safety and reliability, and test
automation.  Software analysis and verification methodologies and
processes, supported by automated software analysis and test tools,
promise major advances in system quality and reliability.

The mission of the QW'99 Conference is to increase awareness of the
entire spectrum of methods used to achieve software quality.  QW'99
provides technical education, with opportunities for practical
experience exchange, for the software development and testing community.

The QW'99 theme "Facing the Future" draws attention to the impact of
the Y2K and EURO conversion/verification problems on the entire
software quality area.  The aim is to focus attention on finding the
right things to do for software quality in the coming decade.

The QW'99 program consists of two days of pre-conference tutorials,
followed by a three-day conference including Mini-Tutorials, Quick-Start
talks, Panel Sessions, and regular Technical Presentations.  QW'99
provides the Software Testing and QA/QC community with:

      o  Carefully chosen 1/2-day and full-day tutorials from well-known
         technical experts
      o  Three-Day Four-Track (Technology, Applications, Process, Tools
         & Solutions) Technical Conference
      o  Special Quick-Start and Mini-Tutorial Sessions
      o  Two-Day Vendor Show/Exhibition
      o  Vendor Technical Presentations and Demonstrations
      o  Analysis of method and process effectiveness through case
         studies
      o  Meetings of Special Interest Groups
      o  Exchange of critical information among technologists
      o  State-of-the-art information on software test methods

QW'99 is soliciting 45- and 90-minute presentations, half-day and
full-day standard seminar/tutorial proposals, 90-minute mini-tutorial
proposals, and proposals for participation in panel and "hot topic"
discussions on any area of testing and automation, including:

      Application of Formal Methods
      Automated and Manual Inspection Methods
      CMM/PMM Process Assessment
      Data Flow Testing Technology
      Defect Tracking / Monitoring
      GUI Test Technology and Test Management
      Integrated Test Environments
      ISO-9000 Application and Methods
      New and Novel Test Methods
      Object Oriented Testing
      Process Assessment/Improvement
      Productivity and Quality Issues
      Real-Time Software
      Real-World Experience
      Reliability Studies
      Software Metrics in Test Planning
      System Load Generation and Analysis
      Test Automation Technology and Experience
      Test Data Generation Techniques
      Test Documentation Standards
      Test Management Automation
      Test Policies and Standards
      Web Testing/WebSite Quality
      Year 2000 Issues

IMPORTANT DATES:

      Abstracts and Proposals Due:            18 December 1998
      Notification of Participation:          20 February 1999
      Camera Ready Materials Due:             31 March 1999

FINAL PAPER LENGTH:

      Papers should be limited to 10-20 pages, including Text, Slides
      and/or View Graphs.

SUBMISSION INFORMATION:

      Abstracts should be 2-4 pages long, with enough detail to give
      members of QW'99's International Advisory Board an understanding
      of the final paper/presentation, including a rough outline of its
      contents.  FAX your proposal to us, or send it by Email to
      qw@soft.com as an ASCII file, a Microsoft Word 6.0 document (as
      a MIME attachment), a PostScript file, or a PDF file.  Please
      indicate whether the most likely audience is
      technical, managerial/process, applications, or tools and
      solutions oriented.

      In addition, please include:
         o  A cover page with the paper title, complete mailing and
            Email address(es), and telephone and FAX number(s) of each
            author.
         o  A list of keywords describing the paper contents.
         o  A brief biographical sketch of each author.

      Send abstracts and proposals including complete contact
      information to:

            Ms. Rita Bral
            Quality Week '99 Director
            Software Research Institute
            901 Minnesota Street
            San Francisco, CA  94107 USA

INFORMATION

      For complete information on the QW'99 Conference, send Email to
      qw@soft.com, phone SR Institute at +1 (415) 550-3020, or, send a
      FAX to SR/Institute at +1 (415) 550-3030.

      Candidate product/service vendors should contact the QW'99 team
      early as exhibit space is strictly limited.

      Complete information about QW'99 is available at the QW'99
      Conference WebSite:

              <http://www.soft.com/QualWeek/QW99>

========================================================================

                 SIG On Quality Assurance Being Formed

I am in the process of trying to organize a special interest group on
Quality Assurance for the Software Engineering Technical Council of the
IEEE Computer Society.  If there is anybody in your organization who is
interested in helping, please ask them to contact me.

                          Dr. Kenneth D. Shere
                              703-633-5331
                        kenneth.d.shere@aero.org

========================================================================

                HUMOR: IT'S ABOUT TEACHING MATH IN 199?

Teaching Math in 1950: A logger sells a truckload of lumber for $100.
His cost of production is 4/5 of the price.  What is his profit?

Teaching Math in 1960: A logger sells a truckload of lumber for $100.
His cost of production is 4/5 of the price, or $80.  What is his profit?

Teaching Math in 1970: A logger exchanges a set "L" of lumber for a set
"M" of money. The cardinality of set "M" is 100. Each element is worth
one dollar. Make 100 dots representing the elements of the set "M".  The
set "C", the cost of  production, contains 20 fewer points than set "M."
Represent the set "C" as a subset of set "M" and answer the following
question: What is the cardinality of the set "P" for profits?

Teaching Math in 1980: A logger sells a truckload of lumber for $100.
Her cost of production is $80 and her profit is $20.  Your assignment:
Underline the number 20.

Teaching Math in 1990: By cutting down beautiful forest trees, the
logger makes $20.  What do you think of this way of making a living?
Topic for class participation after answering the question:  How did the
forest birds and squirrels feel as the logger cut down the trees? There
are no wrong answers.

Teaching Math in 1996: By laying off 40% of its loggers, a company
improves its stock price from $80 to $100.  How much capital gain per
share does the CEO make by exercising his stock options at $80? Assume
capital gains are no longer taxed, because this encourages investment.

Teaching Math in 1997: A company out-sources all of its loggers. The
firm saves on benefits, and when demand for its product is down, the
logging work force can easily be cut back.  The average logger employed
by the company earned $50,000, had three weeks vacation, a nice
retirement plan and medical insurance. The contracted logger charges $50
an hour.  Was outsourcing a good move?

Teaching Math in 1998: A laid-off logger with four kids at home and a
ridiculous alimony from his first failed marriage comes into the
logging-company corporate offices and goes postal, mowing down 16
executives and a couple of secretaries, and gets lucky when he nails a
politician on the premises collecting his kickback.  Was outsourcing the
loggers a good move for the company?

Teaching Math in 1999: A laid-off logger serving time in Folsom for
blowing away several people is being trained as a COBOL programmer to
work on Y2K projects.  What is the probability that the automatic cell
doors will open on their own at 00:00:01 on 01/01/00?

========================================================================

    International Conference on Software Maintenance Advance Program
                Bethesda, Maryland, November 16-20, 1998

        Theme: COTS Application and Component-Based Maintenance

As the next century approaches, systems are increasingly composed of
components. Emerging techniques such as commercial-off-the-shelf (COTS)
software packages, Mega-Reuse, and COTS components will alter the
practices of software maintainers. While traditional maintenance has, in
large part, been a microcosm of development practices, COTS software
package life cycles bring a different set of challenges for the
maintenance community.

COTS packages have life cycles for both their vendors and the
organizations that purchase them for integration into their applications
development. COTS components and applications increase the uncertainties
that software maintainers face.

Adding to the challenge, latent Year 2000 (Y2K) problems are likely to
reverberate as existing systems are transitioned into the next century.
Maintaining software systems in the 21st Century world of component-
based software engineering will be challenging and exciting.  ICSM'98 is
a forum for exploring the implications of COTS applications, component-
based maintenance, Year 2000, and more.

ICSM'98 attendees have the opportunity to attend three other conferences
that are being held back-to-back with ICSM'98 and at the same location:

IEEE High-Assurance Systems Engineering Symposium HASE'98 (Nov. 13-14):
                    <http://www.isr.wvu.edu/hase98/>

Workshop on Empirical Studies in Software Maintenance - WESS'98 (Monday,
Nov. 16),
                 <http://www.cs.umd.edu/~sharip/wess/>

International Symposium on Software Metrics - Metrics'98 (Nov. 20-21)
                <http://aaron.cs.umd.edu:80/metrics98/>

========================================================================

                               CORRECTION

Due to a production error some issues of TTN-Online for September 1998
that were Emailed around 16-17 September 1998 were incorrectly labeled
as the "August 1998" edition.  If you received such a copy please make
the appropriate correction.  We apologize for any inconvenience.

========================================================================


========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN Online Edition is E-mailed around the 15th of each month to
subscribers worldwide.  To have your event listed in an upcoming issue
E-mail a complete description and full details of your Call for Papers
or Call for Participation to "ttn@soft.com".

TTN On-Line's submittal policy is as follows:

o Submission deadlines indicated in "Calls for Papers" should provide at
  least a 1-month lead time from the TTN On-Line issue date.  For
  example, submission deadlines for "Calls for Papers" in the January
  issue of TTN On-Line would be for February and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK and may be serialized.
o Length of submitted calendar items should not exceed 60 lines (one
  page).
o Publication of submitted items is determined by Software Research,
  Inc.; items may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters; TTN-Online disclaims any responsibility for their content.

TRADEMARKS:  STW, TestWorks, CAPBAK, SMARTS, EXDIFF, Xdemo, Xvirtual,
Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, TCAT-PATH, T-
SCOPE and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To SUBSCRIBE to TTN-Online, to CANCEL a current subscription, to CHANGE
an address (a CANCEL and a SUBSCRIBE combined) or to submit or propose
an article, use the convenient Subscribe/Unsubscribe facility at
<http://www.soft.com/News/TTN-Online>.  Or, send E-mail to
"ttn@soft.com" as follows:

   TO SUBSCRIBE: Include in the body the phrase "subscribe {your-E-
   mail-address}".

   TO UNSUBSCRIBE: Include in the body the phrase "unsubscribe {your-E-
   mail-address}".

               TESTING TECHNIQUES NEWSLETTER
               Software Research, Inc.
               901 Minnesota Street
               San Francisco, CA  94107 USA

               Phone:          +1 (415) 550-3020
               Toll Free:      +1 (800) 942-SOFT (USA Only)
               FAX:            +1 (415) 550-3030
               E-mail:         ttn@soft.com
               WWW:            <http://www.soft.com/News/TTN-Online>

                               ## End ##