sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======            August 2001              =======+

QUALITY TECHNIQUES NEWSLETTER (QTN) is E-mailed monthly to subscribers
worldwide to support the Software Research, Inc. (SR), TestWorks,
QualityLabs, and eValid user communities and other interested parties,
and to provide information of general use to the worldwide internet and
software quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN provided that the entire
document/file is kept intact and this complete copyright notice appears
with it in all copies.  Information on how to subscribe or unsubscribe
is at the end of this issue.  (c) Copyright 2003 by Software Research,
Inc.


                         Contents of This Issue

   o  QWE2001: 5th Internet and Software Quality Week Europe

   o  Testing Compatibility with IE6, by Tim Van Tongeren

   o  eValid -- An Integrated Suite of WebSite Test Tools

   o  Typical Browser Usage Statistics

   o  An Effective Guide for Implementing Software Configuration
      Management, by Magesh M.

   o  Configuration Management for Distributed Development, by Nina
      Rajkumar

   o  QTN Article Submittal, Subscription Information


         QWE2001: 5th Internet and Software Quality Week Europe

                           Technical Program

                           T U T O R I A L S

         Monday, 12 November 2001, 9:00 - 12:00 -- AM Tutorials

Dr. Gualtiero Bazzana (ONION, S.P.A.)  "Web Testing Master Class (A1)"

Mr. Martin Pol, Mr. Ruud Teunissen (Polteq) "Stepwise Improvement of the
Test Process Using TPI (B1)"

Mr. Rex Black (RBCS, Inc.)  "Managing the Testing Process: Organization,
Motivation and Techniques (C1)"

Dr. Erik P. vanVeenendaal (Improve Quality Services BV) "Introduction to
the Test Maturity Model (D1)"

Mr. John McGregor (Korson-McGregor) "Testing Component-Based Systems
(E1)"

        Monday, 12 November 2001, 14:00 - 17:00 -- PM Tutorials

Dr. Gualtiero Bazzana (ONION, S.P.A.)  "Web Testing Master Class (A1)"

Mr. Bernard Homes (TESSCO Ltd.)  "Test Plan 101 Using IEEE 829 (B2) "

Mr. Ross Collard (Collard and Co.)  "Web Testing: A Practical Approach
(C2)"
Mr. Otto Vinter "Introduction to Defect Analysis (D2)"

Mr. John McGregor (Korson-McGregor) "Guided Inspections (E2)"

        Tuesday, 13 November 2001, 9:00 - 12:00 -- AM Tutorials

Mr. Erik Simmons (Intel Corporation) "Writing Good Requirements (F1 F2)"

Mr. Jens Pas (I2B) "Emotional Intelligence as the Key to Software
Testing (G1)"

Mr. Robert Sabourin "Getting Started - Stressing Web
Applications, Stress Early - Stress Often (H1)"

Mr. Ross Collard (Collard & Co.)  "Test Estimating (J1)"

Mr. Ibrahim K. El-Far (Florida Institute of Technology) "Model-Based
Testing (K1)"

        Tuesday, 13 November 2001, 14:00 - 17:00 -- PM Tutorials

Mr. Erik Simmons (Intel Corporation) "Writing Good Requirements (F1 F2)"

Mr. Olivier Denoo (ps_testware) "Think Green! Think Different! Think
Modular! (The LEGO Principle) (G2) "

Mr. Robert Sabourin "Just In Time Testing - Testing
Turbulent Web Based Applications (H2)"

Mr. Martin Pol & Ruud Teunissen (Polteq) "Structured Testing (J2)"

Dr. Rini vanSolingen (CMG (RTSE1)) "Practical Measurement of Software
Products and Processes: The Goal/Question/Metric Method (K2)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

                   T E C H N I C A L   P R O G R A M

               Wednesday, 14 November 2001, 8:30 - 10:00

                           KEYNOTE SESSION #1

Dr. Linda Rosenberg (GSFC NASA) "Independent Verification And Validation
Implementation At NASA (QW2001 Best Presentation) (K11)"

Ms. Elfriede Dustin (BNA Software) "Challenges of Quality Web Systems
(K12)"

               Wednesday, 14 November 2001, 11:00 - 17:00
                       Parallel Technical Tracks


Jamal Said, Mr. Eric Steegmans (Department of Computer Science, K.U.
Leuven) "Validating Quality Requirements of Object Oriented Design (1T)"

Ms. Janet A. Gill, Dr. Frederick Ferguson (NAVAIR, Software Safety-
Critical Systems, Inc.)  "A Tool for the Design & Analysis of Software
Safety-Critical Systems (2T)"

Mr. Oliver Niese, Tiziana Margaria, Prof. Bernhard Steffen (METAFrame
Technologies GmbH) "Automated Functional Testing of Web-Based
Applications (3T)"

Ms. Nancy Eickelmann, Mr. Allan Willey (Motorola Labs) "An Integrated
System Test Environment (4T)"

Mr. Tobias Mayer (eValid, Inc.)  "InBrowser WebSite Testing: The
Client-Side Approach (5T)"


Mr. Jon M. Ibarra (SQS S.A., Spain) "QA Success Story for Embedded
Systems in Real Time Control Systems (1A)"

Dr. Rainer Stetter (ITQ GmbH & Software Factory GmbH) "Steps to Bring
the V-Model Into Real Life: A Case Study (2A)"

Mr. Ibrahim K. El-Far, Ms. Florence E. Mottay, Mr. Herbert H. Thompson
(Florida Institute of Technology) "Experience in Testing Pocket PC
Applications (3A)"

Dr. Ina Schieferdecker, Mang Li, Axel Rennoch, Dorota Witaszek (GMD
FOKUS) "Testing of CORBA Products (4A) "

Mr. Jerrold Landau (IBM Canada) "A Case Study -- Testing Ramifications
of Product Suite Evolution from GUIs to Components (5A)"


Ms. Julia G. Rodriguez & Luis O. Santos (Extremadura University)
"Providing Automated Support for Web Metrics (1I)"

Mr. Kim Davis (My Virtual Model, Inc.)  "Practical Experiences in Bug
Cluster Management (2I)"

Ms. Nancy Landau (Alltel Technology Services) "Performance Testing
Applications In Internet Time (3I)"

Mr. Raymond Rivest (Computer Research Institute of Montreal) "Challenges
of Automating Performance Tests for New Internet Technologies (4I)"

Mr. Andy Crosby (Mercury Interactive UK) "Testing Inside and Outside the
Firewall (5I)"


Mr. Stale Amland, Martin Pol (Amland Consulting) "Test Process
Improvement -- Theory and Practice (1M)"

Mr. Tor Stalhane (NTNU) "Improving the Software Estimation Process (2M)"

Mr. Michael J. Hillelsohn (Software Performance Systems, Inc.)
"Organizational Performance Engineering: Proactive Quality Assurance for
the Internet Age (3M)"

Mr. Ton Dekkers (IQUIP Informatica BV) "Quality Radar: Getting Grip on
Customer Expectations (4M)"

Mr. Geert Pinxten & Jens Pas (I2B) "Variable Test Strategy: Learn To
Only Do What You Need To Do (5M)"

               Thursday, 15 November 2001, 8:30 - 10:00

                           KEYNOTE SESSION #2

Mr. Rik Nuytten (Cisco Systems Belgium) "Building the Infrastructure for
The Future (K21)"

Mr. Bob Bartlett (SIM Group) "Power Testing (K22)"

               Thursday, 15 November 2001, 11:00 - 17:00
                       Parallel Technical Tracks


Dr. Stacy Prowell (The University of Tennessee) "Tool Support for Model
Based Statistical Testing (6T)"

Ms. Mihaela Barreau, Mr. Jean-Yves Morel, Alexis Todoskoff (University
of Angers) "State-of-the Art Information on Petri Nets Applied to
Software Quality (7T)"

Dr. Christian Bunse, Dr. Oliver Laitenberger (Fraunhofer Institute for
Experimental Software Engineering) "Improving Component Quality Through
the Systematic Combination of Construction and Analysis (8T)"

Mr. Anders Claesson (Enea Realtime AB) "How to Use Scientific Methods In
Software Testing (9T)"

Mr. Francisco Vega, Ms. Begona Laibarra (Alcatel SEL, SQS SA)
"Systematic Validation of an Interlocking System (10T)"


Chris C. Schotanus (CMG Finance BV) "Structuring Your Tests in a
Component Based Environment (6A)"

Ms. Nadine Pelicaen (ps_testware) "Performance Testing: 'Step On It'
(7A)"

Mr. Daniel Blezek, Timothy Kelliher, William Lorensen, James Miller (GE
Corporate R&D) "The Frost Extreme Testing Framework (8A) "

Mr. Leo VanDerAalst "Testing Challenges of Incremental Component Based
Development (9A)"

Mr. Erik Simmons (Intel Corporation) "Product Triage: A Medical Approach
to Predicting and Monitoring Product (10A)"


Mr. Paul McBride (VeriTest) "Deployment of Globalised Wireless Internet
Applications (6I)"

Dr. Edward F. Miller (eValid) "Innovative Website Mapping Tool (7I)"

Mr. Scott Jefferies (Starbase Corporation) "A Requirements-Based
Approach to Delivering E-Business and Enterprise (8I)"

Mr. Robert A. Martin (The MITRE Corporation) "Vulnerabilities and
Developing for the Net (9I)"

Mr. Simon J. Hardiman (SQS S.A., Spain) "Multifaceted Internet
Application Quality Validation Methodology (10I) "


Mr. Michael J. Hillelsohn (Software Performance Systems) "Using SPICE as
an Internal Software Engineering Process Improvement Tool (6M)"

Mr. Dean Hanley (Computer Associates) "Process Management Maturity (7M)"

Mr. Anders Claesson (Enea Realtime AB) "A Risk Based Testing Process
(8M)"

Mr. Rob Baarda (IQUIP Informatica BV) "Risk Based Test Strategy (9M)"

Dr. Robert Darimont, Ms. Emmanuelle Delor, Mr. Andre Rifaut (CEDITI)
"Quality Starts by Defining Goals (10M)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

                Friday, 16 November 2001, 9:00 - 11:00
                       Parallel Technical Tracks


Mr. Eric Patel, Mr. Jim Bampos (Nokia & Lionbridge) "Virtual Test
Management: Rapid Testing Over Multiple Time Zone (11T)"

Dr. Peter Liggesmeyer "Condition Coverage Techniques in Theory and
Practice (12T)"


Mr. Kie Liang Tan (CMG TestFrame, Software Testing and Test Centre)
"Configuration Management in Test Centers (11A)"

Sridhar Narayanan (Cognizant Technology Solutions) "Metrics Cockpit
Means of Viewing (12A)"


Dr. Klaus Quibeldey-Cirkel (TLC GmbH) "Checklist for Web Site Quality
Assurance (11I)"

Mr. Eric Patel (Nokia) "Rapid SQA: Web Testing at the Speed of the
Internet (12I)"


Mr. Ruud Teunissen (Gitek NV) "The Art of Managing Fixed Price Test
Project (11M)"

Per Kroll (Rational Software) "Survival Guide for Applying a Software
Development Process (12M) "

                Friday, 16 November 2001, 11:00 - 12:00

                          KEYNOTE SESSION #3

Mr. Erik Simmons (Intel Corporation) "Requirements to Release Criteria
(K31)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

                     QWE2001 ADVISORY BOARD MEMBERS

             Selim Aissi (USA) - Gualtiero Bazzana (Italy)
             Boris Beizer (USA) - Antonia Bertolino (Italy)
                Juris Borzovs (Latvia) - Rita Bral (USA)
           Sylvia Daiqui (Germany) - Olivier Denoo (Belgium)
                Tom Drake (USA) - Bernard Homes (Canada)
           Andre Kok (Netherlands) - Monique Legault (France)
          Franco Martinig (Switzerland) - Edward Miller (USA)
             Michael O'Duffy (Ireland) - Jens Pas (Belgium)
           Emilia Peciola (Sweden) - Martin Pol (Netherlands)
            Raymond Rivest (Canada) - Linda Rosenberg (USA)
           Rob Sabourin (Canada) - Henk Sanders (Netherlands)
          Torbjorn Skramstad (Norway) - Harry Sneed (Germany)
         Bernhard Steffen (Germany) - Stephan Steurs (Belgium)
      Ruud Teunissen (Belgium) - Rob VanderPouwKraan (Netherlands)
                         Otto Vinter (Denmark)

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

            R E G I S T R A T I O N   I N F O R M A T I O N

Complete registration with full information about the conference is
available on the WWW at either of these URLs:


where you can register on-line.

We will be pleased to send you a QWE2001 registration package by E-mail,
postal mail or FAX on request.  Send your E-mail requests to:


or FAX or phone your request to SR/Institute at the numbers below.

          QWE2001: 12-16 November 2001, Brussels, Belgium  EU

| Quality Week Europe Registration  | Phone:       [+1] (415) 861-2800 |
| SR/Institute, Inc.                | TollFree (USA):   1-800-942-SOFT |
| 1663 Mission Street, Suite 400    | FAX:         [+1] (415) 861-9801 |
| San Francisco, CA 94103 USA       | E-Mail:     |
|                                   | Web: |


                     Testing Compatibility with IE6
                          by Tim Van Tongeren

With Microsoft's public beta release of Internet Explorer 6.0, users are
already starting to hit web servers with the new browser version. While
developers have probably not incorporated any new functionality in
applications, testers will want to make sure that the existing web
application is compatible with IE6. This new version of IE is expected
to ship with the Windows XP operating system in Fall 2001.

                        SETTING UP A TEST CLIENT

When setting up a test client with IE6, the minimum requirements are a
486/66 MHz processor with 16MB of RAM and 153MB of hard drive for Win98,
32MB of RAM and 134MB of hard drive for WinME, 32MB of RAM and 123MB of
hard drive for WinNT4.0 Service Pack 6.0a, or 32MB of RAM and 134MB of
hard drive for Win2000. That's right, kids - IE6 does not support Win95
or any NT Service Pack lower than 6.0a.

Download Internet Explorer 6 Public Beta:

                         USER INTERFACE CHANGES

The main changes in the user interface that could affect your web
application are image resizing, the image toolbar and the
"MSThemeCompatible" META tag.

Version 6 of the IE browser automatically resizes images that are too
large for the browser window. Next to the image the browser will add a
button for the user to resize the picture. Pages should be tested to
ensure that the browser still formats everything properly. There may be
formatting changes when image sizes are decreased, when the button is
added to manage the image size or when the browser is manually resized.

Another change with graphics is that the browser will display an image
toolbar when the user floats the mouse over an image. Developers can
turn this off with a META tag; however, existing sites will not have
this tag and will display the toolbar by default.  Sites that use
several images - for example, a menu on the side of the page with an
image for each button - might become quite annoying to use if the image
toolbar is not turned off.
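As a sketch of what a page-level switch looks like (the imagetoolbar
value is the IE-specific mechanism Microsoft documented for this
feature; verify the exact syntax against MSDN before relying on it):

```html
<!-- Suppresses the IE6 image toolbar for this page -->
<meta http-equiv="imagetoolbar" content="no">
```

Testers can exercise a page with and without this tag to see whether
the toolbar appears over images.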

Using the "MSThemeCompatible" META tag, the web application can follow
the same style as the user's desktop; however, this option defaults to
No. Your current web applications will not change unless the META tag
is added. If the META tag is added and set to Yes, the tester should
verify that the site is still usable. If you are testing a Federal web
application that falls under Section 508 of the Rehabilitation Act, you
may want to raise this as an issue, since the page could be less usable
for color-blind users.
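For reference, the opt-in looks roughly like this (the tag name comes
from the article; the attribute placement follows standard META syntax
and should be checked against Microsoft's documentation):

```html
<!-- Opting the page into the user's desktop theme (defaults to No) -->
<meta http-equiv="MSThemeCompatible" content="yes">
```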

                             COOKIE CHANGES

IE6 requires web servers to give more information when working with
cookies. Following W3C guidelines for P3P, the browser requires that the
web server has a privacy profile called p3p.xml on the root directory of
the server. If the web server does not have this file, the default
settings in the browser will cause the cookie to be rejected.
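A minimal sketch of how a site might advertise its privacy policy to
IE6. The LINK element and the /w3c/p3p.xml well-known location come
from the W3C P3P specification; the compact-policy tokens mentioned in
the comment are placeholders, not a recommendation:

```html
<!-- Pointing the browser at the site's P3P policy reference file.
     IE6 also evaluates a compact policy sent as an HTTP response
     header, e.g.  P3P: CP="..."  where the tokens must describe the
     site's actual cookie practices. -->
<link rel="P3Pv1" href="/w3c/p3p.xml">
```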

Additionally, the browser distinguishes between first party and third
party cookies. A first party cookie is a cookie from the domain the user
is viewing, while a third party cookie is from a different domain. Third
party cookies may be sent by message board applications, counters,
shopping carts, or other applications that are outsourced in the web
application.

The user is also given more choices for cookie management, like
acceptance per domain, acceptance per party type and leashing. Leashing
a cookie means that the 3rd party cookie will only be accessible through
the 1st party. Legacy cookies (cookies stored by previous versions of
IE) are defaulted to leashed.

Web testers should verify that web applications work properly with the
new cookie functionality. Any application using cookies should ensure
that its server has a privacy profile. (The privacy profile will also
need to be added to the test server.) Web testers may find problems
with logins and integration with third party applications. Testers
should also verify that the web application can accept the
cookie-related transactions back from the client.

                            PLUG-IN CHANGES

With IE6, the browser changed its support for plug-ins. All extra
functionality will be run through ActiveX. (This change was also
recently added to IE 5.5 SP2.) This is especially of interest to
testers working on plug-in applications. For example, Apple had to
release an ActiveX control to enable QuickTime to be used in IE6. If
your web application is a plug-in, you will need to verify whether it,
too, needs to be changed.

This could also affect web applications that use plug-in technology.
Sticking with the QuickTime example, pages that currently have embedded
QuickTime content will probably need to change the EMBED tag to an
OBJECT tag. More details can be found at Apple's website:
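A hedged sketch of the change (movie.mov is a hypothetical file name;
the classid shown is the identifier Apple published for its QuickTime
ActiveX control, and should be verified against Apple's documentation):

```html
<!-- Before: plug-in style embedding, which IE6 no longer supports -->
<embed src="movie.mov" width="320" height="240">

<!-- After: ActiveX-style embedding for IE6 -->
<object classid="clsid:02BF25D5-8C17-4B23-BC80-D3488ABDDC6B"
        width="320" height="240">
  <param name="src" value="movie.mov">
  <param name="controller" value="true">
</object>
```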

Testers should verify that any other plug-in content on their web
application is still accessible using IE6. If plug-in content does not
work, development may be necessary.


As you can see, IE6 could operate differently for some web applications.
New versions of browsers (Netscape) and Operating Systems (Windows XP)
are coming soon, as well. The job of the web tester is never done; with
updates to browsers we must constantly keep up to date on changes to the
user experience.


                               SOURCES

Fernicola, Pablo. "Highlights of Internet Explorer 6 Public Preview."
MSDN webpage.

Hansen, Evan. "IE upgrade cuts off QuickTime." CNet, Aug 15, 2001.

Goldfeder, Aaron. "Internet Explorer 6 Privacy Features." Online
presentation at


          eValid -- A Comprehensive WebSite Test Environment

One of the main criteria used in building eValid was to have a system
focused very narrowly and very specifically on all aspects of WebSite
testing.  eValid is intended to enable you to do everything you need to
do for client-side analysis, mapping, testing, validation, tuning, and
loading of WebSites.

The eValid WebSite Test Environment Solution available today (for
Windows NT/2000) includes:

  * Operation entirely inside a fully functioning IE-equivalent browser.
  * Complete site mapping and dynamic filtering from a spider within
    the browser.
  * Complete functional & regression testing support with advanced
    object-oriented validation modes.
  * LoadTest operation featuring automatically launched multiple
    independent browsers for totally realistic loads.
  * Page timing and tuning from within the browser to 1.0 msec
    resolution.
  * Integrated Java Applet test coverage analysis (with TCAT/Java).
  * Automatic script creation and generation capability (test data
    generation).
  * Complete facilities for monitoring and measuring WebSites constantly
    and automatically.
  * Test suite management facilities.
  * FREE Page Metrics PopUp to measure individual page statistics.

eValid is one tool family, with one easy-to-use interface, one focus,
one supplier, and complete interoperability among its multiple
components.

Try out eValid and see for yourself! You can download an instant 3-day
Auto-EVAL license from:

A complete description of eValid Ver. 3.0 is at:

Complete information on eValid is at:
  <> or <>.

Or write directly for an EVAL license to .

| License Generation Group          | Phone:       [+1] (415) 861-2800 |
| eValid, Inc.                      | Tollfree (USA):   1-800-942-SOFT |
| 1663 Mission Street, Suite 400    | FAX:         [+1] (415) 861-9801 |
| San Francisco, CA 94103 USA       | Email: |
|                                   | WWW: |


                    Typical Browser Usage Statistics

Since we can't all afford expensive reports on the latest browser
statistics, some sites offer their stats for free, such as the
University of Illinois.  The actual reference is:

            Data for: Sunday 26 Aug 2001

            Browser Platforms                Hosts     %
              1. Windows                       4803  70.5
              2. Windows 2K (NT5)               867  12.7
              3. Macintosh                      276   4.0
              4. Windows NT                     255   3.7

            Browser Flavors                  Hosts     %
              1. Microsoft                     5502  80.7
              2. Netscape                      1022  15.0
              3. other                          293   4.3

            Netscape Versions
            Version                           Hosts     %
              1. v4.5+                          794  77.7
              2. v4                              90   8.8

            Microsoft Explorer Versions
            Version                           Hosts     %
              1. v5                            4967  90.3
              2. v4                             325   5.9

"These are browsers used by 6817 hosts in 57 countries making 75062
accesses on 26/Aug/2001 to the Engineering Workstations WWW server at
the University of Illinois at Urbana-Champaign. The lion's share of
accesses to this server are for engineering student home pages."


          Configuration Management for Distributed Development
                           By Nina Rajkumar.

Think Business Networks Pvt. Ltd., July 2001.  All rights reserved.
You may make one attributed copy of this material for your own personal
use.  For additional information or assistance please contact Nina at

                              Introduction

Configuration Management (CM) is an "umbrella" activity that is applied
throughout the software engineering process.  CM includes synchronizing
and supporting developers in their common development and maintenance
of a system.  In order to utilize skilled personnel regardless of
geographical location, groups of developers are now working all over
the world on the development of common systems, a situation called
distributed development.  This paper presents different cases and
architectures with respect to distributed development and their demands
on Configuration Management tools.  It also presents the features of
some of the currently available CM tools that support distributed
development.

                 Configuration Management: The Concept

Configuration Management (CM) is a discipline within software
engineering whose aim is to control and manage projects and to help
developers synchronize their work with each other.  This is achieved by
defining methods and processes to follow, making plans, and using CM
tools that help developers and project leaders with their daily work.
CM has, during the past years, been considered increasingly important
for several reasons.  One reason is the influence of the well-known SEI
Capability Maturity Model (CMM), which identifies CM as a key process
area for achieving level 2 on its 5-level scale.  Another important
reason is the fact that software is getting larger and more complex and
needs the support of CM.

                  Distributed Development: A Scenario

A significant segment of today's software industry is moving toward a
model of project organization that involves the use of multiple
engineers at multiple sites working on a single software system or set
of highly interdependent software systems.  One example of this is the
trend that developers are becoming more dispersed while still working
on the same system.  CM tools were originally developed under the
assumption that the people as well as the files are situated at the
same geographical location.  When this assumption no longer holds, new
demands arise, and tools and processes need to be reevaluated.
Some aspects, such as file access, have been partly solved by supporting
server replication of the files.  Other aspects, such as the creation of
a general picture and a context, and the communication between
developers and groups, have remained manual and without direct tool
support.  When people working closely together are geographically
dispersed, we need to consider how these additional aspects may be
supported in the work method and in tool support.

Irrespective of the reason, many companies have found that the methods
and tools they use do not fully support their current situation, and
that the situation will be even worse in the near future.  Therefore
they now intend to develop such support.  We can also see a trend where
tool vendors focus
develop such support.  We can also see a trend where tool vendors focus
more on support for distributed development and for teams in general.
Regardless of the Team size and geographical distribution, software
development teams need a powerful software solution that allows them to
work together efficiently and productively.

                    Cases of Distributed Development

The different cases that have been identified are:

   > Distance working
   > Outsourcing
   > Co-located groups
   > Distributed groups

The above cases occur individually or in combinations.  For instance,
there may be groups which are normally connected but which may
occasionally be distributed.

                             Distance Working

Distance working is brief work performed somewhere other than the usual
place of work - for example, working from home.  When developers work
at home (or elsewhere) on a more regular basis or for longer periods of
time, a situation similar to that for distributed groups arises.
Limited computing capacity and a relatively slow means of communication
with the outside world are characteristic of distance working.  Despite
this, there is a desire to be able to start working quickly, as the
total working time on each occasion is short (typically a few hours in
the evening), which means that it must be possible to set up the
working environment quickly.  Typically, the developer logs in remotely
to the place of work, with the home computer used as a terminal.  In
this case of distributed development, the developer's work should be in
sync with the work done by other people on the same files.  This relies
completely on the CM tool used.


                              Outsourcing

Instead of developing everything by yourself or buying existing
components, you can engage a third party to develop them for you.  This
is usually called outsourcing and gives a greater degree of control
over the development of the component, albeit at a higher price.

1. Outsourcing is based on a close collaboration between the supplier
   and the purchaser.

2. Consequently it is often possible for the supplier to test the
   component in an environment similar to the target environment prior
   to delivery.  The purchaser then usually provides the test
   environment.

3. The purchaser is ultimately responsible for the product and possible
   error/change management can be reflected in changed demands on the
   component towards the supplier.

4. As with any order, it must be clear what should be delivered, but in
   this case it is further complicated by the fact that the demands as
   well as the environment may change.  In this case of distributed
   development, the purchaser must be able to integrate new versions of
   the component into the product, which itself may have developed since
   the latest release of the component.

5. The supplier should be able to manage the updating of the
   component.

                           Co-located Groups

Developers at different affiliated companies usually belong to local
groups or projects.  The division of the work has already been
determined at the structuring of the project/product to prevent too much
dependency between the different groups.  The product is divided up into
sub-products, which can be developed by different project groups.  This
division makes it possible to do most of the development locally within
the groups without the requirement of much communication with other
groups.  Within the group and between groups in the same place, the
situation is the same as with local development.  Groups in different
places normally only have access to the latest stable versions produced
by the other groups.  Due to the geographical distance, potential
problems will inevitably be more difficult to solve.  Updating and
distribution between the groups therefore require more effort and
administration; these may be considered internal deliveries and
therefore tend to happen more infrequently.  Cooperation between the
groups may be facilitated if the work is planned in phases of which
everyone is aware.  In this case, when the locations are permanent,
each local group should be able to work within a complete development
environment, with the possibility of testing.  In addition, change
management of common components, such as interfaces, is of particular
importance.
                           Distributed Groups

Distributed groups with members at different locations means that the
members of the group are also distributed, i.e. that the people working
in the same project, perhaps even in the same files, are geographically
dispersed.  The possibility of daily communication by formal as well as
informal meetings is lost.  Projects working towards the same product
usually use some common libraries or components.  Changes in these are
unusual, but sometimes inevitable.  If group members at different places
want to make changes simultaneously, they face a situation similar to
that for the updating of interfaces where there are "connected groups"
but in this case the problems apply to all files.  The obvious example
is when people included in one group have to travel around to other
groups for various reasons.  Of course there is a desire to be able to
continue working on the usual project; this will then be done as a
distributed group.  A similar situation arises when staff are moved to
new projects but often need to be consulted on the old project.

From a CM perspective, it is important that the members of the group
receive information about what the others in the group are doing, how
the project is developing, its status, which changes have been done and
by whom etc.  It is important to support the division of files and
concurrent, simultaneous changes.  Solutions using "locking" and
exclusive access to files work poorly, as it is difficult to solve
situations where group members, located at different sites, must wait
for each other.


             Architectures for Distributed Development

We have seen the different cases of distributed development.  To meet
the demands arising from these different situations, we can locate
workstations and repository servers in these places in different
architectures.  In this case a geographic place equipped with
repository/server is called a "site".  Developers with workstations but
without a repository/server are therefore not a site.  Some CM tools for
Distributed Development are based on these architectures.

The different architectures being discussed are:

  > Remote login
  > Several sites by Master-Slave connections
  > Several sites with differing areas of responsibility
  > Several sites with equal servers

                              Remote Login

Simply put, everyone logs into and uses a single server.  Those situated
at locations other than where the server is located, log into the server
by remote login, telnet, or other similar protocols.  Technically a
developer then works as if situated locally but is limited by a slower
(and possibly less reliable) connection, for instance over a modem or
the Internet, in the same way a laptop can be remotely connected to the
server.  Some of the product's files are copied and then worked on
locally.  The CM tool should support updating and synchronization of
the files.

               Several sites by Master-Slave connections

A version of a connected sub-system is copied from a master server to
another (slave) server, where it is further developed.  A CM tool
supporting this architecture eliminates complicated merge situations.
This situation may occur with out-sourcing, for instance.

          Several Sites with differing areas of responsibility

Different sites are responsible for different sub-systems.  The
division can be based on responsibility for certain files.  The variant
concept must be the same for all of the files on the servers.  Parts
that a site is not responsible for can only be read there.
Synchronization is achieved by transferring changes made to the
original over to the copies; how this is done depends on the protocols
supported by the CM tool.  Synchronization is often done automatically
and at close intervals.  Note that a site can hold the original of one
sub-system and at the same time hold copies of others, which means that
updating can occur in both directions between servers holding the
originals of different sub-systems.  Compared to the master-slave
architecture, this is a more permanent division, and the
synchronization is usually automatic and therefore more frequent.
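
The one-way transfer from the owning site to its read-only copies can
be sketched as follows (an illustrative Python sketch, not any CM
tool's actual mechanism; the directory layout is an assumption):

```python
import shutil
import stat
from pathlib import Path

def sync_subsystem(original: Path, replica: Path) -> None:
    """Transfer a sub-system from the site that owns it (the original)
    to a site that may only read it (the replica).

    The replica is refreshed wholesale and its files are then marked
    read-only, mirroring the rule that non-owning sites must not
    modify the files.
    """
    if replica.exists():
        shutil.rmtree(replica)          # discard the stale copy
    shutil.copytree(original, replica)  # transfer the current state
    for f in replica.rglob("*"):
        if f.is_file():
            # read-only for everyone: only the owning site may change
            f.chmod(stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
```

Since each site can own some sub-systems and replicate others, in
practice this routine would run in both directions between each pair
of servers, once per sub-system.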

                    Several Sites with Equal Servers

This is an architecture in which several equal servers are located at
different sites.  They are automatically synchronized at close
intervals (hours, minutes, or even seconds), so all of the servers hold
(with very little delay) the same information.  The result is that a
developer can work at any site (against the server at that site)
without noticing a difference.


The architectures discussed in this paper can serve as a guide when
planning the introduction of distributed development, and as a basis
for analyzing the consequences and limitations of different solutions.
None of these architectures is easy to use until there is complete tool
support.  We conclude that some CM tools of today support distributed
groups extensively, but there is still scope to develop CM tools that
enable highly interactive distributed development.


Software Engineering: A Practitioner's Approach, by Roger S. Pressman.

A Generic, Peer-to-Peer Repository for Distributed Configuration
Management, by Andre van der Hoek, Dennis Heimbigner, and Alexander L.
Wolf.

Configuration Management for Distributed Development, Lund Institute of

Change Sets Revisited and Configuration Management of Complex Documents
by Stephen A. MacKay.

Successfully Managing the Complexities of Today's Distributed
Development, by Continuus Software Corporation


 An Effective Guide for Implementing Software Configuration Management

                              By Magesh M.
                Think Business Networks Pvt. Ltd. INDIA

            Team Based Development for a Small Organization
                   Software Configuration Management

Whatever the size of a software organization, the goal is, and always
will be, a reliable, bug-free, well-performing product that complies
with the customers' requirements.  A larger organization achieves this
by having a configuration management department, separate from the
Quality Assurance team, to see that the product is well managed and
controlled.  A project should be controlled and managed at every point
to ensure quality.  In smaller organizations, however, such a separate
department is impossible, and the quality engineer must also play the
role of configuration manager.  Under such circumstances, how does one
come up with a suitable configuration management system?  This article
answers that question: it introduces the concept of team development
and explains how keeping a project under control achieves it.

              What is Team Development? -- The Definition

Software configuration management is the mechanism, provided by tools,
for managing a software application.  Team development describes the
end result that these tools, together with defect-tracking tools,
should help us accomplish.  A product cannot be completely managed and
controlled just by controlling software versions and configurations.
What is needed is a way of managing the complexity of software projects
while empowering team members to work together effectively.  In a
nutshell, a team development solution should:

      >> Track changes to all project components
      >> Support parallel development (branches, differences,
              merges, distributed teams)
      >> Control the entire project and its evolution in time
              (releases and variants)
      >> Manage the approval of changes (promotion process)
      >> Manage builds and dependencies

       Software Configuration Management in Small Organizations
                            What do we lack?

A small organization will typically have a version control tool in
place, most probably a first-generation, file-based version management
tool.  It might also have a defect-tracking tool.  The quality
assurance group plays the roles of build/installation engineer,
configuration engineer, test engineer, and process engineer.  There may
be no defined process for configuration management, and if one exists
it will not be base-lined.  Nor will there be metrics to establish the
success of the process.  In other words, such organizations lack the
historical data that is essential to prove a process's effectiveness.

                             How to Improve

The building blocks of functionality that should comprise a team
development solution are:

>>  Version Control
>>  Workspace and Release Management
>>  Build Management
>>  Process Management
>>  User Interfaces
>>  Architecture
>>  Integrations and Tool Interfaces

Although all of these features may look reasonable and necessary, no
single tool includes them all.  These shortcomings can be overcome by
briefly studying and surveying the environment and requirements of the
organization before choosing products.  The list above is a pool of
functionality that can be matched against those requirements to pick
out what is relevant.  For example, having a first-generation version
tool might at first seem a setback for configuration management, but on
analyzing the situation there may be no requirement for distributed
development, which is the main issue addressed by second- and
third-generation tools.  Other functionality, such as workspace
insulation and build management, can be achieved by defining proper
procedures, taking advantage of the small size of the organization.

A process should be defined to manage change requests, and metrics
should be identified to measure the outcome of the project.  A process
should also be established to collect data against these metrics.

            Managing Version Control -- The Codeline Policy

Define a codeline policy that specifies the use of, and the check-ins
allowed on, each codeline.  A codeline can be seen as a canonical set
of source files needed to produce the product; examples are development
codelines and release codelines.  A release codeline policy might
specify that only bug fixes may be checked in.  In addition, the
developer can be provided with a private repository for checking in
intermediate versions before they go into the main repository.  This
gives the developer a mechanism for checkpointing changes at a
granularity he is comfortable with.  The private repository need not
even be a separate project in the version control system; it can be as
simple as a folder on the developer's machine.  At the same time,
periodic integration of the developer's work with that of others is
very important to maintain consistency.
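
The policy itself can be mechanized.  The sketch below is purely
illustrative (the codeline names, check-in kinds, and policy table are
assumptions, not any particular tool's API); it shows how a release
codeline could be restricted to bug fixes while a development codeline
accepts anything:

```python
from dataclasses import dataclass

@dataclass
class CheckIn:
    codeline: str   # e.g. "release-1.0" or "dev" (illustrative names)
    kind: str       # e.g. "bugfix", "feature", "refactor"

# Hypothetical codeline policy: which kinds of check-in each
# family of codelines accepts.
POLICY = {
    "release": {"bugfix"},                     # release lines: fixes only
    "dev": {"bugfix", "feature", "refactor"},  # dev lines: anything
}

def allowed(checkin: CheckIn) -> bool:
    """Apply the codeline policy: accept a check-in only when its kind
    is permitted on the codeline it targets."""
    family = "release" if checkin.codeline.startswith("release") else "dev"
    return checkin.kind in POLICY[family]
```

Such a check could run as a pre-check-in hook, so the policy is
enforced mechanically rather than by convention alone.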

              Workspace Management: Check in Periodically

Because the user is provided with a separate repository for working on
files, he has an insulated workspace.  And because he checks his files
in periodically, the problem of workspace isolation is avoided.  This
is an important piece of functionality provided by third-generation
version management tools, yet it can be achieved with a
first-generation tool simply by following the procedure of checking in
periodically.

                    Build Management: Avoid Big Bang

The only ingredients of a build should be the source code and the tools
to which it is input.  Routine setup procedures can be automated to
avoid human error, and a checklist can be maintained of the ingredients
needed to completely build the software.  Build outputs and logs should
be archived; they are very useful when problems are encountered with
new builds.  Developers should be given a mechanism to build the system
periodically to avoid unneeded surprises, and should be discouraged
from leaving long intervals between check-ins.  The periodic build
should be checked for interface compatibility (does it compile?) and
regressions (does it still work?).  Developers should be encouraged to
build from files that are likely to be in the release (perhaps the
newest code on the revision control system's trunk) to anticipate, and
allow time to correct, incompatibilities.  The goal is to avoid a big
bang.
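
The periodic build is easy to script.  The following sketch assumes
nothing about the organization's tool chain: the build steps are passed
in as commands (for a make-based project they might be "make clean",
"make all", "make test"), and every run's output is archived to a
time-stamped log file, as the article recommends:

```python
import subprocess
import time
from pathlib import Path

def periodic_build(steps: list, source_dir: Path, log_dir: Path) -> bool:
    """Run the build steps in order, archiving all output to a log.

    Returns True when every step succeeds (does it compile? does it
    still work?).  On failure the archived log records which step
    broke, which is useful when diagnosing problems with new builds.
    """
    log_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    log_file = log_dir / f"build-{stamp}.log"
    with log_file.open("w") as log:
        for step in steps:
            result = subprocess.run(step, cwd=source_dir,
                                    stdout=log, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                return False   # the log shows where the build stopped
    return True
```

Run nightly (or more often), this keeps integration problems small and
frequent instead of rare and catastrophic.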

                         Other Functionalities

Software configuration management does not manage the product's source
code alone; it must also support and manage the process established to
track projects.  In fact, equal priority should be given to the process
artifacts, and subsequent changes to these artifacts need to be
controlled.  A baseline should be established for the artifacts, and
the changes made to tailor them to the organization's needs should be
tracked.  Here a file-based repository fulfills the requirements; no
sophisticated tool is necessary.  Team development is not a stand-alone
function: a good solution must work well with the other software
development and maintenance tools an organization employs.  Hence
policies should be established for maintaining the interfaces with
other tools, such as defect tracking.


The process of implementing configuration management routines and
procedures might seem straightforward, but it is complex and
time-consuming.  Continuously refining the process and tailoring it to
the organization is an iterative undertaking.

Identifying data with which to quantify improvement is likewise
difficult; metrics need to be defined from the existing information.
Customer feedback should be treated as an important metric when
tracking how effectively the process establishes configuration
management.  Training needs to be provided at all levels, and the role
each individual plays with respect to the system should be clearly
specified.  Thus, continuous tracking and refinement of the
configuration management system can establish good team development in
a small organization.


1. Team development Overview, by Alex Lobba

2. Configuration Management Patterns by Steve Berczuk

3. High-level best practices in Software Configuration Management, by
Laura Wingerd and Christopher Seiwald, Perforce Software, Inc.

4. Change Management for Software Development, by Continuus

5. White papers from SCM Labs, Inc

6. The business benefits of software best practices -- case studies

7. Introduction of Software Configuration Management in very small
organizations - ICONMAN

8. Introduction to Configuration Management -- Gaining a competitive

      ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

QTN is E-mailed around the middle of each month to over 9000 subscribers
worldwide.  To have your event listed in an upcoming issue E-mail a
complete description and full details of your Call for Papers or Call
for Participation to .

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should provide at
  least a 1-month lead time from the QTN issue date.  For example,
  submission deadlines for "Calls for Papers" in the March issue of QTN
  On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the opinions
of their authors or submitters; QTN disclaims any responsibility for
their content.

STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR logo are
trademarks or registered trademarks of Software Research, Inc. All other
systems are either trademarks or registered trademarks of their
respective companies.

          -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to CHANGE an
address (an UNSUBSCRIBE and a SUBSCRIBE combined) please use the
convenient Subscribe/Unsubscribe facility at:


As a backup you may send Email direct to  as follows:

   TO SUBSCRIBE: Include this phrase in the body of your message:

   TO UNSUBSCRIBE: Include this phrase in the body of your message:

Please, when using either method to subscribe or unsubscribe, type the
 exactly and completely.  Requests to unsubscribe that do
not match an email address on the subscriber list are ignored.

	       Software Research, Inc.
	       1663 Mission Street, Suite 400
	       San Francisco, CA  94103  USA
	       Phone:     +1 (415) 861-2800
	       Toll Free: +1 (800) 942-SOFT (USA Only)
	       Fax:       +1 (415) 861-9801
	       Web:       <>