sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======           September 2000            =======+

QUALITY TECHNIQUES NEWSLETTER (QTN) (Previously Testing Techniques
Newsletter) is E-mailed monthly to subscribers worldwide to support the
Software Research, Inc. (SR), TestWorks, QualityLabs, and eValid WebTest
Services user community and to provide information of general use to the
worldwide software and internet quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN provided that the entire
document/file is kept intact and this complete copyright notice appears
with it in all copies.  (c) Copyright 2003 by Software Research, Inc.


   o  4th Annual International Software & Internet Quality Week Europe:
      Conference Theme: Initiatives For The Future.

   o  A Theory of Component Integration Testing, by Evan Thomas

   o  Outsourcing WebSite Testing, by Derek Sisson

   o  New Denial-Of-Service Tool Looms, by George V. Hulme

   o  Testing of Object-Oriented Software: Life Cycle, by Imran Bashir
      and Amrit L. Goel

   o  6th International Conference on Reliable Software Technologies
      (Ada Europe 2001), Call For Papers

   o  Builds Available for TCAT/Java, TCAT/C-C++

   o  Skills of a Good Tester, by Alan Braithwaite

   o  SERG Report: Organizing and Documenting Component-Oriented
      Toolkits, by Mohammad Radaideh

   o  QTN Article Submittal, Subscription Information


    4th Annual International Software & Internet Quality Week Europe
              Conference Theme: Initiatives For The Future

                          20-24 November 2000
                         Brussels, Belgium  EU


The complete program for the 4th International Software & Internet
Quality Week Conference [QWE2000] (Theme: Initiatives For The Future) to
be held 20-24 November 2000 in Brussels, Belgium EU, is now available.


The QWE2000 International Advisory Board has assembled a terrific
international team of over 60 speakers.  Discover the state-of-the-art
in software and internet QA and testing from around the world:  from the
USA to the UK, from Canada to Brazil, from Europe to China!

Our goal with QWE2000 is to bring together industry and academic,
software and internet oriented, European and non-European specialists.

The QWE2000 Program features:

* Pressing questions and issues discussed by a distinguished lineup of
  Industrial Keynote Speakers:

   - Tom Gilb (Results Planning) The Ten Most Powerful Principles
     for Quality in Software Organizations
   - Jens Pas (I2B) Test Out-Sourcing: From Necessary Evil to E-
     Competitive Advantage
   - Lisa Crispin (iFactor-e) Stranger in a Strange Land: Bringing
     QA to a Web Startup
   - Hans Buwalda (CMG Finance) Soap Opera Testing
   - Tom Drake (ICCI) The Future of Software Quality - Our Brave New
     World-Are We Ready?

* 12 pre-conference Tutorials conducted by the foremost experts in their
  fields.

* Five Parallel Tracks that cover the broad field of software quality
  with the latest developments:

   - Technology: From browser-based website testing to UML methods
   - Internet: E-commerce experience, Internet Time and Site
   - Applications: Hear solutions from researchers and practitioners
   - Management: Managing Testing, Quality Improvement, Process
   - Vendor Technical Presentations allow you to broaden your tools
     and services information.

* For The First Time Ever... The Information Systems Examination Board
  (ISEB) of the British Computer Society has accredited the standard
  full course for delegates at Quality Week Europe with some experience
  in testing, who wish to take the examination leading to the Foundation
  Certificate in Software Testing.  The course will take 18 hours
  including the "closed book" exam which consists of forty multiple-
  choice questions.  The exam will be offered during the conference
  supervised by an ISEB proctor.  The exam results and ISEB certificates
  will be presented at a special announcement during QWE2000.

* Industry Exhibitors who will showcase their services and latest
  products at the Two-Day Trade Show (Expo: 22-23 November 2000).
  Exhibitors include: Amphora Quality Technologies, CMG, eValid, Gitek,
  I2B, McCabe Associates, NexusWorld, PolySpace, ps_testware, RadView,
  Rational, Software Emancipation, SIM Group, TBI, and more.

* Special Events: Attendees will tour two small family-run factories to
  sample Belgium's most famous products:  Belgian Chocolate and Belgian
  Beer.  Can you argue with that?

Mark your calendars *NOW* for QWE2000, in beautiful downtown Brussels,
20-24 November 2000.  Register early on-line and receive Early Bird
Special Pricing for the Quality Week Europe 2000 Conference.


               A Theory of Component Integration Testing
                             by Evan Thomas

Component Integration Testing is often misunderstood. Is it Unit
testing?  How is Component Integration Testing different from System
Integration Testing? The Theory of Component Integration Testing is
still controversial. "Where did the software come from?" is a question
that is often asked. Opinions differ. This essay will attempt to expose
the real reality of where Components come from and how CIT functions.

Component Integration is a place of great beginnings. All of the designs
that the Great Developer, the Mighty Sales Representative, the Wise
Analyst and the Humble Client created are thrown together for the first
time and melted into this huge cauldron known as Component Integration.
Component Integration is spun around at a very high temperature in tiny
silicon chips. Then it happened. Thousands of nanoseconds after its
creation, CIT arose from the primordial GUI. This evolutionary mass is
now shaped and formed into what only a user could love. This product of
evolution, this "code" is now fashioned to allow trained professional
testing specialists access to the newly created Component Integration.

This is where all the mighty plans of the great Component Integration
Testing (CIT) planners come to bear. CIT personnel spent millions and
billions of nanoseconds planning for this very specific moment. The
minds that make up the CIT collective will test to determine whether or
not the modules of each pre-historic matter (Unit/Component) can
function together to allow synergistic growth of the application.

Many times, tiny imperfect butterflies, which could blossom into huge
hurricanes on the other side of the server, are identified. This is a
complicated process that involves two methodologies. First, the CIT
collective tests the new and improved evolutionary enhanced functions.
Second, in a Pilgrim's Regression, the CIT collective reverts back down
to the older, deeper functions to ensure their continued viability. The
goal is to graduate from this world's domain to Dante's next level of
migration: System Integration.

So what is next for CIT? Many more enhancements are planned. Grand
schemes that show the bold vision that is CIT. One day, no longer will
regression testing be executed by mere mortals. The great silicon gods
will themselves descend to the software plane. In the form of the mighty
automated tools (aka Titans) the silicon gods will take it upon
themselves to run the regression that is so troublesome and time
consuming to mankind.

This is the Theory of Component Integration Testing as it is understood
today by modern science.


                      Outsourcing WebSite Testing


                              Derek Sisson
                           (Philosophe, Inc.)

A company might consider outsourcing a website's testing for several
reasons:

   *  The company may be launching a new site, or a new version of a
      site, and may feel that most testing tasks can be relegated to a
      testing firm.
   *  The company may not have the resources -- people, skills,
      software, hardware or time -- to perform testing themselves.
   *  The project to be tested may be of such a short life span that
      the company doesn't need any long-term investment in testing
      processes for the project.
   *  The company might want an independent third party to perform the
      testing, in order to get a more objective view of site quality.
   *  The company may even be outsourcing the development and coding for
      the site, making the outsourcing of the testing a reasonable
      decision. (Even more reasonable would be a firm that provides the
      coding and the quality control for its own code.)

The decision to outsource testing needs to be a well-considered
decision, because most people who aren't responsible for the testing
misunderstand the meanings and scopes of the concepts involved in
testing. Two very important issues must be resolved before taking the
outsource step: first, the company that owns the website must be
absolutely clear about the scope of the job -- including the tasks,
processes, and responsibilities -- that they want to hire out; and
second, the company must be sure that they are speaking the same
"language" as the test firm they will hire.

If I want to outsource any testing tasks for my site, I need to
understand clearly the nature of the tasks, and how the tasks fit into
the overall quality plan for the site.  Many testing tasks can be parsed
out to contractors; for example, hiring another firm to perform
usability reviews makes excellent sense because many firms specialize in
usability testing.  Usability testing, however, is not the same as
quality assurance, and if I hire a usability firm under the assumption
that my quality assurance needs are thereby being handled I will be
making a very big mistake.

If I want to outsource quality control for my site, I'm making certain
assumptions about how I will do business. Quality control is a system
for testing all output against a standard of quality; this is an ongoing
process, because in a proper quality control environment all output has
to be tested.  If I hire a firm to help me test prior to launching a
site, I need to either form a partnership with the test firm to help
with future production or build my own team of quality control testers.
Neither option is necessarily difficult, but I have to consider this
before making the decision to outsource because the choice will affect
my future workflow.

If I want to outsource quality assurance, I'm making a decision to
introduce the test firm deeply into my company's decision process.
Quality assurance is a pro-active process that requires close proximity
to all phases and aspects of the creation and maintenance of a website.
If I exclude the role of quality assurance from important meetings and
discussions -- for example, design meetings -- I'm crippling the
process.  Quality assurance is an order of magnitude more complicated,
more involved, and more important than testing of quality control alone,
and assurance requires constant attention and a penetrating interest in
improving everything.  This understanding must play a role in any
discussion of outsourcing quality assurance.

I do think that a skilled and dedicated testing firm can be a vital part
of a quality assurance effort, as long as the process, and the
relationship, are managed by somebody on the website's team who is
ultimately responsible for the process.  Outsourcing testing tasks is a
great idea, especially if it frees up resources -- time and attention --
for the web team to focus on higher-level issues.  I'd gladly pass off
browser compatibility testing to have more time to work out
architectural issues, for example.

              Issues to Consider When Outsourcing Testing

I spell out some points below that I think are important to address when
making the decision to outsource testing for a commerce site.  This is a
big decision, and approaching the decision itself from a quality
assurance point-of-view shows that outsourcing testing is not simply a
matter of jobbing out a set of simple tasks.

Understand the scope of what you will outsource.

Look past the buzzwords and learn what testing, quality control and
quality assurance really mean and involve.  Learn about the importance
of these concepts to your site's and your business plan's success.  When
you understand the scope of the task, you can better manage your
expectations -- if you pay for browser compatibility testing but
consider it to be a quality assurance program, you will be disappointed
on several levels.

Who is writing your code?

The source of your programming code is something to consider, because
fitting a contractor into your code development processes may be
awkward.  If you contract out your coding -- for example, say you have a
"technology partner" from whom you buy your site code -- you need to get
the code from the developers to the test firm, and possibly through
several iterations of this cycle, before the code is released.  If the
testing is contracted out, you distance your company's decision makers
from close-up experience with the code, which will be a problem if the
quality of the code is such that you must modify your project plans.

Testing is not quality assurance.

Testing is a method used within the quality assurance process, and if it
is correctly managed, outsourcing some testing tasks makes some sense,
especially if you don't want to invest in the resources required for a
fair-sized lab.  Having a contractor perform browser compatibility tests
in their own lab can be a great saver of time and capital for a busy
start-up venture.

Every change should be tested.  The quality control process measures all
products to verify they meet a standard of quality; if you have a formal
quality control process, then everything that goes out must be tested.
If you outsource testing, then you must make long term decisions on how
to handle all published output.  You'll either need to retain a testing
firm for ongoing production needs, or train your own team.  Either way,
the quality control process is basic to your ability to provide quality
to your customers, and so should be managed by your company.

Quality assurance is about processes.  Quality assurance requires
constant and deep involvement in the planning, design, and development
phases of a major project, because QA focuses on processes.  Getting QA
involved up front will save you money, as studies show that bugs cost
significantly less the earlier they are caught.  Outsourcing QA would
require a very, very close partnership with the contractors.  Ask
yourself if you want to include a testing firm in every design meeting
-- if not, then they won't be able to act effectively in a quality
assurance role.

A successful launch will fork your priorities.  When the website
launches, you will find your priorities splitting in two directions:
maintenance of the site and development of the next site iteration.
Both phases require QA and testing attention, but your greatest resource
will be attention.  If you expect a concerted ongoing web effort, then
you need to involve testing and QA in both tracks.  Ask yourself if you
want to be chained to a contractor's schedule for releasing new code.
If you need to make a change to the code in response to an emergency,
who tests that code before publication to the web?  If testing is to be
performed by a contractor, then they necessarily have a say in the
scheduling by virtue of their availability.

Audience identification is a critical business priority.

Quality assurance is deeply involved in the definition of the site's
audience, including identification of the target browsers and platforms
used by the audience.  I read recently [May 1999] that Windows 2000 is
being released OEM on some brands of PC -- this means I have to evaluate
the importance of this platform to my audience, and, if my research
shows it is necessary, incorporate this platform into my testing.  If I
outsourced my testing, I would have to wait for them to verify my site's
support for this platform; if I outsourced my QA, I'd have to wait for
them to decide how important this platform was to my targeted audience
but this is really a business decision.

Quality assurance for websites is a partisan role.

I feel that quality assurance is best done by people with passion
towards the product, the company, and the mission.  The members of a
quality assurance team should be ombudsmen for their audience, and
should represent the user at design and development meetings.  QA staff
should inject a consideration for usability into every process and
discussion.  I don't think you can outsource a passionate QA team, even
though I'm sure you could find an outsource team passionate about doing
a good job of testing.  Ask yourself how important the website is to the
success of your company.  If the website does more than present
brochureware, then you should consider the website a capital investment.

Who will set and maintain the standards?

When outsourcing a process or elements of a process that is by its
nature ongoing, all parties involved must be clear on who sets the
standards and who maintains the standards. For a testing effort, you
must control those standards, and you must have a mechanism through
which you can change those standards.  You also need a mechanism through
which the contractor can provide you feedback about the appropriateness
and accuracy of those standards.

The management of quality assurance shouldn't be outsourced.

If you outsource any part of your testing process, you should create a
full-time position in your company for somebody to manage quality
assurance; this person should liaise with and set the direction for any
testing contractors.  A wide range of tasks falls under the umbrella of
quality assurance, including testing -- usability, compatibility,
integration, regression, etc. -- as well as requirements management, design
reviews, and reviewing customer complaints. Don't set an expectation for
quality assurance when you are only paying for testing.

                   Advantages of Outsourcing Testing

Outsourcing some kinds of testing can have definite advantages.

Getting Stuff Done:  Even the best team runs out of the time or bodies
necessary to get certain tests accomplished.  If hiring a test firm to
perform certain tests takes care of some test goals, then outsourcing is
a great idea.

Hiring Expertise:  Some tests are best performed by experienced
professionals, either because the necessary knowledge requires
specialized education or backgrounds, or because the necessary skill set
can't be learned under a short deadline.  Moreover, some tests require a
specialized understanding in order to interpret and analyze the results,
and it makes sense to outsource these tests.

Hiring Authority:  Some kinds of tests -- and some kinds of test firm --
have an authority that a homegrown test team may not be able to compete
with.  Company executives often find it easier to accept critical
results if they come from a third-party, rather than from their own test
team (speaking from experience here).

Hiring a Neutral Point-of-View:  Sometimes having an independent team
perform testing tasks can provide an objective point-of-view about
quality issues like usability or compatibility.  Even if I am rigorously
fair in my compatibility testing, I'll still have browser prejudices;
I'll still have a favorite browser and platform, and so may tend to do
the majority of my testing with them.  Outsourcing some testing may
provide me with more data, and hopefully unbiased data.

Hiring a Fresh Set of Eyes:  Involving a new team in testing exposes the
code to new test tools and methods; a test firm may find defects your
tools were unable to find.


                    New Denial-Of-Service Tool Looms

                           by George V. Hulme
                  (Courtesy of Information Week Daily)

A powerful new distributed denial-of-service tool, dubbed Trinity v3,
has surfaced in more than 400 host computers, possibly threatening a new
wave of denial-of-service attacks, according to Internet Security
Systems Inc. Trinity v3 is not a virus, so hackers have to break in and
install the tool on the Linux system they want to make a "zombie" for an
attack.

"Four hundred zombies is certainly enough to bring down a large E-
commerce site, if [the owners] don't have the appropriate intrusion-
detection tools in place," says Chris Rouland, director of ISS's X-Force
research team.

Distributed denial-of-service attacks can bring down a network or Web
site by flooding servers and targeting machines with more traffic than
they can handle, shutting out all valid requests to the systems. In
February, a few big-name sites, including eBay and Yahoo, were shut down
by denial-of-service tools similar to Trinity.

According to ISS, Trinity is controlled by Internet Relay Chat, and in
the version examined by X-Force, the agent binary is installed on a
Linux system under /usr/lib/.  When the agent is started, it connects
to an Undernet IRC server on port 6667. Since Trinity doesn't "listen"
on any ports, it's difficult to detect the tool's activity unless system
managers are looking for suspicious IRC traffic. According to X-Force,
any system a Trinity agent resides on may be completely compromised.
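
The detection approach X-Force describes -- watching for outbound IRC
traffic rather than scanning listening ports -- can be sketched as a few
lines of code.  The following is a hypothetical illustration, not an ISS
tool; the connection records are made-up stand-ins for what a real
monitor would pull from netstat output or a packet capture.

```python
# Hypothetical sketch: flag outbound connections to the standard IRC
# port (6667), the traffic pattern a Trinity agent produces when it
# phones home to its Undernet control channel.

IRC_PORT = 6667

def suspicious_irc_connections(connections):
    """Return records whose remote endpoint is the IRC port.

    Each record is a (local_addr, remote_addr, remote_port) tuple.
    """
    return [c for c in connections if c[2] == IRC_PORT]

if __name__ == "__main__":
    sample = [
        ("10.0.0.5", "192.0.2.10", 80),     # ordinary web traffic
        ("10.0.0.5", "198.51.100.7", 6667), # outbound IRC: worth a look
        ("10.0.0.5", "203.0.113.2", 443),
    ]
    for rec in suspicious_irc_connections(sample):
        print("possible IRC control channel:", rec)
```

Legitimate IRC users would of course trigger the same filter, so a flag
here is a prompt for investigation, not proof of compromise.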

More information is available from ISS.


            Testing of Object-Oriented Software: Life Cycle
                   by Imran Bashir and Amrit L. Goel

Comments (supplied by book author Imran Bashir):  This book attempts to
provide guidance in object-oriented (OO) software engineering life cycle
testing issues. This book can be considered a book of testing recipes
for the various stages of OO software engineering. We attempt to cover
all major phases of an OO software engineering life cycle.  We identify
various artifacts at each stage and attempt to provide guidelines for
testing these artifacts. Pragmatically speaking, one may not be able to
utilize or need all recipes described in this book. Instead, depending
on the need, the size of the project, and the availability of tools,
these recipes can be utilized, ignored, or customized.

We explicitly and individually address testing issues during
requirements, design, coding, integration, and system testing phases. We
discuss each of these aspects in a template style. Our template consists
of objective, approach, activities, resources, effort, and acceptance
criteria. For each phase of the OO software engineering life cycle, we
attempt to provide testing recipes based on this template.

The book is intended for four types of audience: software practitioners
who have been using OO technology and wondering whether they should
change their testing practices; software managers who are new to the OO
software; researchers who are strictly interested in the code level unit
and integration testing of OO software; and students of OO technology
who want to learn about the impact of testing throughout the OO software
engineering life cycle.


    6th International Conference on Reliable Software Technologies
                          (Ada Europe 2001)

                            CALL FOR PAPERS

                   14 - 18 May 2001, Leuven, Belgium

Sponsored by Ada-Europe, organized by Ada-Belgium and K.U.Leuven, in
cooperation with ACM SIGAda (approval pending).

General Information:  The 6th International Conference on Reliable
Software Technologies (Ada-Europe'2001) will take place in the year 2001
in the historic university town of Leuven, near Brussels, Belgium.  The
full conference will comprise a three-day technical program and
exhibition from Tuesday to Thursday, and parallel workshops and
tutorials on Monday and Friday.

For more information, visit the conference Web site at http://www.ada-

- 16 October 2000: Submission of papers, extended abstracts and
  tutorial/workshop/poster proposals
- 11 December 2000: Notification to authors
- 11 January 2001: Full papers required for accepted extended abstracts
- 12 February 2001: Final papers (camera-ready) required
- 14-18 May 2001: Conference

Topics:  The conference will provide an international forum for
researchers, developers and users of reliable software technologies.
Presentations and discussions will cover applied and theoretical work
currently conducted to support the development and maintenance of
software systems. Participants will include practitioners and
researchers from industry, academia and government.

There will be a special session on e-business and Internet-based
applications, including the use of Ada in this realm.

For papers, tutorials, poster and workshop proposals, the topics of
interest include, but are not limited to:

+ E-business and Internet-based applications (special session).

+ Management of Software Development and Maintenance: Methods,
  Techniques and Tools.

+ Software Quality: Quality Management and Assurance, Risk Analysis,
  Program Analysis, Verification, Validation and Testing of Software.

+ Software Development Methods and Techniques: Requirements Engineering,
  Object-Oriented Technologies, Formal Methods, Software Management
  Issues, Re-engineering and Reverse Engineering, Reuse.

+ Software Architectures: Patterns for Software Design and Composition,
  Frameworks, Architecture-Centered Development, Component and Class
  Libraries, Component Design.

+ Tools: CASE Tools, Software Development Environments, Compilers,
  Browsers, Debuggers.

+ Kinds of Systems: Real-Time Systems, Distributed Systems, Fault-
  Tolerant Systems, Information Systems, Safety-Critical or Secure
  Systems.

+ Applications in Multimedia and Communications, Manufacturing,
  Robotics, Avionics, Space, Health Care, Transportation, Industry.

+ Ada Language and Tools: Programming Techniques, Object-Oriented
  Programming, New Approaches in Tool Support, Bindings and Libraries,
  Evaluation and Comparison of Languages, Language Extension Proposals.

+ Ada Experience Reports: Experience Reports from Projects using Ada,
  Management Approaches, Metrics, Comparisons with past or parallel
  Experiences in non-Ada Projects.

+ Education and Training.

+ Case Studies and Experiments.

Submissions:  Authors are invited to submit original contributions.
Submissions should be in English.  An extended abstract (4-6 pages) or,
preferably, the full paper (up to 12 pages) should be sent using the Web
submission form.  For more information please see the conference Web
site.

Conference Chair:

      Karel De Vlaminck
      Department of Computer Science
      Celestijnenlaan 200 A
      B-3001 Leuven (Heverlee), Belgium

Program Co-Chairs

      Dirk Craeynest
      OFFIS nv/sa & K.U.Leuven
      Weiveldlaan 41/32
      B-1930 Zaventem, Belgium

      Alfred Strohmeier
      Swiss Federal Institute of Technology in Lausanne (EPFL)
      Department of Computer Science, Software Engineering Lab
      CH-1015 Lausanne EPFL, Switzerland


              Builds Available for TCAT/Java, TCAT/C-C++

New TCAT/Java and TCAT/C-C++ versions for Windows 98/NT/2000 are now
downloadable from the website.  Both TCAT/Java and TCAT/C-C++ include:

   o  New Instrumenter Build.  The C/C++ and Java instrumenters (icpp9
      and ijava, respectively) have been rebuilt to be compatible with
      Windows 98/NT/2000.  The newest builds support the latest releases
      of MS V6.n and Java Ver. 1.2n, respectively.

   o  Multiple Runtime Choices.

      The new versions provide for internal buffer sizes of 1, 10, 100,
      1000, 10,000, 100,000 in addition to the infinite-buffering
      [default] option.  Most users find that the infinite buffering
      option is the most convenient.

      You can choose the space and time tradeoff combination that suits
      your application best.  Generally you should use the non-infinite
      buffering in case you have an application that terminates so
      abnormally that the last buffer load of test coverage data is not
      written.

      The internal buffers in all of the runtime options have been
      programmed to dump data to the assigned tracefile automatically
      after the indicated number of "hits", and are also flushed
      automatically at the end of the execution.
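
      The buffering scheme described above -- accumulate coverage "hits"
      in a fixed-size buffer, dump to the tracefile when the buffer
      fills, and flush again at the end of execution -- can be sketched
      as follows.  This is an illustrative model only, not SR's actual
      runtime; all names here are invented for the example.

```python
# Minimal sketch of hit buffering with automatic dump-on-full and a
# final flush at end of execution.  Not TCAT's real implementation.
import io

class HitBuffer:
    def __init__(self, tracefile, capacity=1000):
        self.tracefile = tracefile   # open, writable file object
        self.capacity = capacity     # hits held before an automatic dump
        self.hits = []

    def record(self, module, segment):
        self.hits.append((module, segment))
        if len(self.hits) >= self.capacity:
            self.flush()             # automatic dump after N hits

    def flush(self):                 # also called at end of execution
        for module, segment in self.hits:
            self.tracefile.write(f"{module} {segment}\n")
        self.hits.clear()

trace = io.StringIO()                # stands in for the tracefile
buf = HitBuffer(trace, capacity=2)
buf.record("main", 1)
buf.record("main", 2)                # capacity reached: automatic dump
buf.record("helper", 1)
buf.flush()                          # final flush at program exit
```

      The space/time tradeoff is visible in the sketch: a small capacity
      means frequent writes but little data lost on an abnormal exit,
      while infinite buffering defers all writes to the final flush.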

   o  Supplied Examples.  The new releases include updated example
      programs that show how TCAT operates.  The examples supplied
      include a "small" example that can be worked fully with the DEMO
      version of the product.

   o  Coverage Reporting.  Numerous small improvements have been made to
      the main TCAT reporting engines: WinCover, WinDigraph and
      WinCalltree (TCAT/C-C++ only).

   o  DEMO Version Available.  To let you try out TCAT/Java or TCAT/C-
      C++ we have provided a new DEMO version for both products.  The
      DEMO version has limited capabilities:
       - Only the first 25 modules [functions] in any build tree are
         processed.
       - Coverage reports are limited to just the modules processed.
       - Digraph pictures are available only for modules that were
         processed.

      This is enough capability so you can get a feel for how the
      product works.  There is a supplied example Java or C-C++ program
      small enough to be processed by the DEMO version.

   o  Other Improvements.  A range of other minor changes and repairs to
      improve operation and simplify processing have been made
      throughout the product.

Download TCAT/Java from:


The Release Notes for TCAT/Java are found at:


Download TCAT/C-C++ from:


The Release Notes for TCAT/C-C++ are found at:


For either product, if after trying out the DEMO version you would like
to replace the DEMO key with an EVAL key that provides you time-limited
access to the complete product, please contact us.  Or,
go to the license request form at:



              Skills of a Good Tester, by Alan Braithwaite

So you think you want to be a tester? Do you have what it takes?

What does it take to be a tester? While some may believe that a tester
is someone waiting to become a developer, this is not the case. My
experience has shown me that most testers have no desire to become
developers. Testing requires a different skill set. Over recent years,
software testing has become better understood and hence become a more
legitimate profession. In the five years that I've worked as a software
test engineer, I've seen changes in the software industry that have
caused a greater recognition of the test engineer as a critical part of
the product development team.

Although in this article I will use the terms "tester" and "developer", I
prefer the titles "software test engineer" and "software development
engineer". Often we hear people refer to these two positions as "tester"
and "engineer". In reality, both positions are engineering positions.

There are definitely certain skills that an individual must possess to
be a good tester. Though testers hold many skills in common with
developers, there are also certain skills that suit one position better
than the other. This article will focus on the skills that
make a good tester.

A Broad Understanding

Testers must be able to absorb a broad range of information. Good
testers will have the ability to learn the technical side of things as
well as the practical side. A tester needs to comprehend not only how
the product was designed to work, but how the customer expects it to
work and help bridge that gap. This requires some groundwork on two
fronts.

Testers must learn customer expectations. This knowledge can come from
others within the organization who regularly meet with customers, but
ideally testers should meet directly with customers. Getting information
second hand can carry the consequence of losing key ideas and technical
issues. This is even more likely if the person passing on the
information does not have a strong technical background. Testers must
also gain understanding of how the product was designed and learn the
details of how it works. This requires them to understand technical
specs and design documents and to quickly learn new technologies.
Testers must also have the ability to create good test plans and test
cases that will appropriately and sufficiently exercise the product. As
mentioned previously, this requires technical as well as practical
understanding.
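The idea of test cases that "appropriately and sufficiently exercise the
product" can be illustrated with a minimal sketch. The function under
test and its valid range here are hypothetical, chosen only to show the
boundary-value thinking described above:

```python
import unittest

# Hypothetical function under test: clamps a value to the range [0, 100].
def clamp_percent(value):
    return max(0, min(100, value))

class TestClampPercent(unittest.TestCase):
    # A sufficient test plan exercises boundaries and just-out-of-range
    # values, not only a typical "happy path" input.
    def test_typical_value(self):
        self.assertEqual(clamp_percent(50), 50)

    def test_lower_boundary(self):
        self.assertEqual(clamp_percent(0), 0)    # exactly on the edge
        self.assertEqual(clamp_percent(-1), 0)   # just below the edge

    def test_upper_boundary(self):
        self.assertEqual(clamp_percent(100), 100)  # exactly on the edge
        self.assertEqual(clamp_percent(101), 100)  # just above the edge

if __name__ == "__main__":
    unittest.main()
```

The point is not the code itself but the selection of inputs: each case
reflects a way the "theory" of the product could be wrong.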
Disproving the Theory

In the world of science, whenever someone presents a new theory, the
theory must be put through a rigorous routine where other scientists
attempt to disprove it. In this manner, they either validate or disprove
the theory. In the software development world, a developer's job is to
create a theory (the product). The tester's job is to validate the
theory by trying to disprove it. We know that there are some problems
with the "theory" we have been handed, and our job is to flush those
out. This takes a special kind of skill that some are born with, but
many have to learn it. It requires one to think of the things that can
go wrong. To do this well, a person must be able to see beyond the
obvious. It requires analytical thinking which is why it is well-suited
for individuals who have a science, engineering, or mathematical
background. It also requires a certain amount of paranoia. But don't let
this idea be too unsettling to you because, according to Intel's Andy
Grove, this will place testers among those who "survive".

Good People and Communication Skills

Having the responsibility to disprove a theory puts the tester in an
awkward position. When a person is told that there are problems with
their theory, it is human nature to either not believe it, or to become
slightly offended.  A developer naturally feels the same way about his
or her code. A good tester must be prepared to handle each of these
possible responses--disbelief and offense. Since testers are required to
find problems that exist, they need to use wisdom and skill in
approaching those who caused the problems. The old saying "A drop of
honey catches more flies than a gallon of gall" may be appropriate
here. A diplomatic approach to passing on defect information is
required. To help testers understand the importance of this, they must
understand the developer's world.

First, understand that developers are good at what they do. That is why
they still have a job. Second, realize that their job requires them to
put intense focus and concentration into the creation of their modules.
When approached with an issue, some developers may seem a little "put
off" to have someone there bothering them, but this is not too
surprising if you understand where they are coming from. The developer
is probably completely focused on coding or trying to figure out a
technical issue when someone comes and interrupts them. It takes a few
minutes to shift gears and go down a different road. Sometimes it's not
expedient to do so. It may be more important for the developer to figure
out the issue they are currently working on than to stop everything and
focus attention on this new issue.

It is important that testers are aware of this so they don't judge
developers as being apathetic to the tester's issue. On the other hand,
a tester needs to be able to effectively work to get a defect fixed.
Finding a defect is one thing. Getting others to believe it is worth
fixing is a completely different skill. The tester must have good
communication skills, both written and verbal, in order to explain why
it is important for the defect to be fixed.


I've heard it said that anyone can be a tester. That idea is as true for
testing as it is for any other job. But at the same time, to be good at
your job, whether it is testing or something else, you must possess
certain skills. And to excel at your job, you must excel in these skill
areas.

 SERG Report: Organizing and Documenting Component-Oriented Toolkits,
                          by Mohammad Radaideh


Abstract:  The concept of Component-Oriented Software Technology has
been advanced as a way to make it easier to build new applications from
components that have been organized into Component-Oriented Toolkits
(CO-Toolkits for short).  However, experience shows that CO-Toolkits are
sometimes hard to understand and use.  Consequently, developers often
write their own code instead of using CO-Toolkits.

Using the Adaptive Communication Environment (ACE) as a case study and
motivating example, this thesis seeks to establish a set of guidelines
for good design practice for the file structures, design and
implementation, and documentation of CO-Toolkits.  The guidelines are
formalized and then checking tools based on these formalized rules are
built.  Finally, the work demonstrates how such tools can be used to
check existing CO-Toolkits code and report rule violations to developers
or reviewers.

      ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

QTN is E-mailed around the middle of each month to over 9000 subscribers
worldwide.  To have your event listed in an upcoming issue E-mail a
complete description and full details of your Call for Papers or Call
for Participation to "".

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should provide at
  least a 1-month lead time from the QTN issue date.  For example,
  submission deadlines for "Calls for Papers" in the January issue of
  QTN On-Line should be for February and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters; QTN disclaims any responsibility for their content.

STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR logo are
trademarks or registered trademarks of Software Research, Inc. All other
systems are either trademarks or registered trademarks of their
respective companies.

          -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to CANCEL a current subscription, to CHANGE an
address (a CANCEL and a SUBSCRIBE combined) or to submit or propose an
article, use the convenient Subscribe/Unsubscribe facility at:


Or, send Email to "" as follows:

   TO SUBSCRIBE: Include this phrase in the body of your message:


   TO UNSUBSCRIBE: Include this phrase in the body of your message:


   NOTE: Please, when subscribing or unsubscribing, type YOUR EMAIL
   ADDRESS exactly and completely.  Note that unsubscribes that don't
   match an email address on the subscriber list are ignored.

		Software Research, Inc.
		1663 Mission Street, Suite 400
		San Francisco, CA  94103  USA

		Phone:     +1 (415) 861-2800
		Toll Free: +1 (800) 942-SOFT (USA Only)
		Fax:       +1 (415) 861-9801
		Web:       <>

                               ## End ##