sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======            August 2000              =======+

QUALITY TECHNIQUES NEWSLETTER (QTN) (Previously Testing Techniques
Newsletter) is E-mailed monthly to subscribers worldwide to support the
Software Research, Inc. (SR), TestWorks, QualityLabs, and eValid WebTest
Services user community and to provide information of general use to the
worldwide software and internet quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN provided that the entire
document/file is kept intact and this complete copyright notice appears
with it in all copies.  (c) Copyright 2003 by Software Research, Inc.


   o  4th Annual International Software & Internet Quality Week Europe:
      Conference Theme: Initiatives For The Future.

   o  Complete QWE2000 Tutorials and Technical Program

   o  SERG Report 389: Limitations of Backward Error Analysis, by Dr.
      Sanzheng Qiao and Dr. David Parnas

   o  Test Careers Post 2000, by David L. Moore

   o  eValid Ver. 2.1 Now Available

   o  Quality Week 2000 - "Ask the Quality Experts!" Panel Summary (Part
      3 of 3)

   o  QTN Article Submittal, Subscription Information


1. 4th Annual International Software & Internet Quality Week Europe:
Conference Theme: Initiatives For The Future.

The complete program for the 4th International Software & Internet
Quality Week Conference [QWE2000] (Theme: Initiatives For The Future) to
be held 20-24 November 2000 in Brussels, Belgium EU, is now available
and is also included in the next article.

The QWE2000 International Advisory Board has assembled a terrific
international team of over 60 speakers.  Discover the state-of-the-art
in software and internet QA and testing from around the world:  from the
USA to the UK, from Canada to Brazil, from Europe to China!

Our goal of bringing together industry and academic, software and
internet oriented, European and non-European specialists, has been
achieved.

The QWE2000 Program features:

* Pressing questions and issues discussed by a distinguished lineup of
  Industrial Keynote Speakers:

   - Tom Gilb (Results Planning) The Ten Most Powerful Principles
     for Quality in Software Organizations
   - Jens Pas (I2B) Test Out-Sourcing: From Necessary Evil to E-
     Competitive Advantage
   - Lisa Crispin (iFactor-e) Stranger in a Strange Land: Bringing
     QA to a Web Startup
   - Hans Buwalda (CMG Finance) Soap Opera Testing
   - Tom Drake (ICCI) The Future of Software Quality - Our Brave New
     World-Are We Ready?

* 12 pre-conference Tutorials conducted by the foremost experts in their
  fields.

* Five Parallel Tracks that cover the broad field of software quality
  with the latest developments:

   - Technology: From browser-based website testing to UML methods
   - Internet: E-commerce experience, Internet Time and Site
   - Applications: Hear solutions from researchers and practitioners
   - Management: Managing Testing, Quality Improvement, Process
   - Vendor Technical Presentations allow you to broaden your tools
     and services information.

* For The First Time Ever... The Information Systems Examination Board
  (ISEB) of the British Computer Society has accredited the standard
  full course for delegates at Quality Week Europe with some experience
  in testing, who wish to take the examination leading to the Foundation
  Certificate in Software Testing.  The course will take 18 hours
  including the "closed book" exam which consists of forty multiple-
  choice questions.  The exam will be offered during the conference
  supervised by an ISEB proctor.  The exam results and ISEB certificates
  will be presented at a special announcement during QWE2000.

* Industry Exhibitors who will showcase their services and latest
  products at the Two-Day Trade Show (Expo: 22-23 November 2000).
  Exhibitors include: Amphora Quality Technologies, CMG, eValid, Gitek,
  I2B, McCabe Associates, NexusWorld, PolySpace, ps_testware, RadView,
  Rational, Software Emancipation, SIM Group, TBI, and more.

* Special Events: Attendees will tour two small family-run factories to
  sample Belgium's most famous products: Belgian Chocolate and Belgian
  Beer.

Mark your calendars *NOW* for QWE2000, in beautiful downtown Brussels,
from 20-24 November 2000.  Register early on-line and receive Early Bird
Special Pricing for the Quality Week Europe 2000 Conference.


            Complete QWE2000 Tutorials and Technical Program

                           T U T O R I A L S

         Monday, 20 November 2000, 9:00 - 12:00 -- AM Tutorials

Dr. Gualtiero Bazzana (ONION, S.p.A.)  "Web Testing Master Class (A1 A2)"

Mr. Ruud Teunissen & Rob Baarda (Gitek) "Risk Based Test Effort
Estimation with Test Point Analysis (B1)"

Mr. Desmond D'Souza (Consultant) "E-Business: Leverage Component-Based
Development and UML Modeling (C1)"

Mr. Mike Russell (Insight Consulting (In Cooperation with System
Evolutif)) "ISEB Software Testing Foundation Certificate (D1 D2) (H1 H2)"

        Monday, 20 November 2000, 14:00 - 17:00 -- PM Tutorials

Dr. Gualtiero Bazzana (ONION, S.p.A.)  "Web Testing Master Class (A1 A2)"

Ms. Alice Lee & Dr. Eric Wong (NASA Johnson Space Center) "A
Quantitative Risk Assessment Model For Software Quality, Testing And
Safety (B2)"

Dr. Hans-Ludwig Hausen (GMD) "On A Standards Based Quality Framework for
Web Portals (C2)"

Mr. Mike Russell (Insight Consulting (In Cooperation with System
Evolutif)) "ISEB Software Testing Foundation Certificate (D1 D2) (H1 H2)"

        Tuesday, 21 November 2000, 9:00 - 12:00 -- AM Tutorials

Mr. Tom Gilb (Result Planning Limited) "Requirements Engineering for SW
Developers & Testers (E1)"

Mr. Robert A. Sabourin (Purkinje Inc.)  "The Effective SQA Manager -
Getting Things Done (F1)"

Mr. Adrian Cowderoy (NexusWorld Limited) "Cool Q - Quality
Improvement For Multi-Disciplinary Tasks In Website Development (G1)"

Mr. Mike Russell (Insight Consulting (In Cooperation with System
Evolutif)) "ISEB Software Testing Foundation Certificate (D1 D2) (H1 H2)"

        Tuesday, 21 November 2000, 14:00 - 17:00 -- PM Tutorials

Mr. Tom Gilb (Result Planning Limited) "Specification Quality Control
(SQC): A Practical Inspection Workshop on Requirements Specification
(E2)"

Mr. Tom Drake (Integrated Computer Concepts, Inc.)  "The Quality
Challenge for Network-Based Software Systems (F2)"

Mr. Tobias Mayer & E. Miller (eValid, Inc.)  "WebSite Testing (G2)"

Mr. Mike Russell (Insight Consulting (In Cooperation with System
Evolutif)) "ISEB Software Testing Foundation Certificate (D1 D2) (H1 H2)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

                   T E C H N I C A L   P R O G R A M

    Wednesday, 22 November 2000, 8:30 - 10:00 -- KEYNOTE SESSION #1

Mr. Tom Gilb (Result Planning Limited) "The Ten Most Powerful Principles
for Quality in Software Organizations (K11)"

Mr. Jens Pas (I2B) "Test Out-Sourcing: From Necessary Evil to E-
Competitive Advantage (K12)"

Wednesday, 22 November 2000, 11:00 - 17:00 -- Parallel Technical Tracks


Mrs. Miriam Bromnick [UK] (Ovum Ltd) "Automated Software Testing: A New
Breed of Tools (1T)"

Mr. Tom Hazdra & Lubos Kral [Czech Republic] (CertiCon) "Enhancing the
Integration Testing of Component-Based Software (2T)"

Dr. Antonia Bertolino, F. Basanieri [Italy] (CNR-IEI) "A Practical
Approach to UML-Based Derivation of Integration Tests (3T)"

Ms. Lisa Crispin (iFactor-e) "The Need for Speed: Automating Functional
Testing in an eXtreme Programming Environment (4T)"

Mr. Ruud Teunissen (Gitek) "Improving Developer's Tests (5T)"


Mr. Jean Hartmann, Mr. Claudio Imoberdorf [USA] (Siemens Corporate
Research) "Functional Testing Of Distributed Component-Based Software
(1A)"

Mr. Olaf Mueller & Mr. Alex Podshwadek [Germany] (Siemens) "A Step-by-
Step Guide to Incremental Testing: Managing Feature Interaction for
Communication Devices (2A)"

Mr. Rob Hendriks & Mr. Robert van Vonderen & Mr. Erik van Veenendaal
[Netherlands] (Improve Quality Services) "Measuring Software Product
Quality During Testing (3A)"

Mr. Steve Littlejohn [UK] (SIM Group Limited) "Test Environment
Management -- A Forgotten Basic (4A)"

Dr. Erik P. vanVeenendaal [Netherlands] (Improve Quality Services BV)
"GQM Based Inspection (5A)"


Dr. Lingzi Jin [UK] (FamilyGenetix Ltd.)  "Introducing Quality Assurance
Into Website Development: A Case Study For Website Quality Control In A
Small Company Environment (1I)"

Ms. Nicole Levy [France] (Laboratoire PRISM) "Quality Characteristics to
Select an Architecture for Real-Time Internet Applications (2I)"

Mr. Massimiliano Spolverini [Italy] (Etnoteam - Consulting Division)
"Measuring And Improving The Quality Of Web Site Applications (3I)"

Mr. Adrian Cowderoy (NexusWorld Limited) "Complex Web-Sites Cost More
To Maintain - Measure The Complexity Of Content (4I)"

Mr. Rakesh Agarwal, Bhaskar Ghosh, Santanu Banerjee & Soumyendu Pal
[India] (Infosys Technologies Ltd) "Challenges And Experiences In
Establishing WebSite Quality (5I)"


Mr. Ton Dekkers (IQUIP Informatica B.V.)  "Quality Tailor-Made (QTM) (1M)"

Mr. Kie Liang Tan [Netherlands] (Division Testmanagement & Consultancy)
"How To Manage Outsourcing Of Test Activities (2M)"

Mr. Kees Hopman [Netherlands] (IQUIP Informatica BV) "How to Implement
New Technologies? Four Proven Cornerstones for Effective Improvements
(3M)"

Mr. Andreas Birk & Wolfgang Mueller [Germany] (Fraunhofer Institute)
"Systematic Improvement Management: A Method For Defining And
Controlling Customized Improvement Programs (4M)"

Mr. Oliver Niese, Tiziana Margaria, Markus Nagelmann, Bernhard Steffen,
Georg Brune & Hans-Dieter Ide [Germany] (METAFrame Technologies GmbH)
"An Open Environment For Automated Integrated Testing (5M)"

     Thursday, 23 November 2000, 8:30 - 10:00 -- KEYNOTE SESSION #2

Ms. Lisa Crispin (iFactor-e) "Stranger in a Strange Land: Bringing QA to
a Web Startup (K21)"

Mr. Hans Buwalda [Netherlands] (CMG Finance BV) "Soap Opera Testing
(K22)"

 Thursday, 23 November 2000, 11:00 - 17:00 -- Parallel Technical Tracks


Mr. Francesco Piazza [Italy] (Alenia Aerospazio) "A Requirements Trace
Application (6T)"

Mr. Gary Mogyorodi [Canada] (Technology Builders, Inc.)  "Requirements-
Based Testing: An Overview (Process and Techniques for Successful
Development Efforts) (7T)"

Mr. Tobias Mayer (eValid, Inc.)  "Browser Based Website Testing
Technology (8T)"

Dr. Rainer Stetter [Germany] (Software Factory & ITQ GmbH) "Test
Strategies for Embedded Systems (9T)"

Dr. Ray Paul & Dr. Wei-Tek Tsai [USA] "Assurance-Based Testing: A New
Quality Assurance Technique (10T)"


Mr. Jacobus DuPreez & Lee D. Smith (ARM Ltd.)  "SPI: A Real-World
Experience (6A)"

Ms. Jill Pritchet & Mr. Ian Lawthers [Ireland] (Centre for Software
Engineering) "Software Process Improvement for Small Organizations using
the "SPIRE" Approach (7A)"

Mr. Gunthard Anderer [Germany] (CMG ORGA - Team GmbH) "Testing E-
Commerce Systems - Requirements And Solutions (8A)"

Dasha Klyachko [UK] (Allied Testing) "Specifics of E-Testing: Difference
Between Traditional and On-Line Software Development and Its Effect on
Testing (9A)"

Mr. Bob Bartlett [UK] (SIM Group Limited) "A Practical Approach to
Testing your eCommerce Web Server (10A)"


Mr. Olivier Denoo [Belgium] (ps_testware) "Usability: A Web Review (6I)"

Mr. Fernando T. Itakura, Ms. Silvia R. Vergilio [Brazil] (Crosskeys
Systems Corporation) "Automatic Support For Usability Evaluation Of A
Web Application (7I)"

Mr. Mark Atkins [USA] (Vality Technology) "Ensuring Data Quality for E-
Commerce Systems (8I)"

Mr. Adrian Cowderoy [UK] (NexusWorld Limited) "Technical Quality Is
Just The Start -- The Real Battle Is Commercial Quality (9I)"

Mr. Robert L. Probert, Wujun Li, Mr. Paul Sims [Canada] (School Of
Information Technology And Engineering) "A Risk-Directed E-Commerce Test
Strategy (10I)"


Mr. Karl Lebsanft & Mr. Thomas Mehner [Germany] (Siemens AG) "CMM in
Turbulent Times - Is CMM a Contradiction to Innovation? (6M)"

Mr. Luis Filipe D. Machado, Ms. Kathia M. de Oliveira & Ms. Ana Regina
C. Rocha [Brazil] (Federal University Of Rio de Janeiro) "Using
Standards And Maturity Models For The Software Process Definition (7M)"

Mr. Martin S. Feather, Mr. Tim Kurtz [USA] (NASA Glenn Research Center)
"Putting It All Together: Software Planning, Estimating And Assessment
For A Successful Project (8M)"

Dr. Esther Pearson [USA] (Genuity) "Website Operational Acceptance
Testing: Process Assessment and Improvement (9M)"

Mr. William E. Lewis [USA] (Technology Builders) "A Continuous Quality
Improvement Testing Methodology (10M)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

  Friday, 24 November 2000, 9:00 - 11:00 -- Parallel Technical Tracks


Mr. Richard Kasperowski & Mr. Spencer Marks [USA] (Altisimo Computing)
"Building Better Java Applications (11T)"

Mr. Sanjay DasGupta & Indrajit Sanyal [USA] (Usha Communications
Technology) "A Java-XML Integration for Automated Testing (12T)"


Mr. Nigel Bevan, Mr. Itzhak Bogomolni [UK] (Serco Usability Services)
"Incorporating User Quality Requirements In The Software Development
Process (11A)"

Mr. Adam Kolawa [USA] (ParaSoft) "Testing Dynamic Web Sites (12A)"


Mr. Bob Bartlett [UK] (SIM Group Ltd.)  "Experience Testing E-commerce
Systems (11I)"

Mr. Steven D. Porter [USA] (Practical Consulting Group) "From Web Site
To Web App: Ensuring Quality In A Complex Environment (12I)"


Mr. Vassilios Sylaidis, Mr. Dimitrios Stasinos, Mr. Theodoros [Greece]
(INTRACOM S.A.)  "Software Development Process Improvement For
Telecommunications Applications By Applying Gilb's Inspection
Methodology (11M)"

Ms. Tuija Lamsa [Finland] (University of Oulu) "Using Knowledge
Management in the Quality Improvement of the Process (12M)"

     Friday, 24 November 2000, 11:00 - 12:00 -- KEYNOTE SESSION #3

(European Commission Speaker) "European Commission Initiatives in
Software Technology (K31)"

Mr. Thomas A. Drake (Integrated Computer Concepts, Inc. (ICCI)) "The
Future Of Software Quality - Our Brave New World - Are We Ready? (K32)"

                     QWE2000 ADVISORY BOARD MEMBERS

    Gualtiero Bazzana (Onion, Italy) -- Boris Beizer (Analysis, USA)
  Antonia Bertolino (IEI/CNR, Italy) -- Nick Borelli (Microsoft, USA)
Rita Bral (SR/Institute, USA) -- Gunther Chrobok-Diening (Siemens, Germany)
 Adrian Cowderoy (NexusWorld, England) -- Sylvia Daiqui (DLR, Germany)
      Tom Drake (CRTI, USA) -- Istvan Forgacs (Balthazar, Hungary)
Dick Hamlet (PSU, Ireland/USA) -- Franco Martinig (Martinig&Assoc, Switzerland)
  Edward Miller (SR/Institute, USA) -- Michael O'Duffy (CSE, Ireland)
         Jens Pas (I2B, Belgium) -- Linda Rosenberg (NASA, USA)
     Henk Sanders (CMG, Netherlands) -- Harry Sneed (SES, Germany)
Bernhard Steffen (Univ. Dortmund, Germany) -- Ruud Teunissen (GiTek, Belgium)
Wei-Tek Tsai (Arizona State, USA) -- Erik VanVeenendaal (IQS, Netherlands)
                 Otto Vinter (Bruel and Kjaer, Denmark)

            R E G I S T R A T I O N   I N F O R M A T I O N

Complete registration with full information about the conference is
available on the WWW at


where you can register on-line.

We will be pleased to send you a QWE2000 registration package by E-mail,
postal mail or FAX on request.  Send your E-mail requests to:


or FAX or phone your request to SR/Institute at the numbers below.

          QWE2000: 20-24 November 2000, Brussels, Belgium  EU

| Quality Week Europe Registration  | Phone:       [+1] (415) 861-2800 |
| SR/Institute, Inc.                | TollFree (USA):   1-800-942-SOFT |
| 1663 Mission Street, Suite 400    | FAX:         [+1] (415) 861-9801 |
| San Francisco, CA 94107 USA       | E-Mail:     |
|                                   | WWW: |


        SERG Report 389: Limitations of Backward Error Analysis
                 Dr. Sanzheng Qiao and Dr. David Parnas

Abstract:  We know that a small backward error given by a backward error
analysis of a numerical method ensures the stability of the method. In
this paper, we show, through examples, that a large backward error or
non-existence of backward error does not imply instability.  In fact, a
method can be stable and deliver accurate results although the backward
error is large.
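For the report's examples you will need the report itself, but the notion of
backward error is easy to illustrate. As a generic sketch (not taken from the
report), the Rigal-Gaches normwise backward error of a computed solution
x^ to Ax = b can be evaluated directly from the residual; a small value
certifies that x^ exactly solves a nearby system, which is what a backward
error analysis guarantees:

```python
import numpy as np

def backward_error(A, b, xh):
    # Normwise backward error of a computed solution xh to A x = b:
    # the size of the smallest perturbation (dA, db) such that
    # (A + dA) xh = b + db exactly.  Small value => xh solves a
    # nearby problem (backward stability).
    r = b - A @ xh                       # residual of the computed solution
    return np.linalg.norm(r) / (
        np.linalg.norm(A) * np.linalg.norm(xh) + np.linalg.norm(b)
    )

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
xh = np.linalg.solve(A, b)               # a stable solver: tiny backward error
print(backward_error(A, b, xh))          # roughly machine-epsilon sized
```

Note the implication runs one way only: the report's point is that a large
(or even non-existent) backward error, as this quantity would show for a
poor x^, does not by itself prove the method that produced x^ is unstable.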

The web address for downloading reports is:


                         Test Careers Post 2000
                            David L. Moore

The year 2000 crisis has come and gone and, depending on your
perspective, it was a big letdown, a great disaster or a tremendous
success. But how did the immovable deadline and the forced testing
regime affect the computer industry? In particular, how has it
affected the people that some see as having benefited the most - the
testers?

Y2 OK - Not!

It seems like the year 2000 problem is ancient history now. But shortly
after the rollover, media from around the world were reacting very
differently to the same uninteresting outcome.

In the USA, in what some would say is a typically American response,
they were congratulating themselves on a job well done. No planes had
fallen out of the sky, no nuclear power plants had melted down and
everyone seemed to have water and power except for those affected by the
usual collection of natural disasters.

In Australia and the UK questions were being asked as to why we spent so
much time and money on an event that was such a fizzer. Was it all a
scam by the computer industry to dupe the technologically impaired?

Kelvin Ross, director of Queensland based software-testing consultancy
K. J. Ross and Associates, seems to think not. "Some of the defects that
were found during testing could have had far reaching consequences," he
said. "It is mainly the smaller organizations that were led to believe
that PCs and embedded systems would be a problem that are seeing it as a
bit of a dud."

Anyone involved in the effort knew that the goal was a smooth transition
across the date rollover. The last thing that anyone likes doing is
spending money to end up exactly where they started. Aside from that it
isn't interesting. The popular media had to try to get some value for
money out of the story. The world hadn't ended so the story had to be
spun some way to make it interesting.

Dan O'Leary of Morgan and Banks Technology, who recruits software
testers for IT companies, put his own spin on it. "We've just spent
billions and billions of dollars and created a whole new industry in
which software testing is a significant part. Imagine the questions that
would have been asked if all that had failed and Y2K was a disaster."

According to Ross, a number of larger organizations were shocked to see
some of the errors that were found during Y2K testing, but he said that
it also pointed to systems that had pretty poor quality already.  "I think that
these larger organizations saw it as spending money to get their house
in order, and to set up some processes for the future."

Testing opportunities

It would be easy to think that the year 2000 issue was over and all the
testers could crawl back into the holes from which they came, but not
before taking a long holiday on all the money they earned. But has this
issue really been put to rest? From a testing perspective, maybe, but
Ross is
"... sceptical (that) we haven't heard the full story, even still.  I
know of a few isolated cases with banks and large retailers that have
had problems in parts of their business, but were ordered not to mention
the 'Y2K' word and to treat it as a normal day-to-day problem."

Whether it is a day-to-day problem or residual year 2000 problem the
fact is there is a quality problem and testing and testers can help.

O'Leary points out that many of the organizations that brought on
testers for year 2000 issues suddenly realized that they played an
important role in the company's future and kept them on thereafter. An
awareness of the need for quality resulted from the enforced rigor of
year 2000 remediation. There hasn't been a flood of testers onto the
market as predicted. Many have stayed on. To a degree development seems
to have learned its lesson.  Some of this can be attributed to the fact
that many problems, aside from year 2000 ones, were found during
remediation. O'Leary also says that, through necessity, these non-year
2000 issues were put on the back burner and are only now just having the
heat turned back up. There is a lot of catching up to do it seems.

But what of other impending software chaos? What about the GST
implementation? We don't have two thousand years warning for that one.
Do you think we'll be ready? O'Leary believes there are two aspects to
the GST testing issues. One is that the GST was included in the year
2000 remediation and is part of the backlog being addressed as described
above and the other is that the business analysis is still being
undertaken and that we just don't know the impact yet.

In Europe, Joe Marshall, director of recruitment agency People in
Computers, sees "...with the conversion work coming up for the EURO I
see quite a lot of people being needed later this year (and) perhaps
early next year."

In the USA, social security numbers are about to run out. The existing
numbering scheme doesn't have enough digits and, for obvious security
reasons, they cannot recycle them. For those with X-Files-like paranoia,
the size of this remediation task could be, or already is, huge.

However, getting work in the UK and the USA is far from straightforward
and my information is that testers are better regarded in Australia than
they are in most other places. We certainly seem to be paid better than our
foreign counterparts. According to Dan, Australia is also experiencing
an influx of testers from these places. Some are Australians returning
home and some are tourists on working visas. Maybe these places are now
experiencing tester shortage as a result of this exodus or perhaps there
is actually little for them to do.

A drama is a good place for somebody, anybody, to get a start in
testing. Y2K proved it. If you have some testing experience, want to
progress your testing career and learn how to do something other than
panic, my advice is to avoid the dramas and hunt down the jobs with
companies that know what they are doing.

How do you become a tester?

I haven't met a single tester who made the choice to become a tester
before entering the work force. We have all arrived here by some
roundabout means. It is hard to imagine a young child tugging at his
father's shirt tail and exclaiming that he wants to be a tester in the
same way that you and I may have said that we wanted to be a police
officer.

Many of the people called in to do year 2000 testing were experienced
users of the system undergoing remediation. For other systems, people
often get dragged in for user trials and somehow wind up becoming
testers. This would indicate that a tester doesn't need to possess any
particular skills. But nothing could be further from the truth.

Many testers drop out of development, disgruntled with unrealistic
deadlines and being unable to produce good products. It may seem easier
to pick holes in the work that others do than write your own bugs. It
isn't long before this simplistic view of testing either forces people
back to development or wakes them up to the complex nature of building
quality into a product rather than tacking it on at the end. Others
might find, much as I did, that really they had been a tester trapped in
a developer's body and had really been testing all along. They had just
been doing it within a development role.

Technical writers often become testers. Charged with the duty of
producing a user manual they are often the first, and only, person to
put a system through its paces. If the technical writer possesses some
degree of work ethic they are duty bound to report the inevitable errors
that they find. A similar sort of thing may happen to business analysts.
Having defined the system or the business rules, they are the natural
selection to ensure that the system does what they wanted it to.
However, for both technical writers and business analysts, the jump to
testing may feel like a regression rather than an evolution. Spotting
the potential of a testing job from this perspective may be difficult.
Business analysts and technical writers are often already more senior
than the testing role they are forced to perform.

Acquaintances swear to me that computer science students and budding
software engineers are taught all the best software testing practices.
The problem is that none of them become testers. They all become
developers or engineers and have the optimism and quality practices
squeezed out of them by the first deadline they meet. If you don't use
it you lose it, and before you know it they are hacking code to meet
diabolical deadlines like the rest of the software industry. "Unit
testing, you've got to be joking. I don't have time for that!".

Above all though, I believe people are born to test. Almost everything
you buy has a defect one way or another. As soon as I joined the work
force, not in IT mind you, I earned the nickname "Errol". You'll have to
email me for the reason. It is how you go about things that makes you a
tester, not what you used to do.

The Y2K testing bandwagon

During the lead up to the millennium crisis it seemed like anyone with
the word "test" in their resume was being hired at exorbitant rates to
help tackle the problem. A combination of ignorance of the testing role
by employers and desperation meant that many non-testers jumped onto the
bandwagon. There was a significant risk that neither party was going to
come out of the relationship happy. Dan O'Leary makes the point that
"... a lot of what was called testing was really vendor analysis. The
vendors were supplying information and the 'testers' were just running
comparisons. It wasn't really testing."

But where have all these "testing wanna-bes" gone? The expected flood of
testers post Y2K has not occurred. Many of the people that jumped onto
the bandwagon have jumped back off. According to O'Leary many of the
non-IT sourced testers have gone back to their non-IT origins. Similarly
for the IT development sourced testers. Most have gone back to
development. When I asked him if he thought they'd returned with a
new-found understanding and respect for the testing profession, he
answered

The depth to which many of these people had inspected systems meant that
they now had obtained valuable system knowledge. Many employers, while
not keeping them on in an IT role, kept them on from the business
perspective, reluctant to let these vessels of knowledge leave.

Kelvin Ross sees it slightly differently; "As far as other testers
flooding the market I don't know if that is happening.  A few are
trying, but because they (weren't part of) an existing test organization
with runs on the board they are having difficulties.  It may be more a
case of the test contractor market being flooded, but I haven't really
seen any evidence of that either."

The original motivation for many Y2K testers was obvious. The financial
incentive was great. Of those that became testers for this reason, while
better pay is still a factor, O'Leary believes that they are staying in
testing for the "right reasons" - a sense of achievement, variety of
work and a feeling of improving the product. As for the rest Ross says
"...I think these guys will disappear.  And perhaps it will force people
to look for testing skills other than Y2K only."

Many people that were already testing prior to Y2K avoided it like the
plague. Even those involved referred to it as Groundhog Day: "I wake
up each morning and we are still trying to push through the same test
which the system keeps rejecting, or needs to be rerun as the system has
changed." The work was seen as mechanical and dull. There is a limit to
how many times the average intelligent human can type in the same subset
of numbers without going insane.

The existing career testers have come through this crisis much better
off than expected. These people are now quite clearly senior to the
majority of Y2K testers and they now have a pool of resources that are
keen to test and learn. Employers are now looking to these "senior
testers" to implement testing improvements, convinced that testing is
valuable and being more aware of what good testing should look like.

Becoming a better tester

As with most careers the best way to become better is to want to become
better and never assume that you know it all.

Obtaining further education as a tester is a difficult task. Courses are
few and far between. Good courses are even more difficult to find.
Nevertheless, you have to track down every scrap of training you can. If
you can walk away from a course with one or two practical points to make
your daily job better and easier, then it has been worthwhile.

Certification of testers is a hot topic these days and I am sceptical
about its value. Certification is available in the USA and the UK and
opinions of its value vary. In Australia some short courses offer the
option of examination. I am pretty sure that if I were put through a
certification process I'd fail. Having to stay within a narrow band of
criteria would feel like a straitjacket to me. The tester in me would
have to step outside of the certification and test the certification
itself. It would be like trying to certify an artist. It can't be
coincidence that one of the benchmark books on testing is called "The
Art of Software Testing".

There are plenty of tool vendors about that periodically host seminars.
Despite the sales oriented nature of these events they are good places
to pick up some tips and ideas. They are also good places for
networking. Similarly SPIN (Software Process Improvement Networks)
groups and SQA (Software Quality Association) meetings can be a good
source of information.

Testing conferences have not taken off in Australia. There have been a
few attempts - most tool vendor sponsored. The cynical nature of testers
tends to keep most away from these.

There is a need for a well-publicized independent conference. First hand
discussions with some of the world's leading test authorities indicate
that "If you build it, they will come". At the moment there doesn't
appear to be an Australian body with enough interest and money to stage
such a conference. If there is, contact me, I'd like to be involved.

The international conferences, of which there are many, are fantastic
sources of all things test related. The expense of these is a barrier
for most Australian testers. It is a pity because you'll get more out of
these than you will any other format of testing education.

Fortunately there are loads of books on testing and even a couple now on
the dangerous area of software test automation. My advice is to go for
the practical ones. There are plenty of academic and scientific works on
the subject. While interesting, they won't help your day-to-day job all
that much.

One of the best sources of test information on the web is the newsgroup's FAQ. It contains, and points to, a swag
of testing information. The testing gurus of the world frequently
participate in lively debates in this newsgroup as well.

Dan O'Leary says the "...testers that stand out are the ones that get
out and talk to people and find out what the business' needs are." They
get out of the test environment and interface between development,
business and users. Those who lock themselves in an environment and do
testing as they think it should be done will probably not succeed as
much as those who take the opportunity to get out there and look at the
real world. They
need to see what the people expect of the product, not just what the
developers and company want. Testers who have been involved in the
entire product lifecycle from start to finish, the "completers", will
stand out. O'Leary also says that more employers are aware of software
development and testing standards, so your average tester has to be
aware of how to work within these frameworks.

A testing career

Perhaps one of the things that does the most disservice to testing as a
career is the perception that it is a simple task to be performed by
junior staff. Kelvin Ross says that testing "... definitely takes a
mind-set quite different to development, and there are quite a few
people who really fit into that mind-set. I think testing is definitely a
career path."

While it still seems that the highest a tester can go is Test Manager,
O'Leary points out that this really should not be the limit within
quality-focused companies. The perspective that good testers have of the
product lifecycle, customer focused quality and dealing with people
means that they have the potential to be better project managers than
the typical development-sourced project manager. It makes sense. The
compartmentalization of the typical development career means that
development-sourced project managers have "seen it all", but only a
piece at a time. Testers tend to have a big picture view right from the
earliest days of a project.  Extrapolation of this idea makes it clear
that, within a quality focused company, there should be no limit to how
high up the corporate ladder you can go. You'll still have the tester's
mind-set, you just won't be getting your hands dirty. The irony is that
you will have become one of the people you have been battling for
understanding your entire career; just don't forget your roots!

Alternative career paths also include contracting and consulting. While
potentially lucrative, these demand significantly different mind-sets
from that of the career tester. Not everyone is suited to contracting,
and even fewer fit into the consultant frame of mind. O'Leary sees that
"contracting and consulting is not necessarily a step up". Many people
looking for an upward career move in these avenues end up getting their
hands dirty again when what they really wanted to do was higher-level
strategic activities.

Growth business areas for testers

Superficially there may seem to be a testing vacuum created by the
passing of Y2K. However, the increasingly pervasive nature of the IT
industry into every day life means that there will always be plenty of
software to test. Significant and obvious bugs can be found in consumer
products with embedded systems. My mobile phone has a menu bug and my
television has an on-screen menu with an annoying undocumented feature.

E-business is exploding and with this growth comes the pressure for
existing companies to keep up. This is often at the expense of quality.
A casual survey of testing peers indicated that not one of them would
enter their credit card number into an online system, and neither would
I. This is not just tester cynicism. Another casual survey of people who
have participated in online purchases revealed that roughly 80 per cent
of them had experienced fraudulent purchases on their credit card
subsequent to the on-line transaction. Most of these people also
reported multiple incidents. While the consumer is protected to a degree
from liability, there comes a point where it is easier to cancel the
card and start again than to fight all the battles over mysterious
charges.

It won't take long for online businesses to realize that in the long run
they are paying for this, not the consumer. Clearly there is plenty to
be fixed here and testing must be an integral part of this.

The Internet is increasingly the primary public face for many companies.
There are numerous web sites that convey little more than a slipshod
approach to business. It is not hard to find poorly designed and poorly
performing web sites. You only have one chance to impress a potential
customer, and on the Internet the time you have to do it is measured in
seconds.

Windows 2000 has been released and, by some reports, contains 63,000
defects. The shrink-wrapped Windows-based application market now has to
catch up, and as long as Microsoft keeps fixing bugs and deficiencies
they will have to keep catching up. There is plenty of work in this can
of worms for testers and developers alike.

The emergence of Linux as a serious contender in the OS market means
that commercial products will have to migrate to it. With this burst of
development a corresponding test effort must take place, or Linux will
become the same bloated, unstable wreck that many believe Windows
already is.

Prepare now for Y1M

The bottom line is that as long as software development is taking place
there will always be a need for software testers. The nature of modern
software and development is constantly imposing greater testing
problems. It is merely the awareness of the need, and value, of testing
that will cause fluctuations in the job market for testers. At the
moment that awareness is growing, and so too is the need for good
testers.

One thing still concerns me, though: as a tester I am forced to
question when the real Y2K problem is going to occur. According to
metric definitions, a "k" is 10^3, that is, 1000. But we have been told
that the problem is actually related to "K", which is 2^10, that is,
1024. I don't know about you, but I couldn't go through all this again
in the year 2048, even if I am still alive.

David L. Moore is a director of Chequered Gecko Pty Ltd, a Sydney based
software testing consultancy focusing on inception to completion process
improvement. He can be contacted through his web site at


eValid Ver. 2.1 Now Available

eValid is a Test Enabled Web Browser(tm) that performs all the functions
needed for detailed WebSite static and dynamic testing, QA/Validation,
page tuning, and load generation.  eValid runs on Windows 98/NT/2000.

eValid's Ver. 2.1, now available for download, includes a rich feature
set that makes your WebSite quality project as simple as possible:

   * All Functions Available In Menu Pulldowns
     - Full capability browser (100% IE compatible)
     - Totally ObjectMode plus RealTime record/play operation
     - Simple, editable script language
     - Playback logfiles are spreadsheet and database ready
   * Functional Testing Support
     - Multiple validation modes
     - Screen area validation and synchronization
     - Secure session support
     - Java Applet, ActiveX Control support
     - JavaScript, VBScript support
     - Tests with Flash, Shockwave, etc.
     - Wizards to test links, objects, elements, etc.
   * Script Playback Control Flexibility
     - Pause/Resume, Run Multiple, Run Forever
     - Session timing limits and alarms
     - Command line interface, API
   * WebSite Page Tuning
     - Complete, detailed timings including page rendering times
     - Page component timing breakdown
     - Built-in graphical reporting
   * Loading and Capacity Analysis
     - 100% browser based user scenario simulations
     - Multi-user load scenario definition
     - Multiple browser (e.g. 250+) auto-launch
     - Dialup modem simulation

See what people are saying about eValid:

Try out a DEMO Version of eValid Ver. 2.1 by downloading from:

Or, download the FULL version and request an EVAL key from:

eValid's Page Tuning feature is illustrated at:

The eValid LoadTest feature is described at:

Details from ; license keys from 


     Quality Week 2000 - "Ask the Quality Experts!" Panel Summary
                             (Part 3 of 3)

      Note: This discussion is the approximate transcript of the
      "Ask the Quality Experts!" panel session at QW2000.  The
      questions were posed to a web page sponsored by Microsoft
      (courtesy of Panel Chair, Nick Borelli) who voted on each
      topic.  The top-voted questions were answered first.

                Ask the Quality Experts! Panel Members:

                    Mr. Nick Borelli, Microsoft, USA
             Dr. John D. Musa, Independent Consultant, USA
                    Prof. Lee Osterweil, UMASS, USA
                      Mr. Thomas Drake, ICCI, USA
                Mr. Robert Binder, RBSC Corporation, USA

*** How many defects ARE there in Windows 2000?

Interesting question.  Everyone has probably heard the 63,000 rumor.
The question is, how many lines of code are in Windows 2000?  Is 63,000
good or bad?  Understanding bugs per line of code is good.  It's
interesting that there are bugs in the software, but how many of those
bugs are going to be found, how likely are they to be found, and by how
many people?  Shipping something so immense is not an easy task, and
trying to lock down a huge software base is a difficult job.  When I
first got into QA I really thought zero-defect software had to be
achievable.  But once you really understand what that would mean and
what it would cost, you realize that it is not achievable.

I did some studying... If we assume that there are 35,000,000 lines in
Windows 2000 and take the published report of 63,000 bugs, then when
you run the conversion, that's 1.77 bugs per 1000 lines of code, which
is quite low in several circles according to one of the panelists.
Another disagreed and said it was an order of magnitude worse than
normal.
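The conversion being discussed is simple division: defects divided by
lines of code in thousands. A minimal sketch in Python (the figures are
the panel's assumptions; note that with exactly 35,000,000 lines the
result comes out to 1.8 per KLOC, so the quoted 1.77 implies an
assumed size closer to 35.6 million lines):

```python
def defects_per_kloc(defects: int, lines_of_code: int) -> float:
    """Defect density: reported defects per thousand lines of code."""
    return defects / (lines_of_code / 1000)

# The panel's assumed figures: 63,000 bugs, 35 million lines of code.
density = defects_per_kloc(63_000, 35_000_000)
print(round(density, 2))  # prints 1.8
```

Whether 1.8 per KLOC is "quite low" or "an order of magnitude worse
than normal" depends entirely on whose published baseline you compare
it against, which is exactly the disagreement the panel had.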

> Comment from audience - It's important to differentiate between
"defect" and "bug"

One of the panelists asked Nick Borelli if there was an official
number.  Nick responded: "We certainly don't disclose an official
number of bugs.  I don't know what the number is."

As a user, is the product more reliable?

How long does it run without a failure?  I don't care how many bugs are
in there.  This whole bug number is the wrong way to go.

I don't know if I agree.  Although I support your point that what
matters to customers is reliability and what you see... these kinds of
ratios are useful for giving us some kind of quantitative feedback
about how we're doing.  These numbers are better than nothing.  The
important question is: what ought to be the perspective of a software
producer?  Bug counts are a piece of information that can be important.

If you're in charge of a testing group, take snapshots of how many bugs
are reported that your testing team didn't find (after six months)
compared to how many bugs your team found.  It's also important to see
how many bugs customers found that your triage process chose not to
fix.
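One common way to express that kind of snapshot is the Defect Detection
Percentage: the share of all known defects that the test team caught
before release. A minimal sketch, with purely illustrative numbers:

```python
def detection_percentage(found_in_test: int, found_after_release: int) -> float:
    """Defect Detection Percentage: the fraction of all known defects
    the test team found before release, expressed as a percentage."""
    total = found_in_test + found_after_release
    return 100 * found_in_test / total

# Illustrative figures: 450 bugs found in test, 50 more reported in
# the six months after release.
print(round(detection_percentage(450, 50), 1))  # prints 90.0
```

The metric is only as honest as the snapshot window: take it too soon
after release and the field has not yet had time to find its share.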

> Comment from audience - You also have to keep bugs in context.  There
is a huge difference in severity and thus importance of bugs.

> Comment from audience - A lot of times there are bugs logged against
the process itself, not against the product.

> Comment from audience - I use bugs as a negotiating tool to leverage
against management.  We can estimate how long it will take to run test
cases based on the fact that our developers probably did x amount of
damage in the latest build.

You can predict how much longer you have to continue developing your
software based on how many bugs there are to fix.

*** Extra Questions From The Audience

> Question from the audience - There's been a mention of extreme
programming.  How do the experts think this might affect testing?

The thing that I like about it is that it makes the tester an equal
part of the development process.  I strongly support it.  Their
strategies for test design are superficial, though, and I think they
should go a lot further.  The slogan is "continuous integration and
relentless testing".

By moving in parallel, you are more able to achieve the cycle times
needed for the Internet.

If your test group does too well, developers may not be encouraged to
improve their process.  10% of a tester's time during a year is spent
on process improvement.  For development it was 0.8%, which is scary.
Moving to extreme programming may encourage developers to do more
process improvement.

> Question from the audience - Has anyone come across metrics for HTML
or JavaScript?

Not aware of any.  Maybe if you had function points in the Java or HTML
code you could make a comparison.

This is a new area that may be emerging.  I am aware of syntax checkers.

      ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

QTN is E-mailed around the middle of each month to over 9000 subscribers
worldwide.  To have your event listed in an upcoming issue E-mail a
complete description and full details of your Call for Papers or Call
for Participation to "".

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should provide at
  least a 1-month lead time from the QTN issue date.  For example,
  submission deadlines for "Calls for Papers" in the January issue of
  QTN On-Line should be for February and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters; QTN disclaims any responsibility for their content.

STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR logo are
trademarks or registered trademarks of Software Research, Inc. All other
systems are either trademarks or registered trademarks of their
respective companies.

          -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to CANCEL a current subscription, to CHANGE an
address (a CANCEL and a SUBSCRIBE combined) or to submit or propose an
article, use the convenient Subscribe/Unsubscribe facility at:


Or, send Email to "" as follows:

   TO SUBSCRIBE: Include this phrase in the body of your message:


   TO UNSUBSCRIBE: Include this phrase in the body of your message:


   NOTE: Please, when subscribing or unsubscribing, type YOUR  exactly and completely.  Note that unsubscribes that don't
   match an email address on the subscriber list are ignored.

	       Software Research, Inc.
	       1663 Mission Street, Suite 400
	       San Francisco, CA  94103  USA
	       Phone:     +1 (415) 861-2800
	       Toll Free: +1 (800) 942-SOFT (USA Only)
	       Fax:       +1 (415) 861-9801
	       Web:       <>

                               ## End ##