sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======           February 2001             =======+

QUALITY TECHNIQUES NEWSLETTER (QTN) is E-mailed monthly to Subscribers
worldwide to support the Software Research, Inc. (SR), TestWorks,
QualityLabs, and eValid user communities and other interested parties to
provide information of general use to the worldwide internet and
software quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN provided that the entire
document/file is kept intact and this complete copyright notice appears
with it in all copies.  Information on how to subscribe or unsubscribe
is at the end of this issue.  (c) Copyright 2001 by Software Research,
Inc.


                         Contents of This Issue

   o  Quality Week 2001 Tutorials and Technical Program Announced

   o  Word Play, By Ann Schadt

   o  Software Engineering Learning using Case Studies, by Lawrence
      Bernstein and David Klappholz

   o  eValid Updates

   o  ICSE 2001: 23rd International Conference on Software Engineering

   o  A Current Senate Proposal (Heard Going Around)

   o  QTN Article Submittal, Subscription Information


      Quality Week 2001 Tutorials and Technical Program Announced

We are proud to announce the complete technical program for QW2001, set
for 29 May 2001 through 1 June 2001 in San Francisco.

QW2001 consists of a day of full-day and half-day Pre-Conference
tutorials by world-renowned experts; a three-and-one-half-day, six-track
Technical Conference; and a two-day exhibit.  The event is followed by
four Post-Conference workshops.

Complete details are at the conference website:


The Tutorial, Conference, and Workshop speakers are listed below.
Session ID tags are given in ()'s after each title.

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

               A D V I S O R Y   B O A R D  M E M B E R S

Conference papers are reviewed and selected by an international team of
experts.  They put the "Quality" in Quality Week!

            James Bach (Satisfice) * Boris Beizer (Analysis)
        Bill Bently (Mu_Research) * Antonia Bertolino (IEI/CNR)
            Robert Birss (WebTV) * Nick Borelli (Microsoft)
        Rita Bral (SR/Institute) * Dr. Cem Kaner (Univ. Florida)
              Taz Daughtrey (ASQC/SQP) * Tom Drake (ICCI)
      Elizabeth Hendrickson (QualityTree) * Bernard Homes (Tessco)
             Bill Howden (UC/San Diego) * Andre Kok (CMG)
         Ruth Levy (CRIM) * Peter Liggesmeyer (Univ. Potsdam)
       Glen Meisenheimer (SiteROCK) * Atif Memon (U. Pittsburgh)
           Edward Miller (SR/Institute) * Kris Mohan (Intel)
                John Musa (Consultant) * Jens Pas (I2B)
               Greg Pope (AZOR) * Linda Rosenberg (NASA)
        Stefan Steurs (Eurocontrol) * Keith Stobie (BEA Systems)
         Dolores Wallace (Wallace Systems) * Mark Wiley (nCUBE)
                         Denise Woit (Ryerson)

                           T U T O R I A L S

           Tuesday, 29 May 2001, 8:30 - 12:00 -- AM Tutorials

Mr. Tom Gilb (Result Planning Limited) "Software Inspection For The
Internet Age: How To Increase Effect And Radically Reduce the Cost (A1)"

Mr. Erik Simmons (Intel Corporation) "Writing Good Requirements (B1)"

Dr. Norman Schneidewind (Naval Postgraduate School) "A Roadmap To
Distributed Client-Server Software Reliability Engineering (C1)"

Mr. Gualtiero Bazzana (ONION, S.P.A) "Web Testing Techniques and Tools
(D1) (D2)"

Mr. Robert Sabourin (AmiBug.Com) "Getting Started -- Stressing Web
Applications: Stress Early -- Stress Often (E1)"

Mr. Ross Collard (Collard and Co.)  "Test Estimating (F1)"

Mr. Thomas Drake (Integrated Computer Concepts, Inc ICCI) "The Quality
Challenge For Network Based Software Systems (G1)"

           Tuesday, 29 May 2001, 1:30 - 5:00 -- PM Tutorials

Mr. Tom Gilb (Result Planning Limited) "Software Inspection For The
Internet Age: How To Increase Effect And Radically Reduce the Cost (A1)"

Mr. Bill Deibler (Software Systems Quality Consulting) "Making the CMM
Work: Streamlining the CMM for Today's Projects and Organizations (B2)"

Dr. John Musa (Consultant) "More Reliable Software Faster And Cheaper"

Mr. Gualtiero Bazzana (ONION, S.P.A) "Web Testing Techniques and Tools
(D1) (D2)"

Dr. Edward Miller (eValid, Inc.)  "Client-Side WebSite Testing (E2)"

Dr. Cem Kaner (Florida Institute of Technology) "Teaching Testing: A
Skills-Based Approach (F2)"

Mr. Ed Kit (SDT Corporation) "Establishing a Fully Integrated Test
Automation Architecture (G2)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

                   T E C H N I C A L   P R O G R A M

                            KEYNOTE SESSIONS

Ms. Lisa Crispin (iFactor-e) "The Need For Speed: Automating Functional
Testing In An eXtreme Programming Environment (QW2000 Best Presentation)"

Mr. Dave Lilly (SiteROCK Corporation) "Internet Quality of Service
(QoS): The State Of The Practice (1P2)"

Dr. Linda Rosenberg (GSFC NASA) "Independent Verification And Validation
Implementation At NASA (5P1)"

Dr. Dalibor Vrsalovic (Intel Corporation) "Internet Infrastructure: The
Shape Of Things To Come (5P2)"

Mr. Hans Buwalda (CMG) "The Three "Holy Grails" of Test Development
(...adventures of a mortal tester...) (10P1)"

Mr. Ed Kit (SDT Corporation) "Test Automation -- State of the Practice"

Mr. Thomas Drake (Integrated Computer Concepts, Inc ICCI) "Riding The
Wave -- The Future For Software Quality (10P3)"

                            QUICKSTART TRACK

Mr. Kent Beck (Author) "Extreme Programming Explained (2Q)"

Mr. Tom Gilb (Result Planning Limited) "Planguage: A Defined Language
for Clearer Requirements and Design (3Q)"

Mr. Greg Clower (SDT Corporation) "Establishing a Wireless
Telecommunication Test Automation System (4Q)"

Ms. Jeanette Folkes (Modem Media) "WebTesting 101 (6Q)"

Mr. James Bach (Satisfice Inc) "High Accountability Exploratory Testing"

Mr. Robert Sabourin (AmiBug.Com) "The Effective SQA Manager: Getting
Things Done (8Q)"

Dr. John Musa (Consultant) "How Will The Internet Affect Software
Quality Practice? (Panel Discussion) (9Q)"

                            TECHNOLOGY TRACK

Tobias Mayer (eValid, Inc.)  "InBrowser WebSite Testing: The Client-Side
Approach (2T1)"

Dr. Nancy Eickelmann & Mr. Allan Willey (Motorola Labs) "An Integrated
System Test Environment (2T2)"

Mrs. Manjula Madan (Philips Software Centre, Bangalore) "Our Experiences
In Defect Reduction Using Orthogonal Defect Classification Methodology

Dr. Mark Blackburn, Robert Busser, Aaron Nauman (Software Productivity
Consortium) "Model-based Approach To Security Test Automation (3T2)"

Prof. Warren Harrison (Portland State University) "A Universal Metrics
Repository (4T1)"

Mr. Suresh Nageswaran (Cognizant Technology Solutions CTS) "Test Effort
Estimation Using Use Case Points (UCP) (4T2)"

Dr. Rainer Stetter "Test Strategies for Embedded Systems (6T1)"

Mr. Keith Stobie (BEA Systems, Inc.)  "Automating Test Oracles and
Decomposability (6T2)"

Mr. Don Cohen (Princeton Softech) "Requirements For A Comprehensive
Testing Environment (7T1)"

Mr. James Lyndsay (Workroom Productions) "The Importance of Data In
Functional Testing (7T2)"

Mr. J.D. Brisk (Exodus Communications) "Peer-to-Peer Computing: The
Future Of Internet Performance Testing (8T1)"

Mr. Erik Simmons (Intel Corporation) "Quantifying Quality Requirements
using Planguage (8T2)"

Mr. Greg Berger (Lawson Software) "Creating A Tool-Independent Test
Environment (9T1)"

Mr. Daniel Blezek, Mr. Timothy Kelliher, Mr. William Lorensen, Mr. James
Miller (GE Corporate R&D) "The Frost Extreme Testing Framework (9T2)"

                           APPLICATIONS TRACK

Mr. James Tierney (Microsoft) "Getting Started With Model-Based Testing"

Mr. Klaus Olsen ( "Using The W Model To Institutionalize
Inspections, And Improve Knowledge Transfers (2A2)"

Mr. Ralph Dalebout (IBM Printing Systems Division) "Beta Testing With
Rapid Development (3A1)"

Mr. Roger Records (Boeing Commercial Airplane Group) "Assuring Quality
In Outsourced Software (3A2)"

Mr. Henk Keesom, Mr. John Musa (Ortho-Clinical Diagnostics) "Using
Internal Product Development Test Data To Calculate Software (4A1)"

Mr. Erik Simmons (Intel Corporation) "Product Triage: A "Medical"
Approach To Predicting And Monitoring Product (4A2)"

Mr. Steve Whitchurch (Mentor Graphics Corp.)  "Trials and Tribulations
Of Testing a Java/C++ Hybrid Application (6A1)"

Mr. Juris Borzovs, Anda Adamsone, Martins Gills, Sanda Linde, Janis
Plume (Riga Information Technology Inst.)  "Software Testing in Latvia:
Lessons Learned (6A2)"

Dr. Holger Schlingloff, Jan Bredereke (Technologie-Zentrum Informatik)
"Specification Based Testing Of the UMTS Protocol Stack (7A1)"

Mr. Michael Jones, Assistant Professor (TransactPlus/WIU) "High
Availability Testing (7A2)"

Mr. Vince Budrovich (ParaSoft Corporation) "Increasing The Effectiveness
Of Load Testing: Unit-Level Load Testing (8A1)"

Mr. Scott Trappe (Reasoning, Inc.)  "Building Better Software Code:
Finding Bugs You Never Knew You Had (8A2)"

Dr. Harmen Sthamer, Joachim Wegener (DaimlerChrysler AG) "Evolutionary
Testing Of Embedded Systems (9A1)"

Mr. Hung Nguyen (LogiGear Technology) "The Design and Implementation of
a Flexible, Reusable and Maintainable Automation Framework (9A2)"

                             INTERNET TRACK

Mr. Adrian Cowderoy (ProfessionalSpirit Ltd) "Quality in a Dotcom
Startup -- Fact or Fiction? (2W1)"

Mr. Todd Hsueh (IBM) "Innovative Web Test Process & Control Tool (2W2)"

Ms. Nancy Landau (Alltel Technology Services) "Performance Testing
Applications In Internet Time (3W1)"

Mr. Steve Splaine (Splaine & Associates) "Modeling The Real World For
Load Testing Web Sites (3W2)"

Mr. Nikhil Nilakantan, Ibrahim K. El-Far (Florida Tech) "Why Is API
Testing Different (4W1)"

Mr. Phil Hollows (RadView Software) "Best Practices in Web Performance
Testing: "Not Optional Anymore!" (4W2)"

Mr. Mark Johnson (Cadence Design Systems) "How Are You Going To Test All
Those Configurations? (6W1)"

Mr. Rakesh Agarwal, Santanu Banerjee, Bhaskar Gosh (Infosys Technologies
Ltd) "Estimating Internet Based Projects: A Case Study (6W2)"

Mr. Bhushan Gupta, Steve Rhodes (Hewlett-Packard Co.)  "Adopting A
Lifecycle For Developing Web Based Applications (7W1)"

Mr. Eric Patel (SourceGate Systems, Inc.)  "Rapid SQA: Web Testing At
The Speed Of The Internet (7W2)"

Dr. James Helm (Univ. of Houston Clear Lake) "Web-Based Application
Quality Assurance Testing (8W1)"

Ms. Kim Davis, Mr. Robert Sabourin (My Virtual Model Inc.)  "Exploring,
Discovering and Exterminating Bug Clusters In Web Applications (8W2)"

Ms. Patricia Humphrey ( "Quality Assurance and the Internet
Site - How To Effectively Hit a Moving Target (9W1)"

Mr. Prashant Lambat, Annuradha Bhide (Neilsoft Ltd.)  "Double Byte
Compliance Testing (9W2)"

                            MANAGEMENT TRACK

Mr. Scott Jefferies (Technology Builders, Inc.)  "A Requirements-Based
Approach To Delivering E-Business And Enterprise Applications (2M1)"

Mr. Robert Benjamin, Ruth Pennoyer, Karen Law (Spherion Corporation)
"Pre-Defining Success: Incorporating e-Metrics Into Business And
Technical Requirements For Web And e-Business Solutions (2M2)"

Mr. Timothy Kelliher, Daniel Blezek, William Lorensen, James Miller (GE
Research & Development) "Six-Sigma Meets Extreme Programming: Changing
the Way We Work (3M1)"

Mr. Elli Georgiadou, Naomi Barbor (University of North London)
"Investigating The Applicability Of The Taguchi Method To Software
Development (3M2)"

Mr. David Fern (Micros Systems Inc.)  "How Testers Can And Should Drive
Development Cycles (4M1)"

Dr. Cem Kaner (Florida Institute of Technology) "Managing The Proportion
Of Testers To Developers (4M2)"

Mr. Geert Pinxten (I2B) "The Extended Product Quality Model (6M1)"

Ms. Johanna Rothman (The Rothman Consulting Group) "Using Requirements
To Create Release Criteria (6M2)"

Mr. Michael Ensminger (PAR3 Communications) "Walk & Stagger Through
Review Process (7M1)"

Prof. Warren Harrison, David Raffo, John Settle (Portland State
University) "Process Improvement As A Capital Investment (7M2)"

Ms. Sandy Sweeney (Compuware Corporation) "Risky Business -- Adding Risk
Assessment To The Test Planning Process (8M1)"

Mr. Kamesh Pemmaraju (Cigital, Inc.)  "Software Risk Management (8M2)"

Mr. Michael Hillelsohn (Software Performance Systems) "Organizational
Performance Engineering: Quality Assurance For The 21st Century (9M1)"

Mr. Brian Lawrence (Coyote Valley Software) "Choosing Potential
Improvements -- Comparing Approaches (9M2)"

                             PANEL SESSIONS

Mr. Brian Lawrence (Coyote Valley Software) "How DO You Test Internet
Software? (Panel Session) (4P)"

Mr. Nick Borelli (Microsoft) "Ask The Experts (Panel Session) (8P)"

        - - - - - - - - - - - - - - - - - - - - - - - - - - - -

                       POST CONFERENCE WORKSHOPS

Dr. Cem Kaner (Florida Institute of Technology) "Developing The Right
Test Documentation (W1)"

Mr. John Paul (Minjoh Technology Solutions) "Automating Software
Testing: A Life-Cycle Methodology (W2)"

Mr. Robert Sabourin (AmiBug.Com) "Bug Priority And Severity (W3)"

Ms. Johanna Rothman, Elizabeth Hendrickson (The Rothman Consulting
Group) "Grace Under Pressure: Handling Sticky Situations in Testing"


                               Word Play
                             by Ann Schadt

[The rules of this game are as follows:  take a word, add a letter and
then give the definition of the newly formed word.]

Overbatim: more than what was said.

Glibido: shallow yearnings.

Flashionable: describes a celebrity's glittery, physics-defying gown.

Gapocalypse: the dire result if your teenager doesn't get the outfit all
the other kids are wearing.

Listerati: people who know all the bestsellers without actually having
read them.

Computter: to idle away time on-line.

Funerall: mass burial.

Subwary: fearful of public transit.

Origasmi: pornographic paper-folding.

Spinprick: a mildly negative attack on a political opponent.

Skuldruggery: dealing in contaminated narcotics.

Pepidemic: way too many perky people on TV.

Puniversity: a really, really, really small college.

Billiterate: how we all feel at tax time.

Noctopus: a date who sprouts eight arms as soon as night falls.

Flinguine: a romance conducted in an Italian restaurant.

Servoices: the din of the waiters' idle chatter while you wait for
someone to take your order.

Foreslight: getting your witty insult in first.

Poligarchy: household ruled by a bossy parrot.

Loutcry: the clamour of a distressed soccer crowd.

Pregret: remorse about something you have not done yet.

Squibble: an argument between nitpickers.

                     (To Be Continued Next Issue!)


            Software Engineering Learning using Case Studies


           Professors Lawrence Bernstein and David Klappholz
              Stevens Institute of Technology, Hoboken, NJ

Software projects often fail because the staff lacks Software
Engineering education or, having had it, resisted it.  To compound the
problem, many do not accept Software Best Practices.  Our challenge is
to overcome the natural biases of software professionals and computer
science students.  Reading case histories of failed software tends to
convince some students merely of others' stupidity.  Other students
intellectually accept the existence of the problems, but just reading
about them does not convert many at the 'gut level.'  At the gut level
one sits up, takes notice, and does something different.


Our approach is to force students to live through specific case
histories, each one chosen to get across a small number of important
issues.  The method works.  Students internalize the software
engineering lessons and follow best practices to avoid the traps they
would otherwise fall into.

Here is how the approach works.

First, a set of software process issues is selected.  Here are the ones
we chose for our first live-thru case history:

   1. The need to have close customer/user relations

   2. The need for up-to-date documentation throughout the life of the
      project

   3. The need to identify risks and to develop contingency plans

   4. The need to account for human foibles

Second, a case history based on a project facing these challenges is
chosen.  Students are not given the entire case history up front;
rather, they are given the same problem/project as the actual developers
who executed the case history faced.  They are given no more information
about the problem/project than were those developers.  The project
information is simplified to ease understanding.


Computer Science is the study of the technology (State-of-the-Art)
involved in the development of computer software.  As it is usually
taught in a post-secondary setting, Computer Science deals with
"programming in the small," i.e., one-person or few-person software
projects.  Software Engineering, on the other hand, is the study of the
method or process (State-of-the-Practice) whereby production software
is developed -- "programming in the large."  State-of-the-Practice
includes both engineering practices and project management or group
dynamic processes.  Many post-secondary programs in Computer Science
offer a Software Engineering or Senior Project course as a capstone.

Because of the very different natures of technology on the one hand and
method/process on the other, and because computer science students are
typically technology-oriented and method/process-averse, the typical
Software Engineering course reaches far fewer future software developers
than suits the best interests of either the students themselves or the
software industry at large.  We have developed a novel instructional
method, the Live-Thru Case History method for addressing this problem,
have developed a first live-thru case history, and have used it
successfully in the first few weeks of a two-semester undergraduate
Software Engineering course.

The result was that students were shocked into an awareness of the
issues and how to deal with them in six weeks, with two class meetings
a week.  One class meeting was devoted to project meetings and the
other to lectures on Software Engineering topics, including other case
histories.

                      Conducting the Case History

Because there would be only one live-thru case history in our Senior
Project course, we had to choose one that would achieve the greatest
effect in the limited time available.  We chose the case history of a
brief development project that one of the authors worked on, in 1985, as
a public service project.  The problem/project was that of automating an
elementary school library's manual system for generating overdue-book
notices.

The class of forty students was divided randomly into four equal-size
development teams.  Students were given the same details as the
original software developers in the case history.  The instructor played
the role of the customer, the school librarian, and was available to
students, to respond to questions, both in class and by e-mail.
Students were told that the customer would evaluate their work, exactly
as work is evaluated in the real world.
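
The random division into equal-size teams described above can be
sketched in a few lines; the roster names and the four-team split below
are illustrative assumptions, not data from the actual course.

```python
import random

# Hypothetical roster; the course described had forty students.
students = [f"Student {i:02d}" for i in range(1, 41)]

random.shuffle(students)  # random assignment, as in the course

# Split the shuffled roster into four equal-size development teams.
team_size = len(students) // 4  # 10 students per team
teams = [students[i * team_size:(i + 1) * team_size] for i in range(4)]
```

Shuffling once and slicing guarantees equal team sizes while keeping the
assignment uniformly random.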


As is frequently the case in real software development projects, the
overdue book notice project had a hidden requirement that was so obvious
to the customer that she failed to mention it; it is that overdue
notices must be sorted, first by teacher name, then, for each teacher by
class, and, finally, within each class by the student's family name.
The original developers failed to elicit this hidden make-or-break
requirement, and thus failed to satisfy it; the customer rejected the
real software system when she first saw it.  Each of the student teams
fell into this same trap and they learned the lesson of the need to find
any 'hidden requirements.'
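
The hidden sorting requirement lends itself to a compact illustration.
The record fields and sample data below are invented for the sketch,
not taken from the 1985 system.

```python
# Invented overdue-notice records for illustration only.
notices = [
    {"teacher": "Ruiz",  "class": "3B", "student": "Young"},
    {"teacher": "Adams", "class": "2A", "student": "Baker"},
    {"teacher": "Ruiz",  "class": "3A", "student": "Adler"},
]

# The hidden requirement: sort first by teacher name, then by class,
# then within each class by the student's family name.  A tuple key
# expresses the three-level ordering directly.
notices.sort(key=lambda n: (n["teacher"], n["class"], n["student"]))

for n in notices:
    print(n["teacher"], n["class"], n["student"])
# -> Adams 2A Baker
#    Ruiz 3A Adler
#    Ruiz 3B Young
```

A team that never elicits the requirement would omit the `key` tuple and
produce notices in arbitrary order, which is exactly the failure the
case history reproduces.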

The need for high-quality documentation and for contingency planning
were motivated for students by the classroom equivalent of the real-
world phenomenon of loss of staff members due to illness, death,
relocation, etc.  At the midpoint of the project, the student judged by
the instructor to be each team's strongest developer and another,
randomly chosen, team member were removed from the team and re-assigned
to a different team.  To evaluate the effect of the staff change on
students' approach to software engineering, after the case study project
students were asked to describe what they would have
done differently had they understood that the real-world conditions
under which they would be operating included the possibility of staff
changes.  About three quarters of the students mentioned the importance
of up-to-date documentation, and about twenty percent had developed
insight into appropriate utilization of staff, including the use of
"understudies" and of preparation for the incorporation of new team
members, and thus demonstrated that they had learned the value of these
practices.

Evaluation of how well the students internalized the need for solid
requirements engineering was done at the end of the live-thru case
history.
A written exam was based on another case history.  This case history
included a more difficult requirements engineering problem than that of
the overdue book notice project.  About three quarters of the students
showed that they had mastered the notion of hidden requirements, and
about one third showed that they had achieved reasonable competence in
requirements engineering; about ten percent showed extremely keen
insight into the problem.


The innovative process of live-through case histories is more effective
than the traditional teaching of the Software Engineering course.  In
the past students were given lectures, homework and exams based on a
well-respected Software Engineering text.  Then they were asked to
develop a project.  When they approached the project, they could not
readily apply the techniques they had learned.  Once they understood the
need for the processes, they re-learned them as they tried to apply
them.


The authors are asking those teaching software engineering to use these
case histories in their courses and report on the results.  Materials
are available at our web site, along with a complete paper
describing the live-thru approach in detail.  Please participate in
gathering data to support or refute the claims in this paper.  It is our
intent to use the experience of instructors in several venues to make
anecdotal conclusions more meaningful and perhaps statistically
significant.  We invite those who agree with us to join a consortium for
the purpose of creating additional case histories and helping to refine
the process.

Editors Note: The authors may be reached at  and


                    eValid Updates and Improvements

Here is a quick summary of the latest changes and additions in eValid.


  > Newest Build: There's a new build available for eValid (Ver. 2.1
    dated 15Feb01).  Be sure to get it if you haven't already.

  > AutoDemo Playback Scripts.  This new feature permits eValid playback
    scripts to be loaded and executed direct from the eValid browser.
    We've set up 15 example playback scripts that illustrate key eValid
    features.  Go to the eValid homepage and click on the link, or go
    direct to:

    Simply click on the script you want to play back and you will see
    eValid take you through three illustrative playback sequences, or
    through any one of our 12 Tutorial sequences.

  > Internal Performance Improvements:  The latest eValid build includes
    a number of internal changes that enhance reliability and minimize
    performance degradation problems when you are using the LoadTest
    options.  We have heard that some users experience significant
    degradation after a few hundred playback repetitions when running
    LoadTests more than 10 browsers wide.

  > Special Microsoft Patches for Performance Problems:  There are some
    HotFix modifications that deal with special problems that some
    versions of eValid may have when executing on some versions of
    Windows 2000.  (These are documented in the User Manual on the
    LoadTest Hints page.)

    HotFix information is available direct from Microsoft.  In certain
    cases these HotFixes may already be included in one or more Service
    Packs, so their use may not be necessary depending on your Service
    Pack level.

      Q271976   This HotFix deals with a problem in which there is
                performance degradation when the Heap is fragmented.

      Q275455   This HotFix deals with a problem in which there is a
                memory leak when calling between COM components.

    You can get these fixes direct from Microsoft.  We're told that both
    of them will be appearing in an upcoming service pack for Windows
    2000.


  > New Default Settings:  We have changed some of the default settings
    in the Preferences to make it easier for first-time eValid users to
    get a feel for the product.  Only initial values have been changed;
    no settings have been removed.

  > Tutorials Available:  There are 12 different step-by-step Tutorials
    on the eValid WebSite that you can use to come up to speed or to
    brush up your eValid technique.  You can see them in PowerPoint or
    have them played back automatically by eValid.

    Click "Help > Documentation > Tutorials" and then select the topic
    of interest.

  > Synchronize on Text String Feature:  The "eValid > Validate &
    Synchronize > Text String" command has been added to the set of validation
    functions.  On playback eValid waits until the selected text string
    is found on the page you indicated during recording.

  > PlayValue Options:  More PlayValue options (13 in total) are now
    available to facilitate playback scripting in batch mode.  These are
    described in the User Manual Script Definition section.

  > Extrinsic Commands:  A number of new edit-only commands have been
    added that make it possible to invoke modal-popups, open sub-
    windows, etc.  These are described in the User Manual Script
    Definition section.


  > Refreshing Your eValid Key:  Your existing eValid license key should
    work fine with the new build.  If your key has expired you'll need
    to get an update from:

  > Download Instructions:  The latest eValid Ver. 2.1 build -- dated
    15Feb01 -- is now available for download.  Everyone using eValid
    _should_ download this version because it includes all the latest
    functionality and incorporates many recent performance improvements.
    You don't have to use the download signups if you don't want to --
    you can use this ftp address:

  > Your Comments and Suggestions:  Please let us know if you think of
    something that eValid ought to have that it doesn't or if you
    encounter a problem for which there's no work-around.  Send all
    comments to .
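
The "Synchronize on Text String" behavior described above -- waiting
during playback until a recorded text string appears on the page -- can
be approximated generically.  The function below is a hedged sketch in
plain Python, not eValid's implementation; `get_page_text` is an
assumed caller-supplied hook that returns the page's current text.

```python
import time

def wait_for_text(get_page_text, target, timeout=30.0, poll=0.5):
    """Poll until `target` appears in the current page text.

    `get_page_text` is an assumed caller-supplied function returning
    the page's visible text; this generic sketch is not eValid's API.
    Returns True if the text appeared within `timeout` seconds,
    False otherwise.
    """
    deadline = time.monotonic() + timeout
    while True:
        if target in get_page_text():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(poll)  # avoid busy-waiting between checks
```

Checking the page before testing the deadline ensures at least one
lookup even with a zero timeout; `time.monotonic()` is used so the wait
is immune to wall-clock changes.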

We believe that eValid's InBrowser(tm) implementation represents very
significant value in terms of ease of use, capabilities, and flexibility
vs. cost.  We hope you are as excited about this new technology and its
applications as we are!


    ICSE 2001: 23rd International Conference on Software Engineering

                         Call for Participation

We are pleased to announce that ICSE 2001 ADVANCE PROGRAM is now on-line
at the conference web site:

ICSE 2001, the premier conference for software engineering, will feature
the latest inventions, achievements, and experiences in software
engineering research, practice, and education, and will give
researchers, practitioners, and educators the opportunity to present,
discuss, and learn.

The ICSE 2001 Software Engineering Week, May 11-20, 2001 consists of the
main ICSE conference and over 50 tutorials, workshops, collocated
conferences, and symposia. The conference venue is the Westin Harbour
Castle overlooking Lake Ontario in downtown Toronto, with restaurants,
theaters, shopping and plenty of other activities.

The main ICSE 2001 program includes 47 technical papers, eight case-
study reports, six education papers, an invited industry track, nine
formal research demonstrations, and four panels. The program also
contains six plenary sessions with outstanding invited keynote speakers.
The main ICSE 2001 program also contains two new features: Challenges
and Achievements in Software Engineering (CHASE), in which each session
offers both research and industrial views of the same topic; and
Frontiers of Software Practice (FoSP), which provides mini-tutorials on
new and promising software technologies. Throughout the conference,
there are also exhibits, posters, and informal research demonstrations.
Finally, the conference features three casual receptions with great food
and entertainment to give all an opportunity to meet and mingle with old
and new friends.

Prior to the main ICSE 2001 program, there are 22 tutorials (full day
and half day) on a variety of topics and 18 workshops that offer an
informal forum for interaction. There are also three special symposia:
the David L. Parnas Symposium, the New Software Engineering Faculty
Symposium, and the Doctoral Symposium. Finally, both prior to and
immediately following the main ICSE 2001 program, there are four
collocated conferences: International Workshop on Program Comprehension
(IWPC 2001); Engineering for Human-Computer Interaction (EHCI 2001);
Symposium on Software Reusability (SSR 2001); and Spin Workshop on Model
Checking of Software (SPIN 2001).

We look forward to seeing you at ICSE 2001!

                        Hausi Muller
                        General Chair

                  Mary Jean Harrold & Wilhelm Schafer
                           Program Co-Chairs


             A Current Senate Proposal (Heard Going Around)

Senators William B. Spong of Virginia and Hiram Fong of Hawaii sponsored
a bill recommending the mass ringing of church bells to welcome the
arrival in Hong Kong of the U.S. Table Tennis Team after its tour of
Communist China. The bill failed to pass, cheating the Senate out of
passing the...

          "Spong-Fong Hong Kong Ping Pong Ding Dong Bell Bill"

      ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

QTN is E-mailed around the middle of each month to over 9000 subscribers
worldwide.  To have your event listed in an upcoming issue E-mail a
complete description and full details of your Call for Papers or Call
for Participation to .

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should provide at
  least a 1-month lead time from the QTN issue date.  For example,
  submission deadlines for "Calls for Papers" in the March issue of QTN
  On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the opinions
of their authors or submitters; QTN disclaims any responsibility for
their content.

STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR logo are
trademarks or registered trademarks of Software Research, Inc. All other
systems are either trademarks or registered trademarks of their
respective companies.

          -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to CHANGE an
address (an UNSUBSCRIBE and a SUBSCRIBE combined) please use the
convenient Subscribe/Unsubscribe facility at:


As a backup you may send Email direct to  as follows:

   TO SUBSCRIBE: Include this phrase in the body of your message:

   TO UNSUBSCRIBE: Include this phrase in the body of your message:

Please, when using either method to subscribe or unsubscribe, type the
 exactly and completely.  Requests to unsubscribe that do
not match an email address on the subscriber list are ignored.

	       Software Research, Inc.
	       1663 Mission Street, Suite 400
	       San Francisco, CA  94103  USA
	       Phone:     +1 (415) 861-2800
	       Toll Free: +1 (800) 942-SOFT (USA Only)
	       Fax:       +1 (415) 861-9801
	       Web:       <>