sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +===================================================+
         +=======    Quality Techniques Newsletter    =======+
         +=======              April 2002             =======+
         +===================================================+

QUALITY TECHNIQUES NEWSLETTER (QTN) is E-mailed monthly to
subscribers worldwide to support the Software Research, Inc. (SR),
TestWorks, QualityLabs, and eValid user communities and other
interested parties, and to provide information of general use to the
worldwide internet and software quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN provided that the
entire document/file is kept intact and this complete copyright
notice appears with it in all copies.  Information on how to
subscribe or unsubscribe is at the end of this issue.  (c) Copyright
2002 by Software Research, Inc.

========================================================================

                       Contents of This Issue

   o  QW2002 Call For Papers/Presentations, Conference Details

   o  Modeling The Web Report Available

   o  eValid -- A Comprehensive WebSite Test Environment

   o  An Elusive Diagnosis, by Danny Faught

   o  New Think Tank to Address National Software Issues

   o  Difficult Questions in a Difficult Time, by Edward Miller

   o  7th International Conference on Reliable Software (Ada-Europe 2002)

   o  QTN Article Submittal, Subscription Information

========================================================================

                QW2002 Call for Papers/Presentations
                    <http://www.qualityweek.com>

Don't miss the opportunity to share your knowledge and expertise!

You're invited to participate in the premier software and internet
quality conference -- Quality Week 2002, San Francisco, 3-6
September 2002.

The deadline for paper/presentation submissions is quickly
approaching!  Please use the submission form:

  <http://www.qualityweek.com/QW2002/speaker-data.phtml>

and send your extended abstract or session proposal by email to
<qw@qualityweek.com>.

--------------------------------------------------------------------

                           QW2002 Details
                           ^^^^^^^^^^^^^^

QW2002 is the 20th in the continuing series of International
Internet & Software Quality Week Conferences that focus on advances
in software test technology, reliability assessment, software
quality processes, quality control, risk management, software safety
and reliability, and test automation as it applies to client-server
applications and WebSites.

ABOUT QW2002's THEME: The Wired World...

Change is very rapid in the new wired world, and the wave of change
brought about by the Internet affects how we approach our work, and
how we think about quality of software and its main applications in
IT and E-commerce. QW2002 aims to tackle internet and related issues
head on, with special presentations dealing with changes in the
software quality and internet areas.

QW2002 OFFERS...

The QW2002 program consists of four days of mini-Tutorials, panels,
technical papers and workshops that focus on software and internet
test technologies. QW2002 provides the Software Testing and Web
Quality community with:

  > Real-World Experience from Leading Industry and Government
    Practitioners.
  > Quality Assurance and Test involvement in the development
    process.
  > Lessons Learned & Success Stories.
  > Latest Tools and Trends.
  > State-of-the-art information on software quality and Web
    methods.
  > Vendor Technical Presentations and Demonstrations
  > Carefully chosen 1/2-day and full-day tutorials from well-known
    technical experts.
  > Three-Day Conference, including Five Tracks: Technology,
    Web/Internet, Applications, Process/Management, Quick-Start.
  > Two-Day Vendor Show/Exhibition
  > Analysis of method and process effectiveness through case
    studies.
  > Over 80 Presentations
  > Meetings of Special Interest Groups and ad hoc Birds-Of-A-
    Feather Sessions.
  > Exchange of critical information among technologists, managers,
    and consultants.

QW2002 is soliciting 45- and 90-minute technical presentations,
tutorial proposals, quick-start proposals, and panel discussion
proposals on all areas of internet and software quality, including
these topics:

        WebSite Monitoring
        E-Commerce Reliability/Assurance
        Application of Formal Methods
        Software Reliability Studies
        Client/Server Testing
        CMM/PMM Process Assessment
        Cost / Schedule Estimation
        Test Data Generation and Techniques
        Automated Inspection Methods
        Test Documentation Standards
        GUI Test Technology
        Integrated Test Environments
        Quality of Service (QoS) Matters
        WebSite Load Generation and Analysis
        Object Oriented Testing
        Test Management
        Process Improvement
        GUI Test Management
        Productivity and Quality Issues
        Real-Time Software
        New and Novel Test Methods
        Test Automation Technology and Experience
        WebSite Testing
        Real-World Experience
        Defect Tracking / Monitoring
        Risk Management
        Test Planning Methods
        Test Policies and Standards
        WebSite Quality Issues
        Test Outsourcing

IMPORTANT DATES:

        Abstracts and Proposals Due 30 April 2002
        Notification of Participation  15 June 2002
        Presentation Materials Due  15 July 2002

SUBMISSION INFORMATION...

Here are the steps for submitting material for QW2002:

 1. Prepare your QW2002 Abstract as an ASCII file, a Microsoft Word
    document, or in PostScript or PDF format.  Abstracts should be
    1-2 pages long, with enough detail to give members of QW2002's
    International Advisory Board an understanding of the final
    paper/presentation, including a rough outline of its contents.
    Send it by Email (as a MIME attachment) to:
    <qw@qualityweek.com>.

    Please include in your Email:
       a. A brief biographical sketch of each author.
       b. A photo of each author.
       c. The complete contact coordinates of the primary author.

 2. Fill out the Speaker Data Sheet on the QW2002 WebSite giving
    some essential facts about you and about your proposed
    presentation.  The URL for the form is:
    <http://www.qualityweek.com/QW2002/speaker-data.phtml>

 3. If you prefer, you may send material on a CD-ROM or floppy by
    postal mail to:

          Ms. Rita Bral
          Software Research Institute
          P.O.Box 420453
          San Francisco, CA 94141 USA

QW2002 AWARDS...

  > Best Paper Award: The winner receives wide recognition in the QA
    Community and receives a $1,000 grant.

  > Best Presentation Award: The winner is invited to present the
    winning talk at Quality Week Europe 2003 (QWE2003) set for March
    2003 in Brussels, BELGIUM, European Union.

For complete information on the QW2002 Conference, e-mail your
request to <qw@qualityweek.com>.

Prospective product/service exhibitors should contact the QW2002
team early because Expo space is strictly limited.


========================================================================

                 Modeling The Web Report Available
        Forwarded by Steve Lawrence <lawrence@necmail.com>.

NEC has released a new study of the web, including a new model that
supports predicting and analyzing competition within communities
(e.g., different e-commerce categories, scientists, newspapers,
etc.).  The lead author is Dr. David Pennock.

If interested, you can find the news release at
<http://www.modelingtheweb.com/release.html>.

Sample analysis of e-commerce categories and more details can be
found at: <http://www.modelingtheweb.com/>


========================================================================

         eValid -- A Comprehensive WebSite Test Environment

One of the main criteria used in building eValid was to have a
system focused specifically on all aspects of WebSite testing --
entirely from the client's point of view.  eValid is able to do
everything needed for client-side analysis, testing, validation &
verification, timing/tuning and loading of a WebSite.

The eValid WebSite Quality Assurance and Testing Solution available
today for Windows NT/2000/XP includes:

  > General:  Built into a fully functioning IE-equivalent browser,
    the eValid solution provides a unique client-side view of
    WebSite quality and performance.

    * All functions available from pull down menus -- a true point
      and click solution.
    * WebSite quality analysis measurements done entirely from the
      user's viewpoint.
    * Accurate, no-overhead (< 1 %) measurements.
    * Metrics popup details data about any page browsed.
    * Batch commands, interactive mode, multiple playback options.
    * Simplified feature-group licensing.
    * Always Up-To-Date documentation online.

  > Site Analysis: Complete site analysis via a search spider built
    into the browser.

    * Programmable search process: time, length, depth of search.
    * Adjustable protocols, acceptance and rejection lists.
    * Broken/Unavailable link discovery.
    * Filter analysis performed on every page visited:  download
      time, page size, page age, metric properties, content (string
      match and regular expressions match).
    * 3D-SiteMap with page performance, size, dependence annotation.

  > Functional Testing/Validation:  Complete record/play functional
    & regression testing support with advanced object-oriented
    validation modes.

    * Validation modes for all features of pages.
    * Handles all protocols, JavaScript, applets, XML, HTTPS, etc.
    * Simple, editable script language.
    * Simple, database-ready response logfiles.
    * Adaptive playback to enhance life of recorded scripts.
    * Automatic script creation and generation capability (test data
      generation) with eV.Generate.
    * Test suite management facilities in eV.Manager.

  > Timing/Tuning:  Measurement and analysis of server performance
    within the browser.
    * Page timing and component tuning to 1.0 msec resolution.
    * Built-in charting applet to visualize results.
    * User cache control.

  > Loading:  LoadTest operation featuring automatically launched
    multiple independent browsers for totally realistic loads.
    * 100% real browser operation (no simulation).
    * Full scripting and scenario control supports realistic mixes
      of users.
    * Built-in charting applet to visualize results.
    * eVlite option for running 1000s of navigation-only playbacks.

  > Monitoring:  Complete facilities for monitoring and measuring
    WebSites constantly and automatically using recorded or
    engineered scripts.

eValid is one WebSite QA/Testing tool suite, with one easy-to-use
interface, one focus, one supplier, and with complete feature
interoperability.  Check out eValid at <http://www.e-valid.com>.

========================================================================

               An Elusive Diagnosis, by Danny Faught

Software quality folks seem to enjoy reading about bug hunts, so
here is the tale of one particularly interesting bug that I hunted
down lately.  I'll go into some fairly low-level details, so my
non-technical readers may want to skip down to the lessons listed at
the bottom. I can't reveal which system I was working on, so I've
changed some of the details.

One of my clients uses a script called "webupdate" to download new
versions of the operating system software provided by an outsourced
development team. The script copies the software from the outsourced
vendor's web site and compiles it. One time when going through the
process, something didn't seem quite right to me. It ran about as
long as it usually does, but it seemed like the volume of output was
quite a bit shorter than usual. I hadn't saved the output from a
successful run before, so I couldn't know for sure. But I was pretty
sure that this error near the beginning of the output hadn't been
there before:

     % webupdate
     checking web site for new packages...
     : web file not found
     ...

This was followed by the voluminous output that is the result of a
successful build, and then a final message indicating that all was
well.  But none of the changes that were supposed to be in this
release of the system were there. This was a showstopper, despite
the misleading indication of success. I decided to investigate, to
see whether this was a problem with our environment or whether I
needed to report a bug to the vendor. Note that we ran on the
previous version of the system to get the new one, and we hadn't had
this kind of trouble with any previous version before.

Luckily, webupdate is a script rather than a compiled program, so
I'm able to easily examine the implementation. By looking at the
output right before and right after the mysterious ": web file not
found" error, I determine that the error most likely came from a
"getdir /" command. Okay, great.  What does getdir do? I can't find
any documentation for such a command, and I can't find a program by
that name. Oh! Half a screen up in the webupdate script is a
function named "getdir".

Okay, I'm getting closer to the source of the problem, but I don't
know how close yet. There is nothing in the function directly that
prints out the text "web file not found." But there are a few calls
to external programs.  The second one is preceded by an "echo
downloading $name..." message, which I didn't get in the output, so
I explore the first, which is a call to a program called "webls".

I find the webls program, which is also a script. Aha! There's the
telltale code which produces the error - "echo $sub: web file not
found". Hmmm, the $sub variable must be empty, since the error we
got starts with simply a lone colon. Maybe that's the problem. I
trace this variable through the program and find a pair of regular
expressions that munge the parameter that is passed into the script.
The parameter in this case is "/", indicating the root of the web
server, and the regular expressions erase this character. Looking at
the logic of the script, that seems to be okay.  That was a dead
end. So I turn my attention to a call to another external program:
"webget www.vendor.com/download/$sub 2>/dev/null".
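The empty-$sub behavior is easy to reproduce.  The sketch below is
hypothetical (the article doesn't show webls's actual code, so these
sed expressions are stand-ins for the pair of regular expressions
described above):

```shell
#!/bin/sh
# Hypothetical stand-in for the two regular expressions in webls that
# munge the parameter passed into the script.
sub="/"                                      # root of the web server
sub=$(printf '%s' "$sub" | sed 's|^/||')     # strip a leading slash
sub=$(printf '%s' "$sub" | sed 's|/$||')     # strip a trailing slash

# For the root path, both expressions together leave $sub empty, so
# the error message begins with a lone colon:
echo "$sub: web file not found"
```

For the root path "/" this prints exactly the mysterious ": web file
not found" seen in the output.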

It turns out that this one is a compiled program. I do have access
to the source code, but I really don't have any hints on where to
start looking. I notice that errors from this program are hidden
because of the "2>/dev/null" at the end of the line. So maybe
there's some valuable information that's getting thrown out. To
explore this, I run the webget command directly. Sure enough, I get
an error that I wasn't seeing before:

     can't reach www.vendor.com: The requested address is not valid.

That's odd - seems to be truncated. Maybe it's trying to say
"connection failed"? I look at the webget source code that prints
this message and it seems to be okay - looks like an operating
system bug is causing the error to be cut short. Dang, why can't we
hit just one bug at a time?
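The way "2>/dev/null" swallows diagnostics can be demonstrated with
any failing command; here ls on a nonexistent file stands in for the
webget call:

```shell
#!/bin/sh
# With the redirection, the diagnostic is discarded -- only the exit
# status hints that anything went wrong.
hidden=$(ls /no/such/file 2>/dev/null)
hidden_status=$?

# Without it (2>&1 merges stderr into the captured output), the real
# error message appears.
visible=$(ls /no/such/file 2>&1)
visible_status=$?

echo "hidden output:  '$hidden' (status $hidden_status)"
echo "visible output: '$visible' (status $visible_status)"
```

Dropping the redirection, or sending stderr to a log file instead of
/dev/null, preserves exactly the kind of clue that cracked this bug.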

Well, now I'm stuck. I really don't know what's causing this error.
I run a web browser and try to get to the web site. It works just
fine. I go back to the webupdate script and scan the code, looking
for inspiration. Way down at the bottom, I find some help:

     } 2>&1 | tee /web/log/`cat /dev/time`

Matching the opening curly brace, I find that most of the script is
enclosed in the braces, and all of the output from that code is
copied to a file. I look under /web/log, and I find a stack of
files, each with a long string of numbers for its name. Great! I do
have the output from previous runs. I look at the file modification
times to identify the log from the failed run, and the last
successful run right before it. Sure enough, the failed run produced
several kilobytes less output. That confirms my previous assumption,
but I'm still no closer to knowing what's going on.
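That logging idiom is worth stealing for any long-running script.
Here is a minimal sketch of the same pattern, using a date-based
filename (the "cat /dev/time" in the original appears to be specific
to that system):

```shell
#!/bin/sh
# Copy everything a script prints -- stdout and stderr alike -- to a
# uniquely named log file, the way webupdate does with tee.
logdir=./log                         # webupdate used /web/log
mkdir -p "$logdir"
logfile="$logdir/$(date +%Y%m%d%H%M%S).log"

{
    echo "checking web site for new packages..."
    echo "some diagnostic on stderr" >&2
    echo "done."
} 2>&1 | tee "$logfile"

# Later runs can be compared against this log, e.g. by size:
wc -c < "$logfile"
```

Because stderr is merged in before the pipe, error messages survive
in the log even when they scroll past unnoticed on the terminal.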

Also in my perusal of the webupdate script, I saw code that sets up
a web proxy. The code appears after the getdir call, so it doesn't
seem to be related. But I realize that the webget really does need
to know about a proxy server in order to get outside the firewall. I
check that the proxy file is set up properly, as it would be after
any full run of the webupdate script. Ah, so the proxy may not work
the first time webupdate is run on the system, but every run after
that should be okay. I verify that the webget call still fails the
same way.

Now is the point where inspiration strikes. Working with networks,
I've seen that a common problem is for DNS to stop working - DNS is
what converts symbolic Internet addresses like proxy.client.com to
numeric addresses like 12.34.56.78. I've also seen cases where
different applications used different DNS mechanisms, so one would
work where others would fail. By using the less attractive numeric
addresses, I can still do useful work if the DNS server is the only
thing that's broken. So I look up the numeric address of the proxy
server and edit the proxy setting from proxy.client.com:http to
something like 12.34.56.78:80 instead. I also convert the "http" to
the numeric port number "80" just in case. I run webget and shazam,
it works! I modify the webupdate script to use the numeric address
and port number, and the webupdate runs just fine - I check that the
log file is about the same size as the last successful run. (Well,
actually I had to track down two other unrelated problems, but I'll
spare you that story for now. Bugs often seem to appear in
clusters...)
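The substitution itself can be sketched as follows.  Everything here
is hypothetical -- the article doesn't show the proxy file format --
and localhost stands in for proxy.client.com so the example runs
anywhere:

```shell
#!/bin/sh
# Hypothetical symbolic proxy setting, as described in the article:
proxy="localhost:http"           # stand-in for proxy.client.com:http

# Resolve the host part to a numeric address.  getent is common on
# Linux; other systems may need nslookup or host instead.
hostpart=${proxy%%:*}
addr=$(getent hosts "$hostpart" | awk '{print $1; exit}')

# Replace the symbolic service name "http" with the numeric port 80
# as well, since in this case both parts had to be numeric before
# webget would work.
numeric="$addr:80"
echo "numeric proxy setting: $numeric"
```

Doing the lookup once, by hand, sidesteps whatever DNS mechanism the
failing application was using.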

To further isolate what's happening, I alternately set the proxy
address and the port back to their symbolic versions. I verify that
both must be numeric for the webget to work. I decide that I have
enough information to report a bug to the vendor, which I do.

It takes a while to get an answer (I couldn't justify setting the
bug at a high priority because I had a fairly easy workaround). The
response is that I need to be running the DNS server, and it would
have been run automatically if I had logged in to a particular
account on the system. I didn't know enough about the system to know
that the DNS server wasn't started every time the system boots. So
on the surface, it was a configuration problem after all, but there
is also some room for improvement in the system.

So what did I learn?

  * We need to document any requirements placed on the build
    environment, such as logging in to a particular account.

  * To make the build environment more robust, important steps like
    running the DNS server should be automated from the webupdate
    script, not in the setup files for a user account.

  * Throwing away program output, as is often done to clean up
    useless clutter, can also hide important error messages. Also,
    scripts should check for errors from the external programs they
    call so they don't erroneously report success. Use more than
    one kind of check to verify that your software is working.

  * I was lucky to have access to the scripts and source code. The
    problem would have been far more difficult to diagnose if I had
    simply reported the "web file not found" error to the vendor.
    Plus, I wouldn't have been able to find a quick workaround.

  * Bootstrapping can be confusing. That worrisome code that seems
    to set up the proxy server too late actually sets it up for the
    next run, and it doesn't affect the current run, since we run on
    the previous version of the system to bootstrap the next
    version. This means that the very first install of the system
    had to be done manually.

  * Keep copious logs. I was lucky that the webupdate programmer had
    the foresight to set this up.

  * You're often fighting against more than one problem. I actually
    found four separate problems that all cropped up at the same
    time, not counting robustness issues (were you keeping count?).
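The error-checking lesson above deserves a concrete sketch.  This is
a generic pattern, not webupdate's actual code; "fetch" is a
placeholder that simulates a failing download:

```shell
#!/bin/sh
# Don't fall through to a success message when a step has failed.
set -u

fetch() {
    # Placeholder for something like "webget ..."; here it simply
    # simulates the download failing.
    false
}

update() {
    if ! fetch; then
        echo "error: download step failed; aborting update" >&2
        return 1
    fi
    echo "update completed successfully"
}

# Check the exit status rather than trusting the last line printed.
if update; then
    status=ok
else
    status=failed
fi
echo "update status: $status"
```

With this structure, a misleading final "all was well" message like
the one in the story simply cannot appear after a failed step.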

There's a big overlap between bug reporting and debugging. How do
you draw the line between the two? If I had been analyzing a test
failure rather than a build process failure, should I have just
filed a bug report based on the first symptom I saw? What if several
other tests were blocked because of the problem? Food for thought.

(c) Copyright 2002 by Danny R. Faught, proprietor of Tejas Software
Consulting, <http://tejasconsulting.com>.

========================================================================

         New Think Tank to Address National Software Issues
           <http://www.cnsoftware.org/news/launch.html>


Welcome to the Center for National Software Studies (CNSS) website!
The CNSS has been established as a 501 (c) 3 not-for-profit
organization whose mission is to elevate software to the national
agenda, and to provide objective expertise, studies and
recommendations on national software issues. Thanks to a generous
grant of seed money from Sybase, Inc., the CNSS has now entered its
"Startup" phase, the first of four planned phases of growth to bring
the CNSS to full operational status. I invite you to review our
Prospectus <http://www.cnsoftware.org/prospectus.pdf> and Strategic
Plan (available for downloading) to better understand our goals and
objectives as well as our detailed plans for this exciting new
initiative. During this Startup phase, the CNSS will operate as a
virtual organization ("eCenter"), using the facilities of this
website and soliciting the help of volunteer participants. Our
vision for the website is that it will become the "go to" site for
all who are interested in the software issues of national importance
that the CNSS intends to address. An initial set of lead issues has
already been developed as reflected in the current pages.

This website is currently a "work in progress"; we need your help
during this early stage of development to achieve our vision. For
each issue, we want to provide a taxonomy to clearly define and
structure the issue; a bibliography of the key papers and documents
that address the issue and focus on relevant aspects; and a calendar
of future events that are relevant to the issue. Bringing all of
this information together in one location will provide a valuable
service to all those interested in these issues, as well as go a
long way toward achieving our vision for the site.

The CNSS concept of operations includes provisions for open dialogue
on the issues being addressed. As specific studies are identified,
study directors will assemble a balanced team of study participants
to conduct the study, and use the dialogue facilities of this site
to allow open review and comment before the study results are
finalized. You are invited to use these dialogue facilities to share
your thoughts and ideas on any of the issues currently posted.

To accomplish our objectives for the CNSS, we must seek appropriate
grants and tax deductible contributions to provide the needed
resources. In addition, we will engage in carefully selected
contracted studies that will not compromise our commitment to
objectivity and open publication of all results. Your assistance in
these areas will be welcomed.

Software is a pervasive and vital national resource that underlies
our economy as well as virtually all of the nation's critical
infrastructures. It is imperative that appropriate and enlightened
policies be established at the national level addressing all of the
key issues. I ask for your support in building a CNSS that can
contribute to that goal.

Potential participants are invited to submit their resumes to the
CNSS Executive Director, John Marciniak
(marciniak@cnsoftware.org).  Your views, comments and
suggestions may also be addressed to me at
<salisbury@cnsoftware.org>.

Alan B. Salisbury, Ph.D.
President

========================================================================

              Difficult Questions in a Difficult Time
                          by Edward Miller

Last Fall, I asked QWE2002 speakers and QWE2002 Advisory Board
Members to suggest what they believed were the main concerns for the
times regarding software quality.  In reviewing the responses there
seemed to be some general themes hiding just beneath the surface ...
but the concerns were very specific.  There was a summary of them in
the December 2001 issue of QTN (see
<http://www.soft.com/News/QTN-Online/qtndec02.html>).

As good as those responses were -- and they were "right on" in many
cases -- it seems to me in the present business and technological
climate there are some even deeper questions that present some
unique challenges.  So, below are some really hard questions that, I
believe, need to be asked within the software quality community --
and might be the basis for some good discussions.

Not to even think about these things is to avoid reality, and that
can't be a good thing to do.

To think about them may bring better focus onto the real issues
facing the community.

> Quality Technology: Does there exist a sufficient technological
  base, plus good enough tools, to really deal with software quality
  issues?  After decades of tools and systems and approaches and
  methodologies, software systems continue to fail.  What is the
  software quality community doing wrong?

> Issues on The Web:  Does quality really make a difference on the
  Web?  Can users get along just fine without any checking?  If so,
  why are there so many slow and unreliable WebSites?  Is the web
  infrastructure really secure?  Has quality technology done
  anything good for the Web?

> Industry Awareness of Quality Issues:  We all know the reality:
  software quality may be great to talk about, but why is
  software quality nearly always the first casualty when budgets are
  tight?  Do contemporary software development projects really need
  a software quality component?  If so, what?

> What About XP?  Does the Extreme Programming approach obviate the
  need for testing, and monitoring, and software quality control?
  Do the decades of software engineering R&D really mean so little?
  Why is XP so appealing?  Is it because XP reflects reality?

> What about CMM and SPICE and ISO 9000?  Do these standards and
  process-oriented approaches really work?  Are they worth the cost
  and trouble?  Do they improve anything?  If they don't, why are
  they still around?  Isn't good, solid thinking and careful
  implementation what this is really all about?  If so, what's the
  fuss?

> Security and Integrity?  How insecure is the internet
  infrastructure, really?  Are things actually at a crisis point?
  Or is it hyped up by a few for commercial gain?

What do YOU think?

Please send your responses -- and, of course, any additional "tough
questions" -- to me at <miller@sr-corp.com>.  I'll try to work up some
kind of a response for the next issue.

========================================================================

                  7th International Conference on
          Reliable Software Technologies - Ada-Europe'2002

                 June 17-21, 2002, Vienna, Austria

           http://www.ada-europe.org/conference2002.html

                        Organized by TU Wien
                      Sponsored by Ada-Europe
                  In cooperation with ACM SIGAda,
        ARC Seibersdorf Research, and Universiteit Salzburg

Ada-Europe has organized annual international conferences since the
early 1980s.  This is the 7th event in the Reliable Software
Technologies series, previous ones being held in

- Montreux, Switzerland (1996)
- London, UK (1997)
- Uppsala, Sweden (1998)
- Santander, Spain (1999)
- Potsdam, Germany (2000)
- Leuven, Belgium (2001)

A 9-page program brochure is available on the conference web site.
Select "Program" to either view or download the PDF version or to
request a printed copy of the final brochure.

Program co-chairs

- Johann Blieberger, Technical University Vienna, Dept. of Computer-
  Aided Automation, Vienna, Austria, Blieberger@auto.tuwien.ac.at

- Alfred Strohmeier, Swiss Fed. Inst. of Technology Lausanne (EPFL),
  Software Engineering Lab, Switzerland, Alfred.Strohmeier@epfl.ch

Invited speakers

- Maarten Boasson, Quaerendo Invenietis bv & University of Amsterdam
  "Embedded Systems Unsuitable for Object Orientation"

- Mehdi Jazayeri, Technical University of Vienna
  "On Architectural Stability and Evolution"

- Rachid Guerraoui, Swiss Fed. Inst. of Technology Lausanne (EPFL)
  "Encapsulating Failure Detection: from Crash to Byzantine Failures"

- Alois Ferscha, University of Linz
  "Contextware: Bridging Physical and Virtual Worlds"

Tutorials

- Peter Amey & Rod Chapman: "SPARK, an Intensive Overview" (full day)
- Michael Gonzalez Harbour & Mario Aldea: "MaRTE OS: Bringing Embedded
  Systems and Real-Time POSIX Together" (full day)
- Matthew Heaney: "Principles Of Physical Software Design in Ada95"
  (half day)
- Matthew Heaney: "Implementing Design Patterns in Ada95" (half day)
- S. Ron Oliver: "CORBA 3 and CORBA for Embedded Systems" (full day)
- Joel Sherrill & Jiri Gaisler: "Using Open Source Hard- and Software to
  Build Reliable Systems" (half day)
- William Bail: "Cleanroom Software Engineering: An Overview" (half day)
- Currie Colket: "Exceptions - What You Always Wanted to Know About
  Exceptions, But Were Afraid To Ask" (half day)

Workshop

- "A Standard Container Library for Ada"
  Contact workshop co-chairs to participate:
  Ehud Lamm, ehudla@openu.ac.il & John English, je@brighton.ac.uk

Papers

- 28 papers on Embedded Systems, Case Studies, Real-Time Systems,
  High-Integrity Systems, Ada Language Issues, Program Analysis, Tools,
  Distributed Systems, Libraries and Bindings, and OO Technology

- Authors from 18 countries: Australia, Austria, Belgium, Canada, China,
  France, Germany, Greece, Israel, Japan, Malaysia, the Netherlands,
  Portugal, Russia, Spain, Switzerland, United Kingdom, USA

Contact:
  Dirk Craeynest            | Email Dirk.Craeynest@offis.be | Ada-Belgium
  Offis nv/sa - Aubay Group | Phone +32(2)725.40.25         | Ada-Europe
  Gatti de Gamondstraat 145 |       +32(2)729.97.36 (work)  | ACM SIGAda
  B-1180 Brussel, Belgium   | Fax   +32(2)725.40.12         | Team Ada

========================================================================
    ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------
========================================================================

QTN is E-mailed around the middle of each month to over 10,000
subscribers worldwide.  To have your event listed in an upcoming
issue E-mail a complete description and full details of your Call
for Papers or Call for Participation to <qtn@sr-corp.com>.

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should
  provide at least a 1-month lead time from the QTN issue date.  For
  example, submission deadlines for "Calls for Papers" in the March
  issue of QTN On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the
opinions of their authors or submitters; QTN disclaims any
responsibility for their content.

TRADEMARKS:  eValid, STW, TestWorks, CAPBAK, SMARTS, EXDIFF,
STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR logo are
trademarks or registered trademarks of Software Research, Inc. All
other systems are either trademarks or registered trademarks of
their respective companies.

========================================================================
        -------->>> QTN SUBSCRIPTION INFORMATION <<<--------
========================================================================

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to
CHANGE an address (an UNSUBSCRIBE and a SUBSCRIBE combined) please
use the convenient Subscribe/Unsubscribe facility at:

       <http://www.soft.com/News/QTN-Online/subscribe.html>.

As a backup you may send Email direct to <qtn@sr-corp.com> as follows:

   TO SUBSCRIBE: Include this phrase in the body of your message:
           subscribe <Email-address>

   TO UNSUBSCRIBE: Include this phrase in the body of your message:
           unsubscribe <Email-address>

Please, when using either method to subscribe or unsubscribe, type
the <Email-address> exactly and completely.  Requests to unsubscribe
that do not match an email address on the subscriber list are
ignored.

		QUALITY TECHNIQUES NEWSLETTER
		Software Research, Inc.
		1663 Mission Street, Suite 400
		San Francisco, CA  94103  USA

		Phone:     +1 (415) 861-2800
		Toll Free: +1 (800) 942-SOFT (USA Only)
		Fax:       +1 (415) 861-9801
		Email:     qtn@sr-corp.com
		Web:       <http://www.soft.com/News/QTN-Online>