sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======              June 2000              =======+

QUALITY TECHNIQUES NEWSLETTER (QTN) (Previously Testing Techniques
Newsletter) is E-mailed monthly to subscribers worldwide to support the
Software Research, Inc. (SR), TestWorks, QualityLabs, and eValid WebTest
Services user community and to provide information of general use to the
worldwide software and internet quality and testing community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN provided that the entire
document/file is kept intact and this complete copyright notice appears
with it in all copies.  (c) Copyright 2003 by Software Research, Inc.


   o  QW2000 Conference Summary

   o  New Newsletter for IT Managers by Patrick O'Beirne

   o  Call for Papers/Presentations: QWE2K

   o  TSEPM Subscriber Notes

   o  Quality Week 2000 - "Ask the Quality Experts!" Panel Summary (Part
      1 of 3)

   o  New "Honor" Virus Discovered; Commented Upon

   o  eValid: Changing the Way You Think About WebSite Testing

   o  Real World Strategies for Improving the Test Process

   o  QTN Article Submittal, Subscription Information


                       QW2000 Conference Summary

It was another successful year for the QW2000, held at the Hyatt Regency
in beautiful downtown San Francisco.  We would like to extend our
sincerest gratitude to everyone involved, attendees, speakers, Advisory
Board members, Exhibitors and Sponsors for once again helping make
QW2000 a huge success.  This year our attendance exceeded our
expectation, with over 1000 attendees!  QW truly continues its reputation
as "The Premier Event" of the software quality community.

  * Conference Pictures

    Visit our web site to check out pictures of the Conference, Expo and
    various Conference Receptions.  To see for yourself, go to the QW2000
    Photo Album:

  * Special Events

    Attendees were entertained by an evening at the new Pac Bell Park,
    watching the SF Giants beat the Philadelphia Phillies, and an evening
    at the San Francisco Museum of Modern Art, viewing surrealist
    paintings by the highly acclaimed Belgian artist

  * Demographics

    About 1025 people attended QW2K, a record figure for the event.
    Attendees traveled from such far-away places as China, Hong Kong,
    Thailand, Taiwan, Argentina, Peru, India and Kuwait.  Attendees and
    speakers came from over 500 different companies in all.  We had 35
    companies making presentations and offering tools and services at
    the QW2K Expo.

  * Best Paper Award, Best Presentation Award

    The QW2000 Best Paper Award, as voted by the advisory board, went to
    Dr. Jerry Gao (San Jose State University) for his paper "Design for
    Testability of Software Components (6A2)".

    The Best Presentation Award, as voted by the attendees, went to
    Alberto Savoia (Velogic Inc.).  Alberto will be presenting his talk,
    "The Science of Web Site Testing (3W1)," at QWE2000, being held in
    Brussels, Belgium on 20-24 November 2000.  Congratulations to both!

  * CD-ROM & Proceedings

    If you weren't able to join us or would like to surprise a friend
    with a copy, the QW2000 CD-ROM & Proceedings are available for sale.
    They are $50 each and contain all the conference papers and
    presentation materials, as well as Exhibitor and Sponsor
    information.  You can conveniently order through the web site at:

    or order by phone, FAX, or email to .


           New Newsletter for IT Managers by Patrick O'Beirne

You may remember that for three years I had a monthly column in "Irish
Computer" magazine on how to solve Year 2000 problems. Well, after a
rest from that I have now decided to set up a newsletter especially for
IT managers, with a particular focus on the euro changeover, software
quality, and occasional excursions into hot topics like information
security, viruses, e-business, and the usual stuff of our IT life.

My June newsletter is a report on:

Enterprises 2002
Round Table on SME preparations, hosted by the European
        Commission at Brussels, 6 June 2000.

        Take up of the euro
        Perception and Risk
        Introduction of the Notes and Coins
        Accounting System Changeover
        Software and Systems Implications
        Actions and Incentives
        Euro Cross-border payments
        SME case studies

To subscribe to this newsletter visit:


                        Call For Papers/Presentations

       4th Annual International Software & Internet Quality Week/Europe

                             20-24 November 2000

                              Brussels, Belgium

                 CONFERENCE THEME: Initiatives For The Future


QWE2000 is the fourth in the continuing series of International Software &
Internet Quality Week/Europe Conferences that focus on advances in internet &
software test technology, quality control processes, software system safety
and risk management, WebSite performance and reliability, and improved
software process.

QWE2000 papers are reviewed and selected by a distinguished International
Advisory Board made up of industrial and academic experts from Europe and
North America.  The QWE2000 Conference is sponsored by SR/Institute, Inc.

The QWE2000 Conference Theme, Initiatives For The Future, focuses attention on
the opportunities for improvement and advancement in the internet and
client/server fields for the coming decades.

The mission of the QWE2000 Conference is to increase awareness of the
importance of internet & software quality and the methods used to achieve it.
QWE2000 seeks to promote internet & software quality by providing
technological education and opportunities for information and exchange of
experience within the software development and testing community.


QWE2000 is soliciting 45-minute and 90-minute presentations, full & half-day
standard seminar/tutorial proposals, 90-minute mini-tutorial proposals, and
proposals for participation in a panel and "hot topic" discussions on any area
of internet & software testing and automation.


        Abstracts and Proposals Due:            30 June 2000

        Notification of Participation:          1 August 2000

        Camera Ready Materials Due:             22 September 2000


Abstracts and session proposals should be 1-2 pages long, and should provide
enough detail to give the QWE2000 International Advisory Board an
understanding of the final paper/presentation.

Please include with your submission:
   o  The paper title, complete postal mailing and e-mail address(es), and
      telephone and FAX number(s) of each author.
   o  The primary author -- who is presumed to be the presenter -- should be
      named first.
   o  Three keywords or key phrases that describe the paper.
   o  A brief biographical sketch of each author.
   o  One photo [of the primary author].

Please indicate if your target audience for your paper/presentation is:
   o  Application Oriented
   o  Management Oriented
   o  Technical or Technology Related
   o  Internet/Web Oriented

Also, please indicate if the basis of your paper/presentation is:
   o  Work Experience
   o  Opinions/Perspectives
   o  Academic Research

You can complete your submission in a number of ways:

   o  Email your abstract and other information to .  The material
      should either be an ASCII file or in PDF format.  Be sure to include all
      of your contact information.  (This method is preferred.)

   o  Mail your abstract (or send any other questions you may have) to:

      Ms. Rita Bral
      Executive Director
      Software Research Institute
      1663 Mission Street, Suite 400
      San Francisco, CA  94103  USA

      Phone: [+1] (415) 861-2800
      FAX: [+1] (415) 861-9801

For Exhibits and Vendor Registration for the QWE2000 Conference, please call,
FAX, or Email your request to the attention of the QWE2000 Exhibits Manager.
You should contact the QWE2000 team as early as possible because exhibit
space is strictly limited.


                            TSEPM Subscriber Notes

Well, the May issue will be out within the next 3 days.  We've almost "caught
up" to getting the issue out on-time, but not quite.  We hope to be fully
caught up with the June issue.

Our advertising sponsor for this issue is MarotzTel, a developer of custom
telecommunication billing systems.  Unlike general purpose software
development shops, MarotzTel has developed templates, requirement taxonomies,
processes, planning models, and reusable components and architectures all
focused on telecommunication billing.  They have built several successful
billing systems for telecommunication companies, ISPs, and others in the
telecom industry.  If you have a need in this area, or are aware of someone
they should be talking to, please visit their web site at .

William H. Roetzheim
Trends in Software Engineering Process Management (TSEPM)
13518 Jamul Drive,
Jamul, CA  91935-1635  USA


        Quality Week 2000 - "Ask the Quality Experts!" Panel Summary
                                (Part 1 of 3)

      Note: This discussion is an approximate transcript of the "Ask
      the Quality Experts!" panel session at QW2000.  The questions were
      posed via a web page sponsored by Microsoft (courtesy of Panel
      Chair Nick Borelli), where attendees voted on each topic.  The
      top-voted questions were answered first.

                   Ask the Quality Experts! Panel Members:

                       Mr. Nick Borelli, Microsoft, USA
                Dr. John D. Musa, Independent Consultant, USA
                       Prof. Lee Osterweil, UMASS, USA
                         Mr. Thomas Drake, ICCI, USA
                   Mr. Robert Binder, RBSC Corporation, USA

*** What is the best way to test a website?

The internet track at Quality Week this year provided a lot of information.
I would recommend "Notes From the Front Lines:  How to Test Anything and
Everything on a Web Site" by Ted Fuller if you didn't get a chance to attend.
Most software testing practices apply:  change control, bug tracking, release
management, triage, spec reviews, etc.  Special cases do exist - web testing
tools differ (link checkers, code validators, load testing, multi-user,
peak or event driven), as do unique browsers & configurations, security,
reliability, usability, and performance.

It's important to identify what product it is that you're testing.  Then
develop a test plan.  Pitfalls to watch out for:  don't make assumptions just
because it's a web page; it still adds up to bugs and data loss.  The cost of
bad quality with a web application can be on a much higher scale.  With
"normal" software, the user might call customer support, whereas an outage
with a web application can cost thousands of dollars per minute.  It's good
to have a test plan that addresses this possible financial damage.  The
traditional model doesn't hold in the web environment, for example with
production drops.  The end-to-end transaction-based process needs more
attention for web-based applications.

If you have a web page where user identity is critical to correct delivery of
content, traverse links inside and outside the site to see if you can find a
place where the user's identity can be lost.  Knowing user expectations is
key, as is sizing the hardware appropriately.  Usability is important; the
site needs to be easy to use.  Content is important.  Testing reliability is
not that much different than for a conventional software program.

Lots of vendors are selling test tools; check out their products to help test
your web application.
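To illustrate the kind of link-checking tool mentioned above, here is a
minimal sketch in Python (a generic illustration, not any vendor's actual
product) that extracts the link targets from a page so they can be checked:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href target of every anchor tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # anchor tags carry their target in the href attribute
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all anchor targets found in an HTML document, in order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<p><a href="/home">Home</a> and <a href="http://example.com">out</a></p>'
print(extract_links(page))  # -> ['/home', 'http://example.com']
```

A real link checker would then fetch each extracted URL and flag broken
responses; that network step is omitted here to keep the sketch
self-contained.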

*** I have no prior experience in QA/Testing, but I am our only tester.  Where
do I start?

Acquire understanding or knowledge about what you've been given to test.  Make
sure to ask - "What do you want to know?"  In the end you don't want your
customer to be unhappy.  Insist that others in the organization be specific
about what they want out of the role.

Find out how users are going to use the system.

Seek out the customer.  Who is the intended customer?  Where do they live?
What is their domain?

Set expectations on what you plan to accomplish since you'll probably have a
one to many relationship with development organization.

Ask what you're doing as the only tester with no experience.

Get a hold of some of the good available books like Testing in the Real World.
Determine worst possible situation and exercise that.

*** How do I pass an ISO9000 audit?

Document process thoroughly and accurately.  Individuals will come in to test
how you're doing the process.  Educate the people in your organization on what
you've documented so that there's consistency.  There are some negatives to
it; one being that it takes a lot of time - in one example, it took two
staff-weeks.  It can also inhibit process improvement since, once the audit
is approved, it can be hard to change.

One panelist suggested that this is the wrong question and gave an example of
a time when a professor gave out the final exam questions at the beginning of
the course.  Many of the students focused on just getting answers to those
questions without attempting to gather any other knowledge throughout the
course.

The ISO9000 audit should be a by-product of what they're really up to.

*** What's the best software inspection method?

Requirements.  If you do something up front it has more impact.  The greatest
contributions are made by getting a bunch of diverse people together and
having them each do the inspection.  This had a lot more value than having
folks come together in a meeting and building consensus.

It is not as necessary to have face-to-face meetings as people thought.
Software inspections can be quite effective when people are geographically
dispersed.
There are several competing software inspection methods; I'm not familiar
with all of them.  The core activity is asking people to think carefully
about what they're looking at, and asking producers of software to think
more carefully about what they're producing.  The best inspection process is
one that's actually done.

This really supports the power of inspections.  Half of all errors are
actually discovered at the requirements level.  By testing the requirements
up front you can save a lot of time in testing.

Question from audience on the different levels of inspections.  Reviews are
typically with a larger audience, larger project.  The idea is you get several
people together with different points of view.

What matters is not the performance of the review but the preparation of the
reviewers.

Also important is the diversity of the reviewers.

The real question is:  What are you actually looking for?

*** Does test automation really work? If not, why not?


Maybe, or sometimes.  A lot of things can be automated; a lot of things
couldn't possibly be automated.  The important question is:  "What are you
trying to do?"  If you know that, then you can decide if automation will help
you.  This question is vague.

I'm a big proponent of test automation.  Important consideration is expected
life of the test suite versus expected labor.  Test automation can be buggy
and is never as easy to do as you might think.  It may take months and months
of work before it is usable.  Doing tests over and over and over without
automating them is just plain stupid.

Question is:  "What is appropriate to automate?"  Use appropriate technology
for appropriate tests.

*** Is it true that testers are usually uneasy about whether the testing
executed was enough or not?  If so, it seems like setting "Exit Criteria" is
the way to go.  Let me hear in detail about these "Exit Criteria".

Any good tester should feel uneasy when you are about to ship.  Every time I
have been around ship mode I have been second-guessing, it comes with the
territory.  As far as exit criteria, does a checklist help me to feel better?
No, but all along in the product we should have milestones with certain exit
criteria.  You can know how much testing was done and how many changes were
made.  Important questions can be answered like:  What is the mean time
between failure?  Are we checking the developers?  But at the end game there
is a judgment process that is inevitable.  The triage process is critical.  It
considers cost of delay, risk, cost of change, etc.  If you have Exit
Criteria, make sure you communicate to your test team that you might waive
part of your criteria.
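One of the milestone measures mentioned above, mean time between failures, is
straightforward to compute from a failure log.  A minimal sketch (the
timestamp unit is arbitrary; hours are assumed here):

```python
def mtbf(failure_times):
    """Mean time between failures, given sorted failure timestamps (hours)."""
    if len(failure_times) < 2:
        raise ValueError("need at least two failures to measure an interval")
    # the gaps between consecutive failures
    intervals = [later - earlier
                 for earlier, later in zip(failure_times, failure_times[1:])]
    return sum(intervals) / len(intervals)

# four failures observed at hours 0, 10, 30, and 60 of testing
print(mtbf([0, 10, 30, 60]))  # -> 20.0
```

Tracked milestone by milestone, a rising MTBF is one piece of objective
evidence to weigh in the end-game judgment the panel describes.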

There is very little chance you will have a 100% assurance on anything.  Make
sure that your client understands that there is no such thing as exhaustive
testing.  Agree on the degree of assurance for the different factors.  Do not
take on a task that says there are no bugs in the product.  Make reasonable
requirements, and create a test plan that stands a good chance of meeting
those requirements.

Set priorities.  Run a user test, and use that as an exit criterion.  This
can help to fill in documentation.  Focus on whether you have met customer
needs.

*** Whatever happened to "mutation testing"?

This was an idea where you take a set of test cases and determine the
validity of that set.  To do this, you'd take the program, create many
slightly altered copies ("mutants"), run the test data through the mutants,
and see what percentage of the mutants returned something different from the
original.  If > 90%, then the test data was probably pretty good.
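The mechanism described above can be sketched in a few lines of Python.  The
mutants here are hand-written stand-ins for what a mutation tool would
generate mechanically:

```python
def original(a, b):
    return a + b

# hand-written "mutants": each alters the original in one place, standing
# in for the mechanically generated mutants a real tool would produce
mutants = [
    lambda a, b: a - b,
    lambda a, b: a * b,
    lambda a, b: max(a, b),
]

def mutation_score(test_inputs):
    """Fraction of mutants 'killed' (distinguished from the original)
    by at least one test input."""
    killed = sum(
        1 for mutant in mutants
        if any(mutant(a, b) != original(a, b) for a, b in test_inputs)
    )
    return killed / len(mutants)

print(mutation_score([(2, 2)]))         # weak data: a*b == a+b when a == b == 2
print(mutation_score([(1, 2), (0, 5)]))  # stronger data kills all mutants
```

By the >90% rule of thumb quoted above, the single test input (2, 2) would be
judged inadequate, while the second set kills every mutant.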

There are still folks doing research in this area, and a conference on the
topic is going to be held this year.

People who were behind the technology were making outlandish claims about
its usefulness, which hurt its credibility and acceptance.

The quality of test data sets (test cases) can itself be tested.  If you
want to know about one thing, look at its dual.

If you want to know what a function is, look at the function's inputs and
outputs.

As time has gone on, people have tended to forget about it.  The upcoming
conference may bring it into the forefront.

                              (To Be Continued)


                 New "Honor" Virus Discovered; Commented Upon

We received the following notification about a new HS virus, to wit:

> From: 
> Subject: New Virus...
> To: QTN Readers
> This virus works on the honor system.
>          Please delete all the files on your hard disk, then forward
>          this message to everyone you know.
> Thank you for your cooperation.

In keeping with our usual procedure of checking these kinds of things out, we
sent this notification on to several nationally known software testing and
virus detection/prevention experts, one of whom responded thusly:

> From: 
> To: "Edward Miller" 
> Subject: Re: New Virus...
> It just goes to show how inept virus makers can be.  Obviously, you
> should first forward the message to everyone you know and only then
> delete all the files.  How can you send this virus out, or anything
> out, for that matter, after you have deleted all the files on your hard
> drive?
> It reminds me of the barbarian explaining procedures to an underling:
> "No, first rape, then, pillage, and only after, burn -- not burn, pillage,
> rape."
> Despite the obvious flaws in the above virus, note that this one
> has probably had a much greater circulation than the original
> "I Love You" virus.


           eValid: Changing the Way You Think About WebSite Testing

Our new eValid(tm) family of WebSite test products and WebSite test services
aims to change the way you think about testing a WebSite.

This new consolidated offering -- which integrates our unique testing, tuning,
loading, and monitoring technology under a single brand name -- is based on
eValid's unique Test Enabled Web Browser(tm).  The eValid engine runs on
Windows 98/NT/2000.

eValid is a user-oriented test engine, the platform for our eValid monitoring
service, and the internal technology basis for our WebSite Quality consulting
work:  producing load experiments, building complete WebSite Test Suites,
and doing WebSite page tuning.

eValid as a test engine performs essentially all functions needed for detailed
WebSite static and dynamic testing, QA/Validation, and load generation.
eValid has native capabilities that handle WebSite features that are
difficult, awkward, or even impossible with other methods such as those based
on viewing a WebSite from the Windows OS level.

eValid has a very rich feature set:

  * Intuitive on-browser GUI and on-web documentation.
  * Recording and playback of sessions in combined true-time and
    object mode.
  * Fully editable recordings/scripts.
  * Pause/SingleStep/Resume control for script checkout.
  * Performance timings to 1 msec resolution.
  * Content validation, HTML document features, URLs, selected text
    fragments, selected images, and all images and applets.
  * JavaScript and VBScript fully supported.
  * Advanced Recording feature for Java applets and ActiveX controls.
  * Event, timing, performance, tuning, and history charts that
    display current performance data.
  * Wizards to create scripts that exercise links on a page, push all
    buttons on a FORM, and manipulate a FORM's complete contents, etc.
  * The LoadTest feature to chain scripts into realistic load tests.
  * Log files are all spread-sheet ready.
  * Cache management (play back tests with no cache or an initially
    empty cache).
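To illustrate the general idea behind chaining scripts into a load test
(a generic sketch of the concept, not eValid's actual mechanism), a recorded
scenario can be replayed concurrently by several simulated users:

```python
import threading

def run_session(scenario, results):
    """Play back a chained scenario of scripts as one simulated user."""
    for script in scenario:
        results.append(script())  # list.append is atomic in CPython

def load_test(scenario, users):
    """Replay the scenario from `users` concurrent simulated sessions."""
    results = []
    threads = [threading.Thread(target=run_session, args=(scenario, results))
               for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# a "scenario" chains two recorded scripts; 5 users replay it concurrently
scenario = [lambda: "login", lambda: "browse"]
print(len(load_test(scenario, 5)))  # -> 10 script executions in total
```

A real load tool would replace the lambda stand-ins with recorded browser
scripts and measure response times per step.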

Try out a DEMO version of eValid (it has limited function; no key required!)
by downloading from:

Or, download the FULL version and request an EVAL key from:

The powerful eValid LoadTest feature is described at:

Take a quick look at the eValid GUI and other material about the product
at:
A detailed feature/benefit analysis of eValid can be found at:

An order form that gives current eValid product pricing is found at:

NOTE: To block eValid announcements send email to .


          Real World Strategies for Improving the Test Process

                             By Marco Dekkers
                          ProductManager Testing
                         KZA kwaliteitszorg B.V.


Test is no longer considered a four-letter word in modern software
development organizations (although technically it is). With the increased
interest in structured testing, the subject of improving the test-process
has also become a valid point of discussion. To support efforts in this
field several models for test-process improvement have been developed in
recent years. In addition to providing a theoretical reference for
improving the process, these models also strive to inform the user on
practical issues regarding effective ways of implementation.  Examples of
these models include the Test Improvement Model, Test Organization
Maturity, Testability Maturity Model, Test Process Improvement and the
Testing Maturity Model. One thing most of these models have in common is
that they draw a parallel with the SW-Capability Maturity Model (SW-CMM).
Another common factor is that most of them were developed after 1995.
Although organizations can benefit from using one or more of these
models, there are significant drawbacks when using them. Examples of these
drawbacks include:
    * Questionnaires used to ascertain the maturity of the test-process
      sometimes contain an excessive number of questions

    * Typically behavior is diagnosed and not the effects of that
      behavior. For instance, problems organizations are faced with are
      not included in the diagnosis

    * Improvement suggestions are generic and have to be tailored to
      specific circumstances. Most models do not offer any assistance
      towards this end

    * Organizational goals are not taken into account in some models

To achieve the benefits of test-process improvement this article offers
some real-world advice on how to plan, execute and evaluate an improvement
program. Models are viewed as tools that can be used to the organization's
benefit, if used with care. Some general advice on the use of models
includes: use the model as a checklist, not as a guideline; focus on
organizational problems, not on applying the advice contained in models;
focus on measurable goals, not on maturity levels; and above all, be
pragmatic.

Based on several years of experience in the field of testing and test-
process improvement, KZA has developed a set of best practices.  Together
these lay the foundation for an effective strategy for test-process
improvement.

Before the KZA approach is discussed, it is important to establish in
which circumstances test-process improvement is desirable. One relevant
consideration is whether IT and the quality of software play an important
role in the success of the organization. If not, putting effort into
improving the test-process is not likely to be of particular relevance.

Generally speaking, IT plays a central role in most modern organizations,
so in most cases the answer to this question will be "yes".  Another
important factor is whether the organization is willing and able to
change. Willingness typically only exists if there is general awareness of
(the impact of) problems resulting from deficiencies in the test-process.
Also, goals have to be stated clearly (as will be discussed in the next
paragraph) and the expected benefits have to outweigh the costs.

The starting point is the formulation of goals the organization wishes to
achieve. These should be specific, measurable, acceptable for the parties
involved, realistic and placed within a specific timeframe. The goals for
test-process improvement should be in line with, and contribute towards
reaching, business goals. Next the current state of affairs is assessed
using a short questionnaire. This focuses not only on behavior but also on
the problems the organization is faced with. Interviews are used to
develop a more complete understanding of the situation.  After it has
become clear what the goals are and the current situation has been
assessed, attention is shifted to identifying possible solutions. During
this phase it is of particular importance to involve members of the
organization. A consultant facilitates the process, but lets others
generate as many ideas as possible. Improvement suggestions are then
checked against a set of criteria.

Examples of these criteria include:  actions should contribute towards
reaching the goals, adequate resources can be made available and
management and workers are willing to support the proposed measures.
Improvement suggestions that satisfy the criteria are then discussed with
representatives of the organization. This can lead to valuable adjustments
(improvements) and significantly contributes towards acceptance. The next
phase consists of drawing up an action plan and gaining management
approval. A cost/benefit calculation is necessary to make the business
case for test-process improvement.  After the plan has been approved a
project manager is put in charge of an improvement team, which is made up
of members of the departments involved. A critical success factor is
effective communication with all layers of the organization. Change
management skills and techniques are also of vital importance since
resistance to change is likely to increase during the implementation
phase. Effective communication regarding the goals and activities,
training, support and negotiation are the keys to success. In order to
establish the effectiveness of the process improvement effort results are
evaluated both on a periodic and an event-driven basis. This may lead to
corrective actions.

In closing I would like to give some general advice. Contrary to widespread
notions, testing is not conducted in a vacuum. If possible, not only the
test-process but the entire software development process should be improved.
This will prevent sub-optimization from occurring. Also
be sure to incorporate "quick wins" in your action plan so results are not
only achieved "down the line", but rather can be demonstrated in the short
term. This will improve your chances of not only gaining, but also
maintaining management and workforce support.  Involve as many people as
possible in order to gain the necessary commitments. Be sure the
improvement team does not become a separate entity, separated from the
rest of the organization by a virtual brick wall. Widespread involvement
stimulates awareness, commitment and increases the chances of effective
implementation (in that order).

The approach described here is neither perfect nor revolutionary. It does
however work in the real world. Try it for yourself and let me know what
you think of it.

Currently, efforts are under way to describe our approach to test-
process improvement in more detail. During this process the approach will
be refined and adjusted to incorporate as many best practices and insights
as possible; therefore, I'd be grateful to receive your feedback. I am very
interested in finding out what challenges you have been faced with when
trying to improve the test-process and how you have dealt with them.
Hopefully, with your help, we can further develop our approach and share
the results in a future issue of QTN.

       ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

QTN is E-mailed around the middle of each month to over 9000 subscribers
worldwide.  To have your event listed in an upcoming issue E-mail a
complete description and full details of your Call for Papers or Call for
Participation to "".

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should provide at
  least a 1-month lead time from the QTN issue date.  For example,
  submission deadlines for "Calls for Papers" in the January issue of QTN
  On-Line should be for February and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research, Inc.,
  and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items are the opinions of their authors or
submitters; QTN disclaims any responsibility for their content.

Xvirtual, Xflight, STW/Regression, STW/Coverage, STW/Advisor, TCAT, and
the SR logo are trademarks or registered trademarks of Software Research,
Inc. All other systems are either trademarks or registered trademarks of
their respective companies.

           -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to CANCEL a current subscription, to CHANGE an
address (a CANCEL and a SUBSCRIBE combined) or to submit or propose an
article, use the convenient Subscribe/Unsubscribe facility at:


Or, send E-mail to "" as follows:

   TO SUBSCRIBE: Include this phrase in the body of your message:

           subscribe your-E-mail-address

   TO UNSUBSCRIBE: Include this phrase in the body of your message:

           unsubscribe your-E-mail-address

   NOTE: Please, when subscribing or unsubscribing, type YOUR email
   address, NOT the phrase "your-E-mail-address".

		Software Research, Inc.
		1663 Mission Street, Suite 400
		San Francisco, CA  94103  USA

		Phone:     +1 (415) 861-2800
		Toll Free: +1 (800) 942-SOFT (USA Only)
		Fax:       +1 (415) 861-9801
		Web:       <>

                                ## End ##