This paper describes an approach to developing a test plan for measuring the impact of adding a new, web-based application on both the web server and the client-side browser. The reality of carrying out the test with a real product under development will then be described. The test platform will include, at a minimum, NT Server 4.0, Microsoft Internet Information Server (IIS) 4.0, and Microsoft Internet Explorer 5.0. One goal is to write a repeatable test plan that can be reused as revisions are made to the software.
An additional goal is to gather a baseline measurement of the system running the new application. That baseline will then be compared with measurements of future releases of the software to assess the performance impact of modifications to the application.
Included in the detailed test plan will be a report template that will be used to communicate the findings each time the performance test is run.
Why a Load Simulation on the Web Server Is Not Enough:
In this case, the application has a component that runs as part of the web server as well as Java function calls that are executed as the page is rendered in the browser. A load simulation that issues GETs to the web server exercises the server, including the application component that runs there. However, if the simulated load does not use real browsers, then no browser is rendering the web pages and the Java function calls are never executed. Therefore, a simulated browser load on the web server does not exercise all of the application and does not give a true performance picture of it.
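To make the distinction concrete, the following is a minimal sketch of such a GET-only load in Java (the class name, request count, and target URL are illustrative assumptions, not details of the product or test plan described here). It fetches the page and discards the body; nothing in it parses or renders HTML, so the client-side Java calls embedded in the page are never run.

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class GetLoad {
        public static void main(String[] args) throws Exception {
            // Hypothetical target; substitute the URL of the page under test.
            URL url = new URL("http://testserver/app/page.asp");
            int requests = 100;
            for (int i = 0; i < requests; i++) {
                long start = System.currentTimeMillis();
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                InputStream in = conn.getInputStream();
                byte[] buf = new byte[4096];
                // Read and discard the body: the HTML is never parsed or
                // rendered, so client-side code in the page never executes.
                while (in.read(buf) != -1) { /* discard */ }
                in.close();
                conn.disconnect();
                long elapsed = System.currentTimeMillis() - start;
                System.out.println("request " + (i + 1) + ": " + elapsed + " ms");
            }
        }
    }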
However, such a simulated load does serve one purpose in our test scenario: we can use it to baseline the component that runs as part of the web server. That baseline can then be compared with future releases of this component of the application.
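For example, if the per-request timings printed by a run like the sketch above are saved with each release, comparing summary statistics (such as average and maximum response time) from the current run against the stored baseline gives a direct measure of whether a change to the server-side component has affected its performance.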
Tools Needed:
Several types of tools will be necessary to generate the performance data. Each of the tools used will be described.
The Reality of Running the Test:
The real-world experience of actually making the performance runs to gather data will be discussed. Additional benefits of collecting performance data were found, including the uncovering of hidden errors in the software. The final measurements used in the analysis will be presented in a report template.
Pat Garverick is a member of the E-Business Quality Assurance Team at Landmark Systems Corporation (NASDAQ: LDMK). Her main professional interest is the methodology of implementing requirements-based testing in a software vendor environment. Determining acceptable performance levels for web-based applications, as well as developing realistic methods of testing the performance impact of new software added to an existing system, are important components of writing and testing against requirements.