A common difficulty faced during the execution of portal projects has been the technical justification of the chosen portal product. The observation has been that, in most cases, the business requirements could be met by any of the leading products in the industry. This is exacerbated in the case of portal implementations using Java Technology, as almost all products conform to industry standards and can be used in combination with any application server or web server. To address this issue, a performance test of the leading industry products needs to be conducted.
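As an illustration of that standards conformance, below is a minimal portlet sketch written against the JSR 168 portlet API; the class name and markup are hypothetical, but any compliant portal server should be able to deploy such a portlet without product-specific code.

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.portlet.GenericPortlet;
import javax.portlet.PortletException;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;

// Hypothetical example: a standards-based portlet that any JSR 168 compliant
// portal server should be able to host, independent of the application server.
public class HelloPortlet extends GenericPortlet {
    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<p>Hello from a portable portlet.</p>");
    }
}
```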
The purpose of this exercise is to define the parameters and the environment for such a performance test. It also provides future directions on how the setup can be extended.
The main benefits of this exercise would be:
- Provides a basis for suggesting a hardware architecture for portal implementations.
- Provides a basis for comparison between the various portal servers, albeit with some caveats.
- Provides a test setup that could serve as a training ground for further expansion and knowledge gain.
- Provides design inputs for portal applications based on the results.
Issues not addressed
The exercise is not intended to address the following concerns, though it could be extended to cover them later:
- Performance testing of the different portal servers across different environments. It is a known issue that some portal servers work better in specific environments; however, the objective here is to evaluate all the portal servers on a standard minimum environment.
- Performance testing scenarios with clustered environments at various levels or with multiple CPUs per server. The primary reason for excluding this is the cost and setup effort involved.
- End-to-end performance testing including the application server, database server, directory server, etc. These factors would introduce additional complexity and cause the exercise to lose focus.
- Providing industry-standard benchmarking figures. Because the points above are not addressed, the results of the performance test cannot be claimed as an industry-wide benchmark.
The following table gives a brief description of the various factors that need to be measured.
| Factor | Description |
| --- | --- |
| Transaction Time | The time required by the server to complete one transaction |
| Error Rate | The number of errors generated at the server per second |
| Processor Utilization | The percentage of processor time used by the server process |
| Response Time | The time taken by the server to respond to a request, including network delays |
| Memory Usage | The memory usage pattern of the server process |
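As a rough sketch of how the response time and error figures above could be sampled, the snippet below issues repeated requests against a hypothetical portal URL and records the elapsed time and failure count; in the actual exercise these measurements would come from a dedicated load-testing tool rather than hand-written code.

```java
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical probe: measures average response time (including network delay)
// and counts errors over a fixed number of sequential requests.
public class ResponseTimeProbe {
    public static void main(String[] args) throws Exception {
        URL portalPage = new URL("http://localhost:8080/portal/home"); // placeholder URL
        int requests = 100;
        int errors = 0;
        long totalMillis = 0;

        for (int i = 0; i < requests; i++) {
            long start = System.currentTimeMillis();
            HttpURLConnection conn = (HttpURLConnection) portalPage.openConnection();
            try {
                if (conn.getResponseCode() >= 400) {
                    errors++;          // server-side errors contribute to the error rate
                }
            } catch (Exception e) {
                errors++;              // connection failures also count as errors
            } finally {
                conn.disconnect();
            }
            totalMillis += System.currentTimeMillis() - start;
        }

        System.out.println("Average response time (ms): " + (double) totalMillis / requests);
        System.out.println("Errors: " + errors);
    }
}
```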
These factors need to be measured against varying values of the following base factors:
| Factor | Description |
| --- | --- |
| No. of tabs to be loaded | The number of tabs to be loaded for the user |
| No. of users | The total number of users simulated by the test |
| No. of user groups | The number of user groups |
| No. of user roles | The number of user roles |
| No. of portlets per tab | The number of portlets to be displayed to the user on each tab |
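One way to organise these varying base factors is as a small scenario matrix that the load-testing harness iterates over; the sketch below is illustrative only, and both the class name and the example values are assumptions.

```java
// Hypothetical scenario holder: one entry per combination of the varying base factors.
public class PortalTestScenario {
    final int tabsPerUser;
    final int concurrentUsers;
    final int userGroups;
    final int userRoles;
    final int portletsPerTab;

    PortalTestScenario(int tabsPerUser, int concurrentUsers, int userGroups,
                       int userRoles, int portletsPerTab) {
        this.tabsPerUser = tabsPerUser;
        this.concurrentUsers = concurrentUsers;
        this.userGroups = userGroups;
        this.userRoles = userRoles;
        this.portletsPerTab = portletsPerTab;
    }

    public static void main(String[] args) {
        // Example values only; the real test matrix would be chosen per project.
        PortalTestScenario[] matrix = {
            new PortalTestScenario(3, 50, 2, 2, 4),
            new PortalTestScenario(3, 200, 5, 3, 4),
            new PortalTestScenario(5, 500, 10, 5, 8),
        };
        for (PortalTestScenario s : matrix) {
            System.out.println(s.concurrentUsers + " users, " + s.tabsPerUser
                    + " tabs, " + s.portletsPerTab + " portlets/tab");
        }
    }
}
```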
The following factors would remain constant for all the tests.
| Factor | Description |
| --- | --- |
| Type of Portlets | Whether the portlets are local or remote; if local, whether they are based on XML, JSP, or URL scraping |
| Hardware Configuration | The hardware configuration of the server |
| Logging Level | The logging level in the web server |
| Think Time | The think time configuration |
| Keying Delay Time | The keying delay time configuration |
| Client Machine Configuration | The configuration of the machine that hosts the browser |
| Authentication Mechanism | The authentication mechanism for the portal application |
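A simple way to keep these factors identical across all runs is to hold them in one shared configuration object that every test reads; the property names and values below are placeholders, not recommendations.

```java
import java.util.Properties;

// Hypothetical configuration of the factors held constant across all test runs.
public class ConstantTestFactors {
    public static Properties defaults() {
        Properties p = new Properties();
        p.setProperty("portlet.type", "local-jsp");      // same portlet type in every run
        p.setProperty("server.logging.level", "WARN");   // same web server logging level
        p.setProperty("client.think.time.ms", "5000");   // same simulated think time
        p.setProperty("client.keying.delay.ms", "200");  // same simulated keying delay
        p.setProperty("auth.mechanism", "form-based");   // same authentication mechanism
        return p;
    }
}
```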