Next: Availability Up: Performance Patterns: Automated Scenario Evaluation Previous: Using the DSKI

Conclusions and Future Work  

The performance of CORBA-based applications implemented as sets of objects is greatly influenced by the application context and by the performance of the ORB endsystem. Application developers need to evaluate how candidate application object architectures will perform within heterogeneous computing environments, but the lack of standard, user-extendable performance benchmark suites that exercise all aspects of the ORB endsystem under realistic application scenarios makes this difficult. This paper introduced the Performance Pattern Language and the Performance Measurement Object, which address these problems by providing, under NetSpec control, an automated script-based framework within which extensive ORB endsystem performance benchmarks can be efficiently described and automatically executed.
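To illustrate the general idea of a script-driven benchmark framework (this is a hypothetical sketch, not the actual Performance Pattern Language or NetSpec syntax; the scenario fields and function names below are invented for illustration), a declarative scenario description can be mechanically expanded into the full set of individual benchmark runs it implies:

```python
# Illustrative sketch only: a minimal driver that expands a declarative
# scenario description into individual benchmark configurations. The real
# framework uses NetSpec scripts; every name here is an assumption.
import itertools

# Hypothetical scenario description covering a small parameter space.
scenario = {
    "pattern": "client_server_roundtrip",
    "message_sizes": [64, 1024, 8192],   # bytes per invocation
    "client_counts": [1, 4, 16],         # concurrent client objects
    "samples": 3,                        # repetitions per configuration
}

def run_benchmark(pattern, size, clients):
    """Stand-in for invoking the actual ORB benchmark; returns a
    placeholder metric so the sketch is runnable."""
    return size * clients

def expand(scenario):
    """Enumerate every configuration the scenario describes and run each."""
    for size, clients in itertools.product(
            scenario["message_sizes"], scenario["client_counts"]):
        for sample in range(scenario["samples"]):
            yield {
                "pattern": scenario["pattern"],
                "size": size,
                "clients": clients,
                "sample": sample,
                "result": run_benchmark(scenario["pattern"], size, clients),
            }

results = list(expand(scenario))
print(len(results))  # 3 sizes x 3 client counts x 3 samples = 27 runs
```

The point of the sketch is that a short description expands automatically into many runs (27 here), which is what makes broad, detailed scenario coverage practical compared with hand-built benchmarks.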

The tools described have been implemented, and the viability of the framework they provide has been demonstrated through small but non-trivial sets of performance evaluation scripts. The examples presented show that the full range of evaluation information can be gathered and a rich set of performance scenarios examined. The automated nature of the script-driven framework is also important because it makes it possible to describe and conduct a large set of evaluation experiments covering an adequately diverse and detailed set of scenarios and performance metrics.

Performance evaluation of CORBA-based distributed applications, and of candidate object architectures, is an extremely important and difficult problem. Current benchmarking and testing methods fall short of the coverage required because the necessary scale and complexity are daunting. The tools described here make possible a significant increase in the scale, complexity, and level of detail of performance evaluation studies, thus significantly advancing the state of the art.

Our future work will include the creation of new test types and performance patterns. We are particularly interested in extending this testing approach to include execution of applications under real-time constraints. We will use this set of tests to drive an investigation of what kinds of system support can improve the real-time performance of ORB-based applications. We will concentrate on a time-constrained event service and on the integration of operating system scheduling, I/O, and ORB-level operations to improve time-constrained communication among objects.


Sridhar Nimmagadda
3/23/1999