Since its introduction over three years ago, Enterprise JavaBeans (EJB)™ technology has maintained unprecedented momentum among platform providers and enterprise development teams alike. This is due to the fact that the EJB server-side component model simplifies the development of middleware components that are transactional, scalable, and portable. J2EE application servers supporting EJB technology reduce the complexity of developing middleware by providing automatic support for services such as transactions, security, database connectivity, and more. To accomplish this, J2EE application server products assume the burden of this complexity, and are gradually becoming the operating system on which future server-side applications will be built.
The increasing complexity of J2EE application server products has therefore made their performance evaluation and benchmarking something of a black art. Evaluating J2EE products requires considerable effort and can be very subjective. Some evaluation criteria, such as the quality of vendor support and company viability, are necessarily subjective. Other criteria, such as performance, scalability, standards conformance, and correctness of the implementation, need not be subjective if carefully crafted, realistic benchmarks and well-defined procedures for running those benchmarks are available.
In response to the need for a realistic benchmark for J2EE servers, the ECperf benchmark and kit were developed by the Java community through the Java Community Process. ECperf is the standard benchmark for J2EE enterprise application servers. This article provides eight reasons why ECperf is the right way to evaluate the performance of J2EE servers. It is an excerpt from a larger whitepaper on the subject by The Middleware Company, available in PDF format.
Some of the unique aspects of the ECperf benchmark are that ECperf is a real-world application with the right level of complexity, that there are strict guidelines for reporting, reviewing, and publishing results, and that the Total Cost of Ownership (TCO) of the system running the benchmark must be included when reporting results.
ECperf is an excellent benchmark for measuring the performance and scalability of J2EE enterprise applications, for the following eight reasons:
|Reason # 8||ECperf is gaining acceptance in the community as an independent and neutral benchmark|
ECperf was created by Sun Microsystems with the assistance of the Java community to objectively measure the performance of J2EE application servers. Practitioners who focus on EJB have been touting ECperf since its inception. Industry research firms and analysts have been learning more about ECperf and in recent months are recommending it as the only objective way to evaluate J2EE application servers.
|Reason # 7||ECperf has an application server focus|
Other benchmarks do exist to measure the performance of the client tier, web server tier, or the database server, but not the middle tier. ECperf, on the other hand, measures the performance of operations on business objects in the middle tier.
In small to medium sized applications, the middle tier is often unimportant. Today, such applications are constructed as a combination of web presentation technology (e.g., Servlets & JSP) and database technology.
However, the middle tier becomes much more important as the application grows in size and complexity, or in the number of users or transactions it must handle. The center of the application becomes the middle tier: the application server.
The performance and scalability of the application server therefore have a major impact on the long-term health and cost of ownership of enterprise applications. A benchmark that focuses on the application server is crucial in judging its quality.
|Reason # 6||ECperf exemplifies modern software development techniques|
The ECperf application is designed and implemented as a set of interacting reusable and modular Enterprise JavaBeans (EJB). The application design reflects the state of the art in the design and implementation of distributed applications.
Contrast this with other benchmarks, which are designed only to 'make things go fast' or to measure specific functionality, and which are poor examples of how one would really architect, design, and code a real-world application.
This aspect of ECperf becomes even more important as more companies base their applications on EJB. Such companies should use ECperf when selecting their development and deployment environments.
|Reason # 5||ECperf is 'real-world' and has the right level of complexity|
The primary goal of the ECperf workload is to model the performance and scalability of J2EE enterprise applications as seen and implemented by real customers. Isolated low-level, or 'unit' benchmarks, such as tests of client-server round-trip requests, are insufficient as real-world benchmarks, because they test the performance of simple operations but not of complete applications.
The ECperf benchmark has the characteristics of real-world systems. For reasons of interest, scope, and familiarity, ECperf uses manufacturing, supply chain management, and order/inventory as the 'storyline' of the business problem. This is a meaty, industrial-strength distributed application. It is heavyweight, mission-critical, worldwide, 7x24, and necessitates use of a powerful, scalable infrastructure. Many Fortune 500 companies are interested in this application domain because they base their businesses on such IT systems.
The ECperf application captures intra-, extra-, and inter-company business processes. The application is easy to understand, straightforward to deploy, and runs in a reasonable time with excellent repeatability. ECperf thus has the right level of complexity for evaluating how real enterprise applications would perform on the J2EE environment being tested.
|Reason # 4||ECperf measures the completeness, compliance, and scalability of a J2EE environment|
The business problem that the ECperf application addresses necessitates the use of distributed worldwide services and data whose exact location is not known a priori. The application manages persistent data, and needs to be highly available and secure.
In order to meet these requirements, ECperf uses the services required by enterprise applications, as defined by J2EE, including transactional components, distributed ACID transactions, naming services, object (enterprise bean) persistence (both bean-managed and container-managed), and others.
The business problem requires a completely scalable infrastructure. As the size of the modeled business grows, the services, the amount of data, the size of data, the number of users, the number of interactions per user, the number of transactions per time period, and the number of geographically distributed sites also grow. To cope, the server environment must adjust the sizes or numbers of CPUs, memory, application server instances, Java virtual machines, threads, clusters, connection pools, databases, and much more. 'Scalability' is a measure of how well an application server makes use of these additional resources.
ECperf measures how effectively a J2EE environment scales. This aspect of ECperf, especially combined with the aspect of Total Cost of Ownership (Reason #1), makes ECperf the obvious way to compare J2EE application servers.
All J2EE compliant application server products should be able to run ECperf. As such, the performance and scaling capabilities of different application servers can be compared.
This ensures that the application server running the ECperf benchmark is indeed a compliant, robust and complete implementation of the J2EE standard.
The J2EE certification test suite, of course, contains numerous unit tests for J2EE compliance. However, ECperf is the first large compliant application developed via the Java Community Process that stress-tests much of the J2EE functionality all at once, in a non-trivial manner.
Beware of the J2EE compliance of application servers that 'cannot run' the ECperf application out of the box.
For more information on J2EE compliance, completeness, and compatibility, please visit http://java.sun.com/j2ee/compatibility.html.
|Reason # 3||ECperf prohibits the modification of the application code and the SQL it executes|
With ECperf, the benchmark measures the performance and scalability of an entire application. ECperf prohibits the modification of the application and the SQL the application executes. As such, the benchmark results from different application servers are comparable. Application server vendors are forced to focus on the tuning of the application server itself and not allowed to play games by tweaking the application or its SQL.
On the other hand, ECperf does not prohibit, but encourages, the modification of deployment aspects of the application. Deployment aspects include the content of deployment descriptors, how the application is packaged, how many containers, machines, Java virtual machines, clusters are in the deployed system, how beans are allocated to containers, etc. This provides the flexibility needed to get the best run time performance and scalability out of an application server.
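For instance, one standard deployment aspect that may be tuned is the transaction attribute assigned to bean methods in the ejb-jar.xml deployment descriptor. The following is a minimal, hypothetical fragment (the bean name is illustrative, not taken from the ECperf source):

```xml
<!-- Hypothetical ejb-jar.xml fragment: deployers may adjust settings
     such as this transaction attribute without touching application
     code or SQL. -->
<assembly-descriptor>
  <container-transaction>
    <method>
      <ejb-name>OrderBean</ejb-name>
      <method-name>*</method-name>
    </method>
    <trans-attribute>Required</trans-attribute>
  </container-transaction>
</assembly-descriptor>
```

Vendor-specific descriptors (pool sizes, caching, clustering) are similarly fair game, since they affect only deployment, not the application itself.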
|Reason # 2||ECperf specifies guidelines on how to submit, report, and publish results|
Many benchmarks, and their results, are malleable and can be susceptible to being twisted into whatever an application server vendor wants to demonstrate. This is especially true of 'benchmarks' invented by application server vendors, but can also be true of independent benchmarks.
To avoid this problem, the ECperf Expert Community has created strict guidelines on submitting and reporting results, which prevent such twisting. The guidelines ensure that the results of the benchmark are accurate, and can be verified and duplicated given the appropriate software and documentation.
|Reason # 1||ECperf helps you estimate the Total Cost of Ownership (TCO) of an application over time|
An ECperf test produces a price-performance metric, the price per business operation per minute. A complete system configuration is priced, the benchmark is run and the metric is calculated by dividing the total system price by the number of business operations per minute.
The cost of the system must be reported with the results.
The cost of the system includes:
- the hardware cost (application server and database server machines)
- any operating system cost
- the cost of the application server and database server license(s).
This aspect of ECperf makes the results of the benchmarks across application server products, operating systems, and hardware, especially accurate and useful.
As you scale the tested system to meet hypothesized increased demand, you reprice the tested system, rerun the benchmark and recompute the price-performance metric. This estimates how much the application will cost over time as your needs grow.
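The arithmetic behind the metric is straightforward. As a back-of-the-envelope sketch (all prices and throughput figures below are made-up numbers, not published ECperf results), the price-performance metric is the total system cost divided by the measured business operations per minute:

```java
// Illustrative ECperf-style price/performance arithmetic.
// All figures are hypothetical, not actual benchmark results.
public class PricePerformance {

    // Price-performance = total system cost / business operations per minute.
    static double pricePerformance(double totalSystemCostUSD, double bbopsPerMin) {
        return totalSystemCostUSD / bbopsPerMin;
    }

    public static void main(String[] args) {
        // Hypothetical priced configuration: hardware + OS + licenses.
        double hardware = 120000.0;   // app server + database server machines
        double os       = 5000.0;     // operating system cost
        double licenses = 25000.0;    // app server + database licenses
        double totalCost = hardware + os + licenses;  // $150,000

        double throughput = 10000.0;  // hypothetical measured ops/minute

        System.out.printf("$%.2f per operation per minute%n",
                          pricePerformance(totalCost, throughput));
    }
}
```

Repricing a scaled-up configuration and rerunning the benchmark simply plugs new numbers into the same division, which is what makes the metric easy to track as capacity grows.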
This section outlined the reasons why ECperf is appropriate; in our experience, they apply to most environments. Not all of these reasons may apply to yours, but most will.
The preceding was an excerpt from "Eight Reasons ECperf is the Right Way to Evaluate J2EE Performance". The complete PDF version of this document includes an overview of the ECperf benchmark application.
For the complete document in PDF format, please view:
8 Reasons Why ECperf is the Right Way to Evaluate J2EE Performance