Java Developer Magazine

Benchmark Bust-Up in Javaland

(November 8, 2002) - The J2EE community got a major call to arms this week when J2EE consultancy The Middleware Company released a report on the performance of the J2EE PetStore application running on both .NET and J2EE.

The report seemed to suggest that Microsoft's .NET performed much better than its equivalent J2EE implementation, outperforming it by a huge margin as opposed to beating it by a whisker. A time for rejoicing at the Redmond campus?

Well, not quite: some Java developers believe that .NET was given an unfair edge. So what are their arguments?

The Story So Far...

Here are the circumstances, as far as JDJ News Desk has been able to establish them. The Middleware Company (TMC), owner-managers of TheServerSide.com, approached Microsoft to help produce a "fair" benchmark suite comparing .NET and J2EE implementations of the J2EE PetStore application. The motivation for this initiative dates back to Microsoft's first such report, which claimed that the PetStore ran faster on .NET than on a leading J2EE application server.

The major J2EE companies (Oracle, IBM, and Sun, for example) came out in protest against Microsoft's "findings" at the time, saying the benchmark program neglected to exercise the full J2EE platform.

On the face of it, the re-match was straightforward: TMC would optimize and tune the PetStore application for J2EE, and Microsoft would be given the opportunity to optimize and tune its implementation for .NET. However, the scales began, shall we say, to tilt when Microsoft invited The Middleware Company to its Redmond campus, paying all of TMC's travel and subsidiary costs for the time spent there.

When asked by JDJ News Desk whether a similar offer had been made to help configure and optimize their respective application servers, Sun, IBM, and BEA Systems all said that no approach was ever made to them.

"Benchmarking of this nature," comments Eric Stahl of BEA, "where an application is chosen and run by someone for performance metrics, is a highly flawed methodology."

"The very reason for TPC, SPEC and the prior ECperf organization," he continues, "is to create a fair environment for running and reporting of these performance metrics in a way that properly compares the products in question. And, even with all of the overhead of a committee and its rules, the numbers can still be controversial."

"PetStore Not Designed for Performance Analysis," Says Sun

So was Sun's PetStore application the best application to use for this test?

Sun has always been quick to point to the fact that the PetStore application was never designed for performance analysis. As Glen Martin, Sun Microsystems' Lead Product Marketing Manager for J2EE, notes, "In the PetStore, given the primary goal of teaching, lucidity wins any contest. This is entirely different from the way the decision is made in real application development. People should use the PetStore to learn techniques and the application of patterns. Perhaps we haven't been clear enough on this point."

Many Java supporters have come out against this obvious oversight on TMC's part. JDJ's very own J2EE editor, Ajit Sagar, sums it up very nicely: "Comparisons are done for a purpose. If the purpose of this report was to give a fair analysis of how J2EE performs as compared to .NET, Sun as well as Microsoft should have been involved in deciding what the guidelines for the test should have been. This seems more like a 'fixed' match. Was Sun approached for time, space, and resources as Microsoft was? Did they agree that Pet Store was the right application to test for performance (obviously that has not been its purpose)? Was the J2EE application deployed in Sun's facility (as the .NET one was in Redmond)? If the answer to these questions is NO, then the report is definitely one-sided and benefits Microsoft. It should be ignored by the J2EE community and treated as another marketing gimmick."

Not everyone believes, though, that TMC did a disservice to the community. Greg Leake, Lead Product Manager for .NET at Microsoft, notes: "TMC did choose a J2EE architecture that represents Sun's recommended approach using Entity Beans."

That may well be, but as any architect will tell you, the "recommended" approach is not always the best approach in the real world. The Middleware Company are reputed custodians of this area, publishing well-known best practices for J2EE applications. But herein lies the mystery: as Eric Stahl at BEA comments, "...the rewritten PetStore does not follow *any* of TMC's best practices. It's a joke."

TMC chose to pit not one but two leading J2EE application servers against Microsoft's .NET implementation. The two servers were not officially named, but it was obvious from the published configuration files which two were eventually used.

"Seriously Let Down the Java Community"

A startling fact, one that both JDJ editor-in-chief Alan Williamson and Glen Martin of Sun find hard to believe, is that "AppServer #B" stopped responding to requests after just four hours. Alan Williamson comments, "For this very reason alone, TMC had a duty to call in that particular J2EE vendor to make sure they exhausted all configuration options. This was a negligent error on a massive scale. TMC have seriously let down the Java community." Such issues wouldn't have affected the .NET implementation: Microsoft had its own highly skilled .NET engineers poring over every single line and configuration option to squeeze out the best possible performance.

The question remains: why? Why would a highly respected J2EE authority be so careless? Rickard Öberg believes there is politics behind this debacle. "TMC really disqualified itself, as they were recently bought by a company who has Microsoft as a strategic partner," he says, a reference to Precise Software Solutions, Inc.

"Whoever writes such a report needs to make sure that the results of it aren't released to any of the related parties before official publication," Öberg continues. He has published evidence of Microsoft having had access to the report before publication.

Leaving the potential skullduggery aside for the minute: at a technical level, did TMC do the best possible job for the J2EE community? Rickard Öberg believes not, and has produced a rather detailed technical analysis of the PetStore implementation that TMC used. It would appear TMC didn't even use the latest PetStore application. As Öberg points out: "The 'original PetStore' they used was 1.1.2, which is over two years old. The most recent PetStore (1.3.1) already [has] many of the optimizations I outline in the report."

Nigel Thomas of SpiritSoft agrees. "The comparison deals with just one possible (EJB-centric) application architecture, taking a straightforward tightly coupled/synchronous approach to application development. That's pretty much an 'entry level' approach," he continues, arguing "that true application scalability comes from building loosely coupled applications, with non-urgent processing pushed into the background using (JMS) queues. That gives virtually unlimited scalability."
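The pattern Thomas describes can be sketched in a few lines. A minimal, self-contained illustration (using `java.util.concurrent`'s `BlockingQueue` as a stand-in for a JMS queue; the class and "order" names are hypothetical, not from the report):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class LooseCouplingSketch {
    // Hypothetical stand-in for a JMS queue: the front end enqueues work
    // and returns immediately; a background worker drains it later.
    static final BlockingQueue<String> orderQueue = new LinkedBlockingQueue<>();

    static int runPipeline() throws Exception {
        // Front end: accept requests without waiting for them to be processed
        orderQueue.put("order-1001");
        orderQueue.put("order-1002");

        // Background worker: handles non-urgent processing asynchronously
        ExecutorService worker = Executors.newSingleThreadExecutor();
        Future<Integer> processed = worker.submit(() -> {
            int count = 0;
            while (orderQueue.poll(100, TimeUnit.MILLISECONDS) != null) {
                count++;  // a real application would fulfil the order here
            }
            return count;
        });
        int total = processed.get();
        worker.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("processed " + runPipeline() + " orders");
    }
}
```

The point of the design is that the producer never blocks on the consumer: under load, work piles up in the queue rather than in held-open request threads, which is what gives the loosely coupled architecture its headroom.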

But can a benchmark be developed that would allow us to compare "apples with apples" as opposed to oranges? Many believe it can, if all parties can be brought together, and as Glen Martin says, "...a benchmark needs to be developed."

TMC has listened to the community and as a result will be re-running the tests, this time giving the J2EE application vendors the same opportunity afforded to the Microsoft team. Says Greg Leake from Microsoft, "We welcome this set of tests, and will eagerly participate with a .NET implementation of the benchmark application to showcase our technologies."

Looks like this passionate issue will go to Round #3, but this time, all parties will be participating, so the results should be far more meaningful.

JDJ News Desk looks forward to that report.

External Resources:

The Middleware Company Report

Rickard Öberg Analysis


JDJ News Desk monitors the world of Java to present IT professionals with updates on technology advances, business trends, new products and standards in the Java and i-technology space.
