News: Giga: Do your own J2EE AppServer benchmarks!
- Posted by: sharat nellutla
- Posted on: March 15 2002 09:15 EST
* Editors Note - March 19th *
Floyd Marinescu: I spoke to John Meyer, who was allegedly interviewed in the article on cw360. The article on cw360 did not properly represent Giga's position and did not present their position in its entirety.
Analyst firm Giga Information Group has advised users to conduct their own Java application server performance tests, as suppliers have been unwilling to provide comparable test data. According to Giga, Java application server suppliers have been unwilling to publish "apples to apples" (comparable hardware and software configuration) results for the ECperf benchmark.
Read Do your own Java benchmarks
- Giga: Do your own J2EE AppServer benchmarks! by Ray Harrison on March 15 2002 12:50 EST
- Giga: Do your own J2EE AppServer benchmarks! by Cary Bloom on March 15 2002 13:56 EST
- Giga: Do your own J2EE AppServer benchmarks! by Rashid Jilani on March 15 2002 15:02 EST
- Giga: Do your own J2EE AppServer benchmarks! by Pankaj Kumar on March 16 2002 19:18 EST
- Giga: Do your own J2EE AppServer benchmarks! by Murali Varadarajan on March 16 2002 22:36 EST
- Giga: Do your own J2EE AppServer benchmarks! by Krishnan Subramanian on March 18 2002 11:53 EST
- Giga: Do your own J2EE AppServer benchmarks! by Tom Daly on March 23 2002 02:41 EST
Duh. I'm glad that Giga has been so brilliant as to come up with that solution. That approach has been advocated on this list, among other places, and it's what my company has done: test the various app servers on in-house hardware and database solutions. It gives a truly excellent picture of how the app server will behave in the environment for which it has been or will be purchased, and we can look at them head-to-head on the same hardware/database setup. Vendors aren't likely to do that any time soon, but their customers can for internal purposes.
Those are really smart people. So it took them five years to find that out. Amazing.
These guys are actually paid to make these elementary recommendations?
Yeah - I've sent my resume to Giga so I can get paid too!
"Yeah - I've sent my resume to Giga so I can get paid too!"
You forget the part where you show leading results for whatever vendor pays you the most money.
How can I read it, Sharat? When I click on "Read Do your own Java benchmarks" I am prompted for a userid and password.
Click ok without entering anything - it will continue to the article.
The password prompt seems to come up randomly. You can also just go to www.cw360.com and get to the article manually.
It is interesting that the recommendation is to actually run ECPerf in-house. I would think that what makes more sense for a customer is to benchmark AppServers with the kind of application and load they are expecting.
Granted, characterizing the anticipated use and load conditions may not be feasible for most customers most of the time. Also, many may not have the time, resources, and skill to develop specialised benchmarks. However, what is certainly feasible is to have a good understanding of certain usage patterns within the industry and to have specialized benchmarks for them. This would allow customers to pick a benchmark closer to their intended usage and then run it in-house on their own hardware. These benchmarks do not even have to be "official" standards.
In the long run, this would be better for AppServer vendors as well, since they could tune their offerings for specific categories of usage.
Who would create these? I think it makes sense for companies specializing in AppServer consulting, education or even vendors themselves. One could even make money on this.
Looking at the success of VolanoMark in JVM benchmarking, I think this approach makes a lot of sense.
Perfect... I agree with this. There is no way we can run the benchmark on all the possible combinations. After looking at the latest ECperf results, I guess there might be a million (even a billion) combinations of the application stack:
d) Databases (database drivers counted separately...)
So there is no way they can do that. Remember, J2EE means we have a lot of options :-)
>> In the long run, this would be better for AppServer vendors as well, so that they could tune their offerings for specific category of usage.
Yes they should start thinking in this direction.
>> Who would create these? I think it makes sense for companies specializing in AppServer consulting, education or even vendors themselves. One could even make money on this.
I guess this will be very expensive for the customer. Just because there are a lot of options, it is not fair to ask the customer to pay for the selection!
I made this point earlier in this forum; let me repeat it. J2EE means choices. You can make any choice today and change it without any problem in the future. Just as we buy a sports car for today's needs and later trade it in for a family car, vendors should design their pricing so that there is a trade-in policy for these hardware and software licenses (after all, they are not cheap!). I wish to see such pricing schemes. ;)
If you're doing in-house testing, would it not be better to test your own applications rather than ECPerf? Tools like JMeter make it pretty painless to develop these end-user load test simulations. I would have thought that this would provide more useful information than testing someone else's application.
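To make the point concrete, here is a minimal sketch of the kind of load-test harness that tools like JMeter automate for you: a number of concurrent "users" each issuing a batch of requests while you count completions and measure elapsed time. All names and numbers here are illustrative, and the request body is a simulated unit of work so the sketch is self-contained; in a real test it would be an HTTP call against your own application.

```java
// Hand-rolled load-test sketch: USERS concurrent threads each issue
// REQUESTS_PER_USER requests; we count completions and report timing.
public class LoadTestSketch {
    static final int USERS = 10;
    static final int REQUESTS_PER_USER = 20;
    static int completed = 0;

    // Guard the shared counter across user threads.
    static synchronized void recordCompletion() { completed++; }

    // Stand-in for a real request (e.g. an HTTP GET to your app).
    static void doRequest() {
        Math.sqrt(System.nanoTime()); // simulated server-side work
        recordCompletion();
    }

    public static void main(String[] args) throws InterruptedException {
        Thread[] users = new Thread[USERS];
        long start = System.currentTimeMillis();
        for (int u = 0; u < USERS; u++) {
            users[u] = new Thread(new Runnable() {
                public void run() {
                    for (int r = 0; r < REQUESTS_PER_USER; r++) doRequest();
                }
            });
            users[u].start();
        }
        for (int u = 0; u < USERS; u++) users[u].join();
        long elapsedMs = System.currentTimeMillis() - start;
        System.out.println("completed=" + completed);
        System.out.println("elapsedMs=" + elapsedMs);
    }
}
```

The value of a dedicated tool over a sketch like this is everything around the loop: ramp-up schedules, response validation, and latency reporting.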
> It is interesting that the recommendation is to actually
> run ECPerf in-house. I would think that what makes more
> sense for a customer is to benchmark AppServers with the
> kind of application and load they are expecting.
Well, I would agree with you if it were that easy to port whatever application you write to a different AppServer vendor's product (not to mention knowing how to fine-tune them for every vendor's offering). It currently takes quite a while to port a complete [non-trivial] application to run satisfactorily on another vendor's AppServer.
ECperf on the other hand tests pretty much everything there is in an AppServer, and if vendors were to provide their own ECperf kits (with the vendor specific tuned deployment descriptors), it makes it that much easier for a customer evaluating AppServers to run the test in-house.
In the above case, all that would be needed is to edit the datasource configuration to point to whatever database and JDBC driver you use, and then off you go.
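As an illustration of how small that edit is, here is a hypothetical datasource fragment; the element names and values are made up for this sketch and do not follow any particular vendor's actual descriptor schema. Pointing the kit at a different database would mean changing only the driver class and connection URL:

```xml
<!-- Hypothetical datasource entry; names and values are illustrative. -->
<datasource>
  <jndi-name>jdbc/ECperfDB</jndi-name>
  <driver-class>oracle.jdbc.driver.OracleDriver</driver-class>
  <url>jdbc:oracle:thin:@dbhost:1521:ecperf</url>
  <user>ecperf</user>
  <password>changeit</password>
  <pool-size>30</pool-size>
</datasource>
```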
PS: If you did have time on your hands, you could port your own application to different AppServer vendors' products and benchmark them. But as mentioned above, you would need to know how different AppServers work and how they can be tuned to get the most out of any benchmark. ECperf kits, that way, could be 'ready-to-run' benchmarks.
> Well, I would agree with you if it were that easy
> to port whatever application you write to a different
> AppServer vendor's product.
I wasn't talking about a customer porting his/her application to different vendors' platforms. I was talking about specialized benchmarks, not necessarily built by customers, that are already ported on different platforms.
> ECperf on the other hand tests pretty much
> everything there is in an AppServer, and if vendors
> were to provide their own ECperf kits
I was reading an analysis of the economy and its relationship to the monsoon (seasonal rain) for a large and very populous country. The analyst was puzzled that official records showed above-normal rainfall for the past three years, yet the country, where 72% of the population is employed in agriculture and related activities, faced sluggish demand for consumer goods. More data gathering revealed that there are different ways of "declaring" above-average rainfall:
1. the percentage of "sub-divisions" (from a total of 35) that had above-normal rainfall;
2. the percentage of "districts" (from a total of more than 428) that had above-normal rainfall;
3. whether the total amount of rainfall was above normal.
It turned out that the official claim of above-normal rain was based on (1), but overlooked the fact that rain was unevenly distributed within many of the large "sub-divisions", causing a significant percentage of the population to suffer.
ECPerf may be a very well designed benchmark, but the final outcome, BBops per unit of money, could conceal important aspects that are relevant to a customer. The only way to rectify this is to have different metrics focusing on different aspects.
Availability of "specialized benchmarks" (and not "micro-benchmarks") would also make benchmarking an engineering effort and not a marketing effort, which ECPerf seems to have become.
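The concealment point can be shown with a toy calculation. Assume two hypothetical servers with made-up response-time samples: they score identically on the average, but one hides a severe worst case that only a second metric (here a 95th percentile) reveals.

```java
import java.util.Arrays;

// Toy illustration: one aggregate number can hide behavior that matters.
// The latency samples below are invented for this sketch.
public class MetricsSketch {
    // Arithmetic mean of the latency samples (milliseconds).
    static double mean(int[] ms) {
        long sum = 0;
        for (int v : ms) sum += v;
        return (double) sum / ms.length;
    }

    // 95th-percentile latency via nearest-rank on the sorted samples.
    static int percentile95(int[] ms) {
        int[] sorted = ms.clone();
        Arrays.sort(sorted);
        return sorted[(int) Math.ceil(0.95 * sorted.length) - 1];
    }

    public static void main(String[] args) {
        int[] serverA = {100, 100, 100, 100, 100, 100, 100, 100, 100, 100};
        int[] serverB = {50, 50, 50, 50, 50, 50, 50, 50, 50, 550};
        // Both average 100 ms, but B occasionally stalls badly.
        System.out.println("A mean=" + mean(serverA) + " p95=" + percentile95(serverA));
        System.out.println("B mean=" + mean(serverB) + " p95=" + percentile95(serverB));
    }
}
```

A single score, like a single national rainfall figure, averages away exactly the distribution a particular buyer may care about.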
> I wasn't talking about a customer porting his/her
> application to different vendors' platforms. I was
> talking about specialized benchmarks, not necessarily
> built by customers, that are already ported on different
> platforms.
And what would ECperf be? A non-specialized benchmark? I suggest you take a look at the benchmark source code :) And it is a 'standardized' benchmark per se, developed & approved by the community (that is, including the J2EE vendors). If I interpret your remarks, you seem to want to do the same (again!).
> I was reading an analysis about the economy and its
> relationship . . .
I fail to see the relevance, though I do appreciate that you took the time to type it all in ;)
> ECPerf may be a very well designed benchamrk but the
> final outcome, BBop per unit of money, could conceal
> important aspects that are relevant to a customer. The
> only way to rectify is to have different metrics
> focussing on different aspects.
You seem to be ignoring an important aspect here. ECperf was designed to be a benchmark that compares apples to apples. What it has [de]generated into as a result of marketing hype (as you yourself point out) is another thing altogether. Which is why Giga encourages an evaluator of J2EE vendors to run the benchmark (which, as benchmarks go, is probably the only standardized one there is - not counting Pet Store, of course ;) themselves. In that case, there is no reason to be taken in by the marketing hype surrounding the ECperf benchmark.
PS: I am not disagreeing with you completely, but with any benchmark - you can expect vendors to hype up things and marketing departments to go overboard (that's inevitable except possibly in an autocracy ;) But that does not mean the benchmark (in itself) is a failure.
> If I interpret your remarks, you seem to
> want to do the same (again!).
No, one "standard" benchmark is enough.
However, that doesn't mean that all customers use only that to make their decisions. What I am saying is that customer needs vary, and there is a case for specialized benchmarks that, in isolation or combination, can better serve them. The same way lmbench is for system calls and linpack for mathematical operations, we might need (i) a J2EE benchmark for serving JSP pages to a large number of users; (ii) a J2EE benchmark for stateless/stateful EJB operations; (iii) a J2EE benchmark for transactional operations [assuming that certain users of EJBs might forego transactions for speed]; and so on...
> I fail to see the relevance. I can appreciate
> you take the time to type it all in ;)
Consider this: if rainfall were a saleable commodity and metric (1) was used to sell it, but the expected results were not obtained because metrics (2) and (3) got in the way, then (1) has simply failed the buyer in making the right decision. The parallel with any one benchmark is obvious. If it is not, I will let the matter rest here...
> PS: I am not disagreeing with you completely,
> but with any benchmark - you can expect vendors
> to hype up things and marketing departments to go
> overboard (that's inevitable except possibly in
> an autocracy ;) But that does not mean the benchmark
> (in itself) is a failure.
I never implied that ECPerf is a failure. Even in a world where other J2EE benchmarks bloomed, ECPerf would be the queen (due to its official status). What I am saying is that the existence of more benchmarks, each focusing on a different aspect, can only allow customers to make more informed decisions and vendors to better target specific market segments.
(Note: comments and views in this thread are my own and don't necessarily reflect those of my employer.)
I just want to comment on the discussion about using ECperf OR testing your own code/J2EE server.
In my opinion the above is pretty obvious advice and shouldn't be limited to J2EE and app servers; in fact, I would expand the advice to say: "When developing enterprise applications, make sure you build performance and scalability testing into your development cycle early." There is just no substitute for running and sizing your own application.
Having said that, we know companies that are running ECperf "in house" and building their expertise with the J2EE platform and with different application servers. Also, as I have mentioned before, ECperf is designed to drive J2EE product improvements, and I believe results are starting to demonstrate this clearly. ECperf is also designed to give end users useful information about application server performance, which again I believe it does.
So yes: test your own application, compare J2EE products with ECperf, consider the published ECperf figures, and use all the tools available to make your enterprise development successful!