Discussions

News: .NET StockTrader and J2EE Interoperability

  1. .NET StockTrader and J2EE Interoperability (18 messages)

    Microsoft’s Greg Leake recently penned a digest article describing a benchmark the company produced that compared a .NET server setup to an IBM WebSphere server setup. The task at hand was a stock trading application. Leake tells us that he is quite aware of the cynicism that can greet such studies, but that he finds benchmarks can elicit worthwhile comments. On the TSS Interoperability Blog, we've posted a summary PDF of the study. Among the highlights [from the MS point of view]: Windows Communication Foundation running over MSMQ had 67% better throughput than the IBM WebSphere JMS/SIB message queue configuration using EJB/entity beans. Pulling EJB out of the equation and using JDBC yielded a less startling throughput differential of 16.2%. An interesting bit about the benchmark, and a reason we provide it here on TheServerSide Interoperability Blog, is that it may serve to illustrate bi-directional interoperability between .NET and WebSphere using Web services and WCF. The .NET StockTrader application can interoperate with the J2EE services of the WebSphere Trade 6.1 application, according to Leake. Conversely, the WebSphere app can use the services of the .NET StockTrader, he said. Greg has posted such benchmarks before; the TSS editors spoke with him in response, to try to get the studies to yield more light than heat. Hopefully the studies are more neutral than they once were: while MS "won" the benchmark, Microsoft is providing more detail than it used to, and the interoperability angle factors in as well. There's a lot here, though. One aspect is that the benchmark states that MS tried to mirror the architectures as closely as possible; however, paradigms tend not to cross platforms very well.
    ... the configurations compared must be equivalent. This means that the applications must produce the exact same functionality and processing behavior in the configurations compared... The .NET StockTrader, while based on .NET and not J2EE, was designed to mirror most of the Trade 6.1 configurations possible with this testing goal in mind...
    It should also be noted that the J2EE application used "standard patterns" for J2EE that don't necessarily yield results that look good in a benchmark: to wit, the data access model looks like EJB 2.1, with CMP entities. While this can perform well, let's just be gentle and say that the record of this kind of architecture "in the wild" hasn't... been very good. However, they also tested a direct mode that didn't use the EJB layers. The benchmark results (starting on page 19 of the full PDF) certainly show that the EJB layers didn't fare well compared to the direct mode, and - remembering that this is an MS benchmark - the .NET solution walks all over the J2EE solution (in most cases -- there are a few areas in which the WebSphere solution is faster/better, non-persistent message queue processing for example). It's also faintly amusing that the WebSphere solution was run on both Windows and Red Hat Linux -- with Windows having the advantage (in most cases) even there. One last note: the benchmark covers tuning parameters for each server, but doesn't include JVM information. For example, the hardware platforms had 16GB of RAM, but there's no mention of garbage collection tuning, or even heap space tuning - one wonders what would have happened with a tuned VM or (perhaps) different VMs, with a final choice relying on whichever JVM and garbage collection tuning performed best. Take a look at it, and tell the community what you think.
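    Since the writeup wonders what a tuned VM would have done, here is a hypothetical sketch of the kind of heap and garbage-collection flags in question, for a Sun/HotSpot-style JVM. The values and the jar name are illustrative assumptions for a 16GB box, not settings from the benchmark:

```shell
# Illustrative HotSpot-style JVM tuning (assumed values, not the benchmark's):
# - pin -Xms/-Xmx to the same size so the heap never resizes mid-run
# - select the throughput-oriented parallel collector
# - log GC activity to confirm pauses aren't skewing the results
java -Xms12g -Xmx12g -XX:+UseParallelGC -verbose:gc -jar trade-app.jar
```

    Whether flags like these would have closed any of the gap is exactly the open question the writeup raises.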

    Threaded Messages (18)

  2. As Joe fairly well noted in the editor's writeup, there are a large number of variables still out there in this test. It's definitely a better job of showing what was used in the test. But MS still ends up pitting multiple variables against multiple variables. Again, as written, we don't know what tuning there was. At the very least, we're using different OSes and different databases. We're now left to wonder how well the OS was tuned, and how well the test database was tuned. If MS really wanted to test platforms (i.e. .NET vs. J2EE), then they should've tested on a single architecture and single OS, with a single database system. In this case, Windows x86 (or even 64-bit nowadays) with MS SQL Server 2005, and the *ONLY* thing different being .NET vs. WebSphere. Honestly, even more so, I would've rather seen .NET vs. BEA WebLogic, JBoss, or even GlassFish. And since this is testing web services, how about .NET/WCF against JAX-WS? Eliminate the app server, and focus on the web services stacks. It's better than what has come out of MS most times, but it still ends up comparing a whole fruit basket rather than apples to apples. --Scott
  3. You can download the .NET version of the application, but I see nowhere to download the J2EE version. Is that the case? Without the Java version, this benchmark is meaningless.
  4. It is available at: http://www-306.ibm.com/software/webservers/appserv/was/performance.html (the main WebSphere performance site; see the link in the right-hand nav pane), or directly at: https://www14.software.ibm.com/webapp/iwm/web/preLogin.do?source=trade6 -Greg Leake Microsoft Corp.
  5. Tuning data

    Good comments; here is some more info, and some of my opinion as well :-). The full document (~100 pages) at http://msdn.microsoft.com/stocktrader has the detailed tuning data starting at page 89. This includes the heap sizes used for WebSphere on Windows and Linux. After trying various garbage collection modes, we found the default mode for WebSphere was just as good as the others for this app; other apps/workloads may be different. Anyone can freely try other modes, tuning settings, etc. If a *specific* setting is suggested that will yield much better results (>5% difference, say), I will happily re-test and re-publish results with that setting--an offer I have made on every past benchmark with IBM WebSphere I have done, but I never got a specific tuning setting to try from IBM. However, I can say many customers did replicate the past tests and found similar perf differences. They also point out areas where we need to do better with .NET, or areas where they would have implemented J2EE or .NET differently, and this feedback/debate I think is very good. The debate even internally is good, competition is good, and I hope the IBM perf team will renew their focus on Web service performance, at least to some extent, because of this. That's good for customers, J2EE or .NET. And I also plan to continue to focus on interop here, as both platforms are being adopted increasingly, and service-oriented architectures seem to be the ideal foundation to achieve black-box interop across implementation platforms, hardware, etc. As for apples-to-apples comparisons: this, to me, is feedback that needs some context. The only true 'apples to apples' comparison is one that pits the same two apps, on the same two platforms, against each other. Who would be interested in such a benchmark? Of course, the results would be the same. The nature of benchmarking is that you are always comparing *different* things that accomplish the same result.
    As long as the results, including functionality and behavior (caching, transactions, etc.), are the same, the benchmark is interesting (and fair) because it compares *different* variables (as long as those variables are disclosed so people know what is being compared). So an all-MS/.NET config compared to an all-IBM/J2EE config *is* interesting to many people, as this is what IBM consultants or MS consultants would start off recommending for a greenfield customer project. Other comparisons are interesting as well, but you can't do them all in one shot--but customers, with published code and tuning/test script details, can do the ones that interest them. -Greg Leake Microsoft
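    For anyone who wants to take Greg up on the re-test offer: GC policy on the IBM J9 JVM that WebSphere ships with is selected via the -Xgcpolicy flag. A hedged sketch of the modes one might compare follows; the heap sizes and jar name here are illustrative placeholders, not the published settings (those start on page 89 of the full document):

```shell
# IBM J9 GC policies one could compare on this workload
# (heap size and jar name are placeholders, not the published settings):
java -Xms1536m -Xmx1536m -Xgcpolicy:optthruput  -jar trade.jar  # throughput-optimized
java -Xms1536m -Xmx1536m -Xgcpolicy:gencon      -jar trade.jar  # generational/concurrent, favors short-lived objects
java -Xms1536m -Xmx1536m -Xgcpolicy:optavgpause -jar trade.jar  # trades some throughput for more even pauses
```

    If any of these beat the default by more than the ~5% threshold Greg mentions, it would qualify for his re-test offer.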
  6. Hm. This is totally not comparing typical .NET development patterns vs. typical Java development patterns. At this stage of the game, a typical Java web app isn't touching EJBs at all (save maybe for MDBs, the only really useful EJB). Also, IBM WebSphere is notorious for being a dog-slow app server. I'd like to see the same comparison using a BEA/JBoss/Spring/Hibernate/POJO stack, which is much more typical and less complex. One other thing - while performance is certainly something to measure, I would want to see a few other comparisons: 1. Scalability - *how* do these two technologies scale up and scale out, and what are the cost comparisons? 2. Developer productivity - how long did it take a typical developer using typical design patterns to implement the various components? 3. Developer satisfaction - how satisfied are the developers after implementation, and how maintainable is the code after optimization for the tests? I doubt M$ would want to publish those arguably more valuable stats.
  7. Other comparisons

    Would be interested in all the comparisons you mention; the only (relatively) easy one is scale up/scale out since you can actually perform real testing to get the data. As for IBM Trade 6.1, not sure how long it took IBM to build it; or how satisfied their developers are with it. Hard/impossible to determine. I was the only person at MS working on .NET StockTrader 1.0; it took me about 6 weeks to get the core app up and running on .NET/SQL Server in its various modes; then took me a lot longer to build/test the config management stuff which is separate (2 months). Some ideas there that still need polishing, for sure, but I was a one-man team (including benchmark tuning testing on all platforms) on this one, save for the WPF Smart Client done by Metallic. -Greg Leake Microsoft
  8. Re: Other comparisons

    This is silly--would you also consider comparing your .NET app's performance to COBOL and other technologies that are long since obsolete? And even then, why choose IBM WebSphere 6, which is well known to underperform? If you are trying to tell us that IBM writes crappy software... well. As someone has already noted, a more reasonable comparison would be against a JPA/Spring/Java 6 stack running on a lightweight container like Jetty Plus or Tomcat. The Java EE 6 platform will roughly specify such stacks in the form of profiles.
  9. Re: Other comparisons

    And even then, why choose IBM WebSphere 6, which is well known to underperform? If you are trying to tell us that IBM writes crappy software... well.

    As someone has already noted, a more reasonable comparison would be against a JPA/Spring/Java 6 stack running on a lightweight container like Jetty Plus or Tomcat.
    Eh... WebSphere performs pretty well on AIX/Power, and that kind of matters when you have scaled out of Intel-land.
  10. You're right ... here's proof

    IBM leading the pack in application server performance ... http://webspherecommunity.blogspot.com/2007/10/websphere-beats-bea-and-oracle.html
  11. Microsoft Response to Andrew's Blog

    I responded to Andrew's blog, and my response can be found there, per the link in his post above. Thanks, Greg Leake Microsoft Corporation PS: Andrew, sorry for the double post of my response on your blog; I accidentally posted to your JAppServer blog entry, then corrected it.
  12. Hm. This is totally not comparing the typical .NET development patterns v. Java development patterns. At this stage of the game, a typical java web app isn't touching EJBs at all (save maybe for MDBs, the only real useful EJB). Also IBM WebSphere is notorious for being a dog-slow app server. I'd like to see the same comparison using a BEA/JBoss/Spring/Hibernate/POJO stack, which is much more typical and less complex.

    One other thing - while performance is certainly something to measure, I would want to see a few other comparisons:

    1. Scalability - *how* do these two technologies scale up and scale out, and cost comparisons.

    2. Developer productivity - how long did it take a typical developer using typical design patterns to implement the various components?

    3. Developer satisfaction - how satisfied are the developers after implementation, and how much more maintainable is the code after optimization for the tests?

    I doubt M$ would want to publish those arguably more valuable stats.
    Not sure about 1., but I can tell you that 2. and 3. go to MS. The productivity is excellent, and I enjoy using C#/.NET. Many of my colleagues (who moved Java -> .NET) say the same. I'm using both JEE and .NET in projects and like both, but I generally find development with .NET more productive (nice language features, more homogeneous toolset).
  13. Microsoft’s Greg Leake recently penned a digest article describing a benchmark the company produced that compared a .NET server setup to an IBM WebSphere server setup.
    Judging by the traffic it attracts, I now think that TSS pays Microsoft for these things ;-) Peace, Cameron Purdy Oracle Coherence: The Java Data Grid
  14. Re: Who is paying for these things?

    Microsoft’s Greg Leake recently penned a digest article describing a benchmark the company produced that compared a .NET server setup to an IBM WebSphere server setup.


    Judging by the traffic it attracts, I now think that TSS pays Microsoft for these things ;-)

    Peace,

    Cameron Purdy
    Oracle Coherence: The Java Data Grid
    I feel like I'm reading a book that reminds me of a book I read 5 years ago or so :)
  15. Museum piece?

    Yawn... are we still fighting the Java/JEE vs. .NET battle? I thought the world had moved on...
  16. Re: Who is paying for these things?

    Microsoft’s Greg Leake recently penned a digest article describing a benchmark the company produced that compared a .NET server setup to an IBM WebSphere server setup.


    Judging by the traffic it attracts, I now think that TSS pays Microsoft for these things ;-)

    Peace,

    Cameron Purdy
    Oracle Coherence: The Java Data Grid


    I feel like I'm reading a book that reminds me of a book I read 5 years ago or so :)
    This just in, Beta is better than VHS.
  17. Well, I can assure you that it is certainly not IBM ;-) Best, Dmitriy GridGain - Grid Computing Made Easy
  18. The funny thing is, Java runs .NET apps better than .NET runs .NET apps. Maybe they should consider re-running the .NET version on a Java server using the Mainsoft IL-to-bytecode translator--then the .NET version will look even better! LOL. See: http://mainsoft.com/solutions/pdfs/PerformanceStudy.pdf J2EE is the old standard; Java EE is the current standard. I guess M$ is going to milk the whole '.NET is faster than J2EE' thing for all they can. Whatever.
  19. I work on WebSphere Application Server performance, and I wanted to share my thoughts on this article. My blog comments on some of the "findings" of the Microsoft article so you can understand what to take away from it. http://webspherecommunity.blogspot.com/2007/09/perspective-in-benchmarks-my-thoughts.html --- Andrew Spyker IBM - SOA Runtimes Architect, WebSphere Performance