State judge rules that EULAs cannot restrict benchmarks/reviews


  1. A New York state Supreme Court Justice last week ruled that Network Associates cannot restrict public benchmarks or reviews using its End User License Agreement (EULA). Normally this wouldn't be that big of a deal, but since this has been a restriction applied by virtually all of the J2EE appserver vendors, it could set a precedent allowing anyone to publish J2EE server comparison numbers.

    Read "Court: Network Associates can't gag users."

    Now, this was only a state judge, but it could get interesting if the ruling holds up in federal court, since the case will likely be appealed.

    Jason McKerr
  2. This clause is also in the .NET EULA, I believe.
  3. Looking for a way to make it fair

    I can see the point of view of the software publishers. There are so many ways to tweak performance in an application server.

    A few publishers reminded me of the EULA terms against performance benchmarks when I published The Performance Kit on TheServerSide.

    My first take was very libertarian: "That's not fair; I should be able to do whatever I want." But then I thought about what it would be like to be a publisher and how I would prepare the default settings for my app server. So on second thought, it feels right that I should be careful when publishing performance kits or their ilk.

    I wish there were no EULA restriction and each app server came with a document that began: "Follow these steps to set up our app server for a performance test..." A rough sketch of what such a test harness might look like follows below.
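
    For illustration only, here is a minimal sketch of such a harness: a warm-up pass followed by a measured pass. The class name, URL, warm-up count, and iteration count are hypothetical placeholders, not anything from a real vendor's kit:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Hypothetical sketch: hit one app server page, warm up first,
    // then measure average latency over a fixed number of requests.
    public class AppServerProbe {
        static long fetch(URL url) throws Exception {
            long start = System.currentTimeMillis();
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (InputStream in = conn.getInputStream()) {
                byte[] buf = new byte[4096];
                while (in.read(buf) != -1) { /* drain the response */ }
            }
            return System.currentTimeMillis() - start;
        }

        public static void main(String[] args) throws Exception {
            URL url = new URL("http://localhost:8080/petstore/index.jsp"); // placeholder URL
            for (int i = 0; i < 100; i++) fetch(url); // warm-up: let JIT and caches settle
            int iterations = 1000;
            long total = 0;
            for (int i = 0; i < iterations; i++) total += fetch(url);
            System.out.printf("avg latency: %.2f ms over %d requests%n",
                    (double) total / iterations, iterations);
        }
    }

    The warm-up pass is the point: app server numbers swing wildly until JIT compilation and caches settle, which is exactly the tuning sensitivity that makes publishers nervous.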

    -Frank Cohen, PushToTest.com
  4. IBM's EULA terms on benchmarks

    Here are IBM's EULA terms for disclosing benchmarks, from WebSphere Application Developer 5. They seem pretty reasonable to me:

    BENCHMARKING: You may disclose the results of any benchmark test of the Program or its subcomponents to any third party provided that you (A) publicly disclose the complete methodology used in the benchmark test (e.g., hardware and software setup, installation procedure and configuration files), (B) perform your benchmark testing running the Program in its Specified Operating Environment using the latest applicable updates, patches and fixes available for the Program from IBM, and (C) follow any and all performance tuning and "best practices" guidance available in the Program's documentation and on IBM's support web sites for the Program. If you publish the results of any benchmark tests for the Program, then notwithstanding anything to the contrary in any agreement between you and IBM, IBM shall have the right to publish the results of benchmark tests with respect to your products provided IBM complies with the requirements of (A), (B) and (C) above in its testing of your products.


    -Frank, PushToTest.com
  5. I think this could really be important from the perspective of vendors, potential buyers, and users. I have found some interesting stuff within ECPerf (or whatever it's called these days), and I think it's good for vendors to compete on a standardized playing field (at least in terms of the code base).

    However, the perf numbers cover only a limited set of scenarios. I would love to hear about the use of J2EE and .NET in different scenarios: where things worked and excelled, and where they didn't. This kind of anecdotal knowledge, and even more formal testing in different scenarios, would likely help buyers a great deal.

    I understand the need for vendors to protect themselves as well. However, it's a restriction on clients. If you want to have better relationships with clients, make things less restrictive and intrusive.

    I'm not saying give away the farm. People who work hard to develop something deserve to get paid if it's good. This just might give clients more knowledge and power when it comes time to decide what's good.

    -Jason McKerr
    Northwest Alliance for Computational Science and Engineering
  6. Maybe now more authoritative benchmarks of different J2EE application servers can be published.

    I read the benchmark between .NET and J2EE (and I do not want to discuss that), but the comparisons between the different J2EE servers were interesting.

    I think it would be a great thing if each Java app server vendor downloaded PetStore and made it perform on the test environment used for the .NET/J2EE comparison, using the same test suite; a rough sketch of what driving that could look like is below.
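
    As a rough illustration of "same test suite, different servers," the sketch below replays one fixed request mix against a list of deployments and reports one number per server. The server URLs and page names are made-up placeholders; a real comparison would also need the disclosed methodology (hardware, tuning, patches) that the IBM terms above call for:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Hypothetical sketch: drive the same workload against several
    // vendor-tuned PetStore deployments and report one number each.
    public class SameSuiteManyServers {
        static final String[] SERVERS = { // placeholder deployments
            "http://server-a:8080/petstore/",
            "http://server-b:8080/petstore/",
        };
        static final String[] PAGES = { "index.jsp", "catalog.jsp" }; // shared workload

        public static void main(String[] args) throws Exception {
            for (String server : SERVERS) {
                long total = 0;
                int requests = 0;
                for (int round = 0; round < 200; round++) {
                    for (String page : PAGES) {
                        long start = System.currentTimeMillis();
                        HttpURLConnection conn =
                                (HttpURLConnection) new URL(server + page).openConnection();
                        try (InputStream in = conn.getInputStream()) {
                            while (in.read() != -1) { /* drain the response */ }
                        }
                        total += System.currentTimeMillis() - start;
                        requests++;
                    }
                }
                System.out.printf("%s: avg %.1f ms over %d requests%n",
                        server, (double) total / requests, requests);
            }
        }
    }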
  7. This will apply to databases too

    Database vendors have long forbidden users from publishing benchmarks of their software. If this ruling is upheld, it will mean that databases can be openly compared as well. From a Bruce Momjian interview a year ago, I remember hearing that PostgreSQL was actually much faster than Oracle or SQLServer, though the authors themselves didn't believe at first that it could be that good.

    With the coming PostgreSQL 7.4 release that runs on Windows, a cheap J2EE stack is now complete: JBoss and PostgreSQL. If you don't want to run it on Linux, you can still run it on Windows.
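
    To make the database side concrete, here is a minimal sketch of timing a simple query over JDBC against PostgreSQL. The connection URL, credentials, table, and query are hypothetical, and a fair cross-database comparison would need identical schemas, data volumes, and tuning on every engine:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Hypothetical sketch: time 1000 runs of one query over JDBC.
    // Swapping the driver and URL would point it at another database.
    public class DbProbe {
        public static void main(String[] args) throws Exception {
            Class.forName("org.postgresql.Driver"); // PostgreSQL JDBC driver on the classpath
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/shop", "bench", "bench"); // placeholders
                 PreparedStatement ps = conn.prepareStatement(
                    "SELECT COUNT(*) FROM orders WHERE total > ?")) {
                ps.setInt(1, 100);
                long start = System.currentTimeMillis();
                for (int i = 0; i < 1000; i++) {
                    try (ResultSet rs = ps.executeQuery()) {
                        rs.next(); // consume the single-row result
                    }
                }
                System.out.println("1000 queries in "
                        + (System.currentTimeMillis() - start) + " ms");
            }
        }
    }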
  8. This will apply to databases too

    Ganesh is right, and this is a BIG issue. Oracle in particular has been very keen for years that nobody release performance numbers for Oracle on Windows, since in all likelihood SQLServer would blow it out of the water.

    It is as if you bought a car but were not allowed to tell anybody how many miles per hour it does!

    Cheers,
        Henrik
        TNGtech