Jim Waldo in his weblog attacks the common wisdom that the right approach to all problems is to use a standard, saying that it curtails innovation and rewards bad behavior. Jim states that committee-invented standards rarely survive, but that successful standards were first de facto standards that were later described (not invented) by the standards bodies.
Read Why standards?
Some might argue that J2EE is a set of standards, and yet it has fared pretty well, hasn't it?
That's a very interesting article. An immediate example I can think of is the JDO vs. Hibernate debate. Just because JDO is a "standard" doesn't mean it will necessarily solve your problems better.
An interesting article with a misleading "light-a-fuse" title. "Why Standards" suggests the article is an attack on using standards, whereas it is in fact an attack on standards leading innovation rather than following it, which is indeed a good point.
It has to be said that standards have a profound effect on an industry, namely that they pave the way for mass production. Take the humble screw: 150 years ago, when you bought something from producer X that was held together by screws, you locked yourself into producer X, because only he made the same screws. Who now would argue that the standardisation of screws has not been a success? Who now would care that there was a slightly superior type of screw that was not adopted as the standard and left behind?
To a large extent it just does not matter which standard is best, because having standards transforms an industry so completely: they commoditize that which makes up the standard.
As for standards being defined before a proper use for them has been found - I agree that this is largely a futile exercise. But this must also be tempered slightly. For example, the problem of bridging the Object-Relational gap has been around for a long while; JDO is a "new" standard leading innovation, as described in the article. But that does not make it a bad standard ipso facto, as the details of the problem are so well-known and understood.
This article should make it clear that standards cannot be defined for little-understood problems. I think Web Services would then follow as a much better example of standards preceding innovation and understanding.
My viewpoint is subtly different. I think the standards that succeed are the ones that have implementations that have been extensively tested in the field. These sorts of standards have real implementation experience to back up the theory.
Certainly this will apply to de facto standards, but it can apply elsewhere as well. The ANSI/ISO C committee was pretty political, but they did a decent job of ratifying mostly things that had been in use in real compilers for some time. Where things got more problematic was in dealing with brand new concepts that hadn't seen extensive support in the field (like wide characters, trigraphs and the like).
The worst standards are the ones that are based on talk, not real experience, like the OSI network stack. It seemed real pretty and "modular", but it just didn't work in real life, and TCP/IP with Ethernet ate its lunch.
Personally, I think we need standards, if only so we don't have to go re-writing code constantly and re-learning the same old stuff with different syntax/semantics. Standards give us a solid base to stand on, and common ground to communicate. But at the same time, people shouldn't be afraid to go past them when the situation dictates. No standard is perfect, and you may be dealing with a crappy standard, or be branching into a new area that a standard hasn't considered. That's how new "de facto" standards arise: a new but increasingly common need arises that isn't covered by a standard, a mess of implementations pop up to address it, and slowly over time a few stars pop out and become enshrined. When it works out well, one or two of those become truly standardized and available in many environments. When it goes sour, it never truly standardizes and the common de facto approach gets fragmented across many slightly incompatible implementations.
In summary - IMHO standards are good, because they rein in the chaos. But a competitive marketplace, and an intelligent way to gel new de facto solutions eventually into standards, keeps us from going stale. This is what the J2EE/Java market has going for it: it's rich with standards, but there's heavy competition and a lot of pushing the boundaries at the same time.
A good real-world case study of premature standardization is W3C XForms. I had a discussion back in April with the XForms community and spec leads on the www-forms at w3 dot org mailing list that you might want to check out @ http://lists.w3.org/Archives/Public/www-forms/2003Apr/thread.html
See, for example, the threads entitled "Welcome to the Real-World; The Future of XForms" and "The Devil of Good is Perfect".
Another good real-world case study of the "build it first and standardize later" approach is XUL (XML User Interface Language). Innovation using XML to build UIs is flourishing, and slowly a XUL community is emerging. For getting started with XUL, check out the Richmond Post Link-opida online @ http://luxor-xul.sourceforge.net/post/links.html
and see a replay of the original HTML story in the upcoming XUL browser wars and standardization drive, and more.
First, there are only degrees of standardization; there are no absolute standards.
The truth is, you can't achieve continual improvement and standardization at the same instant.
However, good design, good infrastructure, and good strategy all depend on making the trade-off between what parts can be "standard" and what needs to be "non-standard" or "innovative". So even though standards restrict "where" innovation may exist, the power they provide is well worth the associated inconvenience and effort.
Good article; it gets to the real issue that standards cannot come before innovation.
The way standards evolve is that someone solves a problem really well, others follow, and the next thing you know a standard is born.
Another good point is that using standards for no other reason than that they are standards is a poor strategy; there are lots of standards that go nowhere, left out to die.
I feel your pain. The key point is recognizing a weak versus a strong standard. A marketing, committee, or de facto standard can be weak or strong. Time tells a hero.
I did some work developing an XDoclet module that would generate SQL DDL from xjavadoc metadata. XDoclet tends to go the other way, generating web and EJB container source and descriptors. After trying to homogenize MySQL, PostgreSQL, MS SQL Server, and Oracle, and doing some homework on the ANSI SQL89, SQL92, and SQL99 specs, I realized how weak the ANSI spec was and how much power the vendors had. The XDoclet team would not have been able to make as much progress developing cross-vendor code generators had it not been for the very strong J2EE spec. The J2EE spec has done what CORBA advertised: allow vendors to compete on quality of service instead of embrace and extend.
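The vendor divergence described above is easy to see in something as basic as an auto-incrementing primary key, where the four databases mentioned each use a different dialect. A minimal sketch of the kind of per-vendor dispatch a cross-vendor DDL generator has to do (class and method names here are illustrative, not XDoclet's actual API):

```java
// Sketch of per-vendor DDL generation for one common case: an
// auto-incrementing integer primary-key column. Illustrative only,
// not XDoclet's real API.
public class IdColumnDdl {

    public enum Vendor { MYSQL, POSTGRESQL, SQLSERVER, ORACLE }

    /** Returns the DDL fragment for an auto-increment PK column named "id". */
    public static String idColumn(Vendor vendor) {
        switch (vendor) {
            case MYSQL:
                return "id INT AUTO_INCREMENT PRIMARY KEY";
            case POSTGRESQL:
                return "id SERIAL PRIMARY KEY";
            case SQLSERVER:
                return "id INT IDENTITY(1,1) PRIMARY KEY";
            case ORACLE:
                // Classic Oracle has no auto-increment column type at all;
                // you need a separate CREATE SEQUENCE plus a trigger (or
                // application code) to populate the column.
                return "id NUMBER(10) PRIMARY KEY /* plus sequence + trigger */";
            default:
                throw new IllegalArgumentException("unknown vendor: " + vendor);
        }
    }

    public static void main(String[] args) {
        for (Vendor v : Vendor.values()) {
            System.out.println(v + ": " + idColumn(v));
        }
    }
}
```

Four vendors, four incompatible spellings of the same concept; multiply that by types, constraints, and sequences and you see why the ANSI spec alone doesn't get a generator very far.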
To me the real culprit is marketing. Most great technologies don't have good marketing. Open source projects have religious followings, but it's hard to make a buck off them, and without marketing they lack the velocity to break into buzzword orbit. Technologies like Ant and JUnit are making inroads. Look at Apache, the Switzerland of the Java world. IBM and Sun entrusted them with standards (a.k.a. reference implementations) like Xerces and Tomcat, and both have flourished. Would the Mac be back had they not embraced standards?
I don't want to go back to the feudalism of the Unix lords nor pay tax to the Dark Lord of Redmond. So I am told, the others have only one standard - .Net.
It's always nice when someone else writes your thoughts. Especially when you felt alone with them :-)
I think the author has a good sense of what a lot of IT-professionals think about standards. He's able to show that not all standards provide huge benefits while some do.
In the business process management area, this problem shows up very clearly. None of the specifications like WfMC's XPDL, ebXML, BPMI's BPML, BPEL4WS, or Petri nets really took off. Although I'm happy with the JSR 207 initiative, I hope this will not be yet another workflow spec. I'm convinced this initiative is different because the approach is to cover the common denominator of all specs and tools. We'll do our best to turn this spec into a 'de facto' standard instead of a 'de jure' one :-)
Founder of http://jbpm.org
Independent member of JSR 207
While it is true that a successful prior implementation of a standard guarantees that the standard is useful and not just an intellectual curiosity, it doesn't necessarily mean that the standard itself will be successful.
The key thing that determines the success of a standard is if the standard becomes more important than the original implementation.
In the case of Unix, the X Window System, POSIX, J2EE, C, and C++, the reference implementations were (mostly) created before the standards, but once the standards were created, the reference implementations became one implementation among several and did not hold any more influence or importance than the subsequent ones.
In the case of .NET, the reference implementation was created before the standardization, yet the ECMA standard really isn't as important as the original Microsoft implementation. Look at the main ECMA .NET implementations today: Mono, Halcyon, and Portable.NET. While they all claim to implement ECMA .NET, the features they emphasize (ASP.NET, ADO.NET, and to a certain extent WinForms) have not been standardized. Features like templates have not been standardized yet either, but the Mono team, at least, is committed to implementing them when Microsoft implements them. They won't wait for an ECMA standard. If you browse through the Mono mailing list (and, I believe, the Portable.NET mailing list, although I haven't read it recently so it may have changed), the attitude is "if the ECMA standard and the Microsoft implementation disagree on something, implement it the Microsoft way".
Essentially, these sorts of standards are mostly meaningless, so why bother standardizing?
Just so you don't think I'm picking on .NET: you'd get the same sort of situation if you tried to standardize almost any quickly changing reference implementation, like PHP.
Unless the reference implementation is committed to staying close to the standards, and the standards are produced quickly enough that any popular new innovation gets standardized, the standard becomes less important than a specific implementation and the whole thing becomes little more than window dressing.
Relating all this to Java, one can ask: is the JCP standardization process agile enough, and does it have the mindshare and reference-implementation commitment to stay relevant? For the most part, yes. Even open source extensions (which tend to be quite innovative) like Struts, XDoclet, and Velocity provide integration with JCP standards like JSP, and the Struts and XDoclet teams appear ready to adapt their implementations to future JCP standards like JavaServer Faces and Java metadata.
But in some sense, the process is too heavy. Take J2EE, for instance. It has a lot of good technologies, but because the specification is mostly released as one monolithic chunk, it takes a long time to standardize. Long standardization times increase the chances that implementations create their own de facto standard that may not be very vendor neutral. .NET's rapid evolution outside the ECMA standards may increase the chances of this happening.
If each key technology were standardized independently, the quickly innovated parts could be standardized faster. You'd still need an umbrella J2EE spec to make sense of it all, but chances are that because each sub-spec already has field implementations, it'll continue to be the only relevant standard.