Discussions

News: XML.com article: A retrospective of XML (so far)

  1. XML.com article: A retrospective of XML (so far) (10 messages)

    In "The More Things Change" on XML.com, Micah Dubinko offers a retrospective of XML and discusses some of the enduring topics of debate in the XML-developer community.

    The conclusion of the article (and the series of articles of which it is a part) is interesting, and yields the basis for the title of the article:
    Are we going in circles? Or just covering the same ground and hashing over the same tired arguments? The answer is, of course, yes, though in a good way. Get a few geeks in the same room (or often, just two) and you've got all the fixin's for a lively debate. Despite what seems at times like bickering, xml-dev provides a useful forum that has helped to shape the opinions of many readers. Thoughtful commentary regularly shows up from those in the best position to render a valuable opinion.

    The list also provides a useful consensus monitor. Is xml:id a good thing? Is binary XML a good thing? What's wrong with Namespaces anyway? When a rough consensus exists, it surfaces eventually. When no consensus exists -- when multiple outlooks are both possible and strongly defensible -- well, that's where the permathreads come from.

    In any event, the mailing list continues to be a useful resource with no signs of slowing down.

    XML is very common in Java enterprise programming, of course - mandated by the specs for deployment descriptors and used nearly everywhere for transferring data between systems. The vast number of frameworks and APIs, however, suggests that while XML may be common, its usage is not commonly agreed upon (with the exception of SOAP, perhaps, though even SOAP has justifiable detractors).
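
    As a concrete illustration of the "mandated for deployment" point, here is roughly what a J2EE web deployment descriptor looks like (a minimal, hypothetical web.xml sketch; the servlet name and class are invented for illustration):

      <!-- Minimal J2EE 1.4 deployment descriptor; the XML format itself
           is mandated by the servlet specification. -->
      <web-app xmlns="http://java.sun.com/xml/ns/j2ee" version="2.4">
        <servlet>
          <servlet-name>OrderServlet</servlet-name>
          <servlet-class>com.example.OrderServlet</servlet-class>
        </servlet>
        <servlet-mapping>
          <servlet-name>OrderServlet</servlet-name>
          <url-pattern>/orders/*</url-pattern>
        </servlet-mapping>
      </web-app>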

    What do you think? The article alludes to some common practices being accepted (and some standards bodies - notably, the W3C - being marginalized in the process). Do you think that's happening quickly enough? What issues do you see?

    Threaded Messages (10)

  2. map of XML maze?

    Does somebody have a link to a map of the XML stack of ‘standards’, similar to the Java maze map at http://java.sun.com/developer/onlineTraining/new2java/javamap/intro.html ?
  3. Re: map of XML maze?

    http://kensall.com/big-picture/bigpix22.html
  4. Re: map of XML maze?

    http://kensall.com/big-picture/bigpix22.html
    Yes, that is it!

    I would love to have it as a big banner and wave it every time somebody says that XML is ‘simple’ :)
  5. Re: map of XML maze?

    http://kensall.com/big-picture/bigpix22.html
    Yes, that is it! I would love to have it as a big banner and wave it every time somebody says that XML is ‘simple’ :)

    This is an irrelevant argument. XML is simple. The maze simply illustrates the many specifications that have been built on top of XML. No one would seriously suggest that, just because the RSS specification exists, XML itself is any more or less complex.
  6. In my opinion, the major mistake with respect to XML is that it is being used to solve just about any problem. Everything must be encoded in XML these days, even RPCs and business data, even though XML is merely a presentation language. That is how something like XML Schema could come about: it introduces a typing system that is not very useful for document mark-up, the only realistic use case of XML.

    The worst kind of misuse is RPC. Using XML for that guarantees the highest possible amount of overhead; the introduction of binary XML is just a way of treating the symptoms. Why would the industry want to standardise on a distributed computing environment, based on XML, in which things such as distributed transactions, security, interface inspection, etc. are completely reinvented? Clearly, the industry has had better opportunities in the past to standardise on distributed computing. In fact, they don't really want that: for them, the ideal scenario is to work on a new standard for ten years, fail, and then start over and do business with the same story again. I therefore think XML as a solution for distributed computing is a dead end.
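
    To make the overhead concrete, here is roughly what a trivial remote call looks like on the wire as a SOAP 1.1 RPC request (an illustrative sketch; the calc namespace and the add operation are invented):

      <!-- Adding two integers costs a few hundred bytes of envelope,
           versus 8 bytes in a simple binary encoding. -->
      <soap:Envelope
          xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
        <soap:Body>
          <m:add xmlns:m="http://example.com/calc">
            <m:a>2</m:a>
            <m:b>3</m:b>
          </m:add>
        </soap:Body>
      </soap:Envelope>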

    An interesting innovation in XML Schema, compared to DTDs, is that a content model may be specified in such a way that elements from foreign namespaces are allowed. Those elements may have their own associated XML Schema. This provides for the modularity of XML vocabularies, which seems to be the direction the W3C is taking. There is indeed no reason for every XML vocabulary to define its own way of including another entity, expressing hyperlinks, digitally signing or encrypting XML fragments, and so on.
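
    For instance, such a content model could look like the following (a minimal sketch, assuming a hypothetical note vocabulary):

      <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
                 targetNamespace="http://example.com/note"
                 elementFormDefault="qualified">
        <xs:element name="note">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="body" type="xs:string"/>
              <!-- ##other admits elements from any foreign namespace;
                   lax means: validate them if their schema is available -->
              <xs:any namespace="##other" processContents="lax"
                      minOccurs="0" maxOccurs="unbounded"/>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:schema>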

    Namespaces are essential to achieving that modularity. They enable multiperspective mark-up in a simple way, reminiscent of the SGML CONCUR feature. The only problem, however, is that the content model of an element must explicitly allow other namespaces. The default should have been the other way around; as it stands, modularity in XML is a co-operative concept.
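
    An instance document can then host vocabularies its own schema knows nothing about, for example a digital signature (a hypothetical instance; only the XML Signature namespace is real):

      <note xmlns="http://example.com/note">
        <body>Quarterly figures attached.</body>
        <!-- Defined by the XML Signature vocabulary, not by note's schema -->
        <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
          <!-- signature content as defined by the XML Signature spec -->
        </ds:Signature>
      </note>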
  7. Conspiracy theories are popular these days, therefore…
    It just makes sense for the IT industry to do things inefficiently, because it can sell more hardware, services, software, bandwidth, etc.

    It also makes sense for many corporate developers (consumers of IT-industry cr-p) because it keeps them busy and gives them somebody to blame. It means that they have things to do and get a paycheck. Doing things efficiently means eliminating jobs.

    It does not make sense for some poor souls in pursuit of efficiency, increased productivity, etc. They should be identified and …
  8. Conspiracy theories are popular these days, therefore… It just makes sense for the IT industry to do things inefficiently....

    I would argue that the alternative practice... having a range of arbitrary, incompatible text and binary formats... is far, far more inefficient. Every now and then I seem to spend a considerable amount of my time decoding legacy formats and writing conversion and porting software. As an example, I used to work in chemistry, and the variety of file formats that turned up, even though they were supposed to be compatible, was amazing (PDB is a good example).

    Almost all of this time could have been saved if a standardised, OS-independent, human-readable text format had been available: one that supports validation and allows the format to be extended without breaking legacy applications. That format is XML.
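
    To illustrate the extensibility point, consider a hypothetical chemistry record (element names invented): a legacy reader that only knows <id> and <name> can simply skip the element added later, instead of choking on an unexpected field the way many fixed-column formats do:

      <compound xmlns="http://example.com/chem">
        <id>CHEM-0042</id>
        <name>caffeine</name>
        <!-- added in a later revision; older readers can ignore it -->
        <meltingPoint unit="C">235</meltingPoint>
      </compound>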

    What is wrong with using increased bandwidth and power to make life easier?
  9. I agree that having some standard is much better than having incompatibility. The point is that an elaborate standard for distributed computing already existed, but it has been all but abandoned.

    To me it is not necessary that the messages exchanged between systems be human-readable, so long as it is possible to obtain a textual representation of them, for example during development or for monitoring purposes.

    Relying on the extensibility of the messages themselves is not general enough to cope with evolution. You can relax cardinality and you can add elements in certain places, but at some point a system will have to support several interfaces concurrently anyway. I therefore believe this should be the strategy from day one.
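
    One common way to do this is to put the version in the namespace URI, so a receiver can dispatch on the namespace and handle both message versions side by side (a sketch; the order vocabulary and URIs are invented):

      <!-- Version 1 message -->
      <order xmlns="http://example.com/orders/v1">
        <item sku="A-100" qty="2"/>
      </order>

      <!-- Version 2 message, handled by a separate code path -->
      <order xmlns="http://example.com/orders/v2">
        <line sku="A-100" quantity="2" currency="EUR"/>
      </order>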
  10. Whether there is a conspiracy behind it or not, I don't know. I can imagine some companies make such agreements; things like that happen. IT companies will in any case not standardise more than the market expects of them, because the margins on standardised material are much smaller. Standardised software infrastructure quickly becomes a commodity. It is in the proprietary area that the real profits are made.

    As such, there is nothing wrong with this. What is not ethical, however, is that the customer is forced each time to move to yet another technology that solves the same problems. So the customer pays, but he doesn't get more functionality or an increase in the level of abstraction. In my view, Web services and related technologies are such a case.
  11. What is not ethical, however, is that the customer is forced each time to move to yet another technology that solves the same problems. So the customer pays, but he doesn't get more functionality or an increase in the level of abstraction. In my view, Web services and related technologies are such a case.

    Well... if the customers are not forced to pay IT to re-invent the wheel, what are their alternatives?

    They will simply fire most of the IT staff.

    OK, they also have the choice of outsourcing the work, but at least that's still a better alternative than eliminating IT departments/contractors altogether.

    So we must stay in the infinite loop of re-inventing ever more complex stacks of technologies. That way, we are relieved of the pressure of creating another hottest revolutionary "thing" every other month.

    But don't feel guilty. The customers usually do the same to their customers anyway.