
High Drama

January 19, 2000

Edd Dumbill


Some problems just won't go away. Last week we saw XML namespaces causing trouble again. This week, another all-too-familiar debate resurfaced on XML-DEV: "How The W3C Should Be Run." Sit back and let the drama unfold....

Act 1: Gone Fishing

In retrospect, perhaps one ought to have been slightly concerned when Simon St. Laurent started a thread asking for XML's "great controversies." St. Laurent presented his hotlist, and asked for more:

So far, I've got:
1. Namespaces
2. Whitespace, Ignorable or Otherwise
3. External Entities
4. External Resource Support in General
5. Choosing Elements vs. Attributes
6. Internal Subsets

I know there are more that we address at least every three months, but I'm forgetting.

As it turned out, he didn't have to wait long. Some good suggestions were forthcoming from battle-scarred XML veterans.

Andrew Layman had this to offer:

Maybe Namespaces should occupy the top five slots. :-)

David Megginson added more to the list, and commented:

Come to think of it, I remember most of these from comp.text.sgml back in 1993 and 1994. We're not very original, are we?

Had it not been for other matters waiting in the wings, folks might have continued with this thread much longer. But a bigger and better argument was brewing.

Act 2: The Empire Strikes Back

In November last year we published a review by David Brownell of Microsoft's MSXML.DLL parser, testing its conformance against the NIST/OASIS XML 1.0 test suite. In the last week Microsoft posted a response to this article, aimed at Microsoft customers concerned about issues raised in Brownell's review. The Microsoft article concludes:

The point with all this is that in the "real world" where you do not assemble contrived test cases, we think you will see much better than 87 or 88 percent of XML files exchanged between Msxml.dll and other parsers actually achieve interoperability. Nonetheless, Brownell's work is important, and 100 percent conformance is a great goal.
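Brownell's review measured conformance by running the parser against the NIST/OASIS XML 1.0 test suite and tallying the cases where the parser's verdict matched the suite's. The shape of that measurement can be sketched with Python's standard xml.sax parser and a toy three-document "suite" (the documents below are illustrative, not taken from the actual NIST/OASIS cases):

```python
import io
import xml.sax

def accepts(doc: bytes) -> bool:
    """True if the parser accepts the document without a fatal error."""
    try:
        xml.sax.parse(io.BytesIO(doc), xml.sax.ContentHandler())
        return True
    except xml.sax.SAXException:
        return False

# A toy suite: each document paired with whether XML 1.0 says it is well-formed.
suite = [
    (b"<root><child/></root>", True),      # well-formed
    (b"<root><child></root>", False),      # mismatched tags
    (b"<root>&undefined;</root>", False),  # undeclared entity reference
]

# Conformance here = fraction of cases where the parser agrees with the spec.
agree = sum(accepts(doc) == expected for doc, expected in suite)
print(f"conformance: {agree}/{len(suite)}")
```

A real suite, of course, contains hundreds of cases, including the deliberately obscure ones Microsoft calls "contrived" — which is precisely where the missing 12 or 13 percent comes from.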

Jun Fujisawa forwarded the URL of the article and asked members of the XML-DEV list what they thought of Microsoft's explanations.

The news that Microsoft thinks XML 1.0 conformance is a "great goal" is welcome, but is "much better than 87 or 88 percent" a level of interoperability developers can feel comfortable with? Steve Newcomb thought not:

(Religion alert! Below is a RANT from a person who believes Precise Communication Is A Good and Sacred Thing upon which the Lives and Livelihoods of All Civilized Human Beings Depend.)

OK, we were warned.... Newcomb goes on to discuss (in some detail) precisely what is entailed in interoperability, and to prophesy that the term "interoperable" will be rendered meaningless in the not-too-distant future:

Software vendors basically don't like information to be interoperable, even though it serves the best interests of their customers. One way to attempt to stamp out the whole concept of interoperability is to undermine the meaning of the word "interoperable," so that the concept will afterwards not even have a name. Even without a usable name, though, the concept of real interoperability is not going to go away. No amount of denial or FUD can change the basic business requirements.

Taking a less religious approach was David Brownell, the author of the original conformance article. In one of his responses in this thread, he carefully analyzes the MSXML.DLL parser errors that Microsoft attributes to "hotly contested differences of opinion about how to interpret the XML specification." He writes:

[Microsoft] Is <![CDATA[ ]]> the same as ignorable whitespace? We say no.

In this area, as in some others, the XML specification errata need to get updated. This is the test that I called attention to in the review. (It came from some XML-DEV discussions, where Microsoft was silent.)
In the absence of W3C errata ruling out this handling, I can't see a compelling reason for the NIST/OASIS suite to change.
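The disputed case is easy to reproduce. A non-validating parser, such as Python's expat-based xml.sax, reports whitespace inside a CDATA section through the characters() callback and never calls ignorableWhitespace(); the question erratum E28 settles is what a *validating* parser should do when that CDATA section sits in element-only content. A sketch (the element names and the DTD are invented for illustration):

```python
import io
import xml.sax

class Events(xml.sax.ContentHandler):
    """Record which SAX callback each run of character data arrives on."""
    def __init__(self):
        self.via_characters = []
        self.via_ignorable = []
    def characters(self, data):
        self.via_characters.append(data)
    def ignorableWhitespace(self, data):
        self.via_ignorable.append(data)

# Element-only content model: literal whitespace between the child elements
# would be "ignorable" -- but what about whitespace wrapped in CDATA?
doc = (b"<!DOCTYPE root [\n"
       b"<!ELEMENT root (child)>\n"
       b"<!ELEMENT child EMPTY>\n"
       b"]>\n"
       b"<root><![CDATA[  ]]><child/></root>")

handler = Events()
xml.sax.parse(io.BytesIO(doc), handler)
print("characters:", handler.via_characters)
print("ignorable: ", handler.via_ignorable)
```

A non-validating parser happily passes the two spaces through characters(); the NIST/OASIS test expected validating parsers to treat them differently, and that is the reading the updated errata overturned.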

Alas, he may have spoken too soon. Enter James Clark:

Microsoft is right on this one. See

http://www.w3.org/XML/xml-19980210-errata#E28

I have to say I think it's inappropriate for the NIST/OASIS test suite to include controversial cases where there are legitimate differences of opinion on the correct interpretation of the spec that are not resolved by the errata.

This was certainly news to Brownell, who went off and checked the errata:

Hmm, errata updated (previous 2/17/1999) ... YESTERDAY!!

In the limited space available within the XML-Deviant column, it is hard to do everyone justice, but it is fair to say that the updated errata were welcomed by all present. Yet one question hung in the air: why had the update taken over ten months?

A terse explanation was provided by John Cowan:

The former mechanism for publishing those errata went *queep* and died, and has only recently been reconstituted.

Queep? A charming explanation, but this is the ground-breaking technology that is revolutionizing the world of web publishing and electronic business, right? Tim Bray attributed the delay to what amounts to the biggest challenge facing the W3C this year: lack of resources:

...it's a symptom of the W3C’s #1 problem, lack of resources. I've been busy helping get XLink done. CMSMCQ's been busy helping get schema done. JeanPa's been busy getting MS Office to do something sensible with XML, a little bird tells me. It was tough to lay any of these tasks aside to put in the errata work.

Act 3: Anything You Can Do...

The scene is now set for the climax of this week's drama. There are few things more enticing to the XML-DEV regular than a discussion about how the W3C manages its affairs. Lee Anne Phillips opened this "instance" (as David Brownell has it) of the W3C discussion:

With all respect, I think the lack of resources are the fault of the W3C membership policies, which seem designed to strongly discourage individuals and small organizations and businesses from participating in the process....

...While we all appreciate the enormous efforts of the organizational participants in the W3C process, who've done yeoman service trying to juggle activities which might directly advance their careers at their organizational home with the community responsibilities of the standards process, there just might be a better and more open way...

...In one way or another, we're the ones who pay for all this work. Surely a way could be found to ensure that we know what the heck is going on. Even better, we could help in the initial stages rather than waiting in front of the curtain until a team of magicians come out and present us with whatever they think we want and are then either cheered or booed off the stage.

So why does this complaint arise so frequently in the XML community? In the XML world, unlike some other areas of web technology addressed by the W3C, there is a strong contingent of independent, often open-source, developers who are pushing the technology—and standards—forward. They represent a strong body of expertise that has plenty to offer, but mostly they aren't the people put forward by vendors to participate in W3C processes, unless they get to be "invited experts."

The remainder of the debate is perhaps best experienced by reading the thread yourself; to re-run it in these pages would be to create a duplicate archive. Suffice it to say that it yielded some pearls that really ought to sit somewhere in a "W3C vs. XML-DEV" FAQ.

After The Show Was Over

Got all that? Good. My commiserations if you were one of those trying to ask a non-political question on XML-DEV last week. These things happen in cycles. This wasn't the first time that the W3C's way of doing things has upset people, and it won't be the last. (You can usually detect the aftermath by the volume of "unsubscribe" messages posted to the list!)

The last word this week belongs to David Megginson, who had this to say to those wondering whether they ought to form an "Alternative W3C":

It might be a lot more useful to start by getting an informal group of 2-4 vendors or developers together, publishing a simple, open spec, and providing interoperable software that implements it. If the world needs it, it will jump at it, and then you can hand over the spec to a standards body for formalization; otherwise, the world will simply yawn.

Sound advice.