Test Frameworks for W3C Technologies
December 11, 2002
Writing specifications is great. Implementing them is difficult. Specifications, at least those from the W3C, aim at interoperability and are written with wide acceptance in mind. However, they tend to be difficult to implement. Look at the various implementations of the original HTML specification. Consider how developers often blur the distinction between de facto standards and specifications, especially when they realize that implementation X does not behave as specification Y says it should. There seem to be very good reasons for making it easier to write conformant implementations, which in turn means that testing them must be made easier, too.
The current situation
The big question is why it's so difficult to implement some standards. Political aspects aside, specifications are often very complex documents (look at the current W3C Schema specification), to some degree unintelligible, and difficult to write conformance tests for. Further, realistic use cases don't always exist for use when developing an implementation. Even worse, some implementations are written for a number of specifications. Finally, there seems to be no way to enforce conformance of implementations, which in turn means that conformance is not an issue, which explains a lot about the existence of de facto standards and ad hoc solutions.
YAWG (Yet another Working Group)
The group creating the specifications should be sensitive to this situation. In order to present tools to solve these problems, the W3C has formed the Quality Assurance Working Group. Its goals, among others, are
- To help the W3C in furthering its aim to lead the web to its full potential, paying special respect to its end goals.
- To support Working Groups in writing clear, testable specifications of high quality.
- To support production of Test Suites that are used to check conformance to specifications.
- To describe the process Working Groups should use when writing specifications and producing test suites.
Reading through this list, I realize that it sounds like boring, bureaucratic, and method-centric work. That is, however, the price that has to be paid to ensure that the goal is met. If the goal is to create a number of unified frameworks to simplify testing of all W3C specifications, and to allow for an easier way to write conformant implementations, then the effort will be proportionally heavy. For a snapshot of existing documents (currently seven), see the list of QA WG framework documents.
A specification need not follow any particular method when being authored or published in order to be successfully understood or implemented. It has been shown, however, that specifications gain from being written in a structured way and adhering to a set of rules. Some of these rules affect style only; others impose the presence of specific information (authors, disclaimers, conformance sections and the like). The conformance sections are particularly interesting since they contain the rules or norms necessary to validate the various conformance claims made by implementors. Conformance sections need to be streamlined, though, especially since the nature of conformance is independent of technical detail.
More interesting effects can be reached using a more "intelligent" authoring mechanism. Fairly simple rules encoded in a DTD can be used so that authors write specifications in a more uniform manner. This DTD could be extended to include markup for test assertions that individual tests could then point back to. It would simplify finding what is being tested (which in turn simplifies interpretation) and also allow for test frameworks to be produced where all features of a specification are exposed, which is vital for advanced specifications. Finally, an interface to the content can be written which can be used to test the specification itself for completeness and whether it exposes the things it should.
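The payoff of assertion markup can be sketched concretely. Assuming a hypothetical `<assert id="...">` element in the spec source and an equally hypothetical test manifest that points back at those ids (neither is part of the actual XML Spec DTD), a small script can report which assertions no test covers:

```python
# Sketch: checking test coverage of assertions embedded in a spec's XML
# source. The <assert id="..."> element and the manifest format are
# invented for illustration, not taken from any real W3C DTD.
import xml.etree.ElementTree as ET

SPEC = """<spec>
  <p>An element <assert id="a1">MUST have a unique id.</assert></p>
  <p><assert id="a2">Whitespace MUST be preserved in attributes.</assert></p>
</spec>"""

MANIFEST = """<tests>
  <test name="unique-id" asserts="a1"/>
</tests>"""

def uncovered_assertions(spec_xml, manifest_xml):
    """Return assertion ids that no test in the manifest points back to."""
    asserted = {a.get("id") for a in ET.fromstring(spec_xml).iter("assert")}
    covered = {t.get("asserts") for t in ET.fromstring(manifest_xml).iter("test")}
    return sorted(asserted - covered)

print(uncovered_assertions(SPEC, MANIFEST))  # -> ['a2']
```

The same index of assertions could drive the completeness check mentioned above: a spec section with no assertions, or an assertion with no test, both become mechanically visible.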
Within the W3C there is a DTD widely in use for writing specifications, the XML Spec (discussions of which take place on a public mailing list). It's primarily used to simplify maintenance of specs. It is not uniform or mandatory, however, so a Working Group can decide for itself how to produce its specification. And it's anything but self-describing, as it's primarily intended for producing HTML documents.
Opinions vary as to whether we should adopt a different framework. I think we should. Others claim that the Working Groups carry too heavy a burden in producing the specifications; adding to this would make their lives more difficult. A proper survey should be made on how Working Groups actually behave now, and how they would like to work in the future. The QA WG is thinking along those lines and results will be reported as soon as they emerge.
Tests and test frameworks
Among W3C Working Groups there is no official understanding of what a test or a test framework really is. Some Working Groups have produced a series of tests that they use to show that their implementations are conformant. Others have created public projects in which the test suites are produced. Since it is at each Working Group's discretion to decide how to produce a test suite and how advanced to make it, these test suites are not easy for an implementor to use, as each one has its own particular setup. In addition, some implementors have reported that they cannot use test suites that have been released, for reasons of policy or license. The goal should be to allow as many people as possible to test implementations to assure they are conformant. This should be a basic goal for the W3C and, in particular, for its members. When your client asks how much of your web-services oriented framework follows existing standards, you should want to, and be able to, tell the truth.
There's much to be gained from using uniform test frameworks. During development, it's easier to track how well your implementation is doing; a uniform test framework means uniform reporting, issue tracking, and the like. It may well lead to shorter development time, but the greatest advantage is that you will know that your weekend hack or multimillion dollar implementation is conformant, especially if the body that has produced it has gone through the process of specifying the conditions under which this is indeed the case.
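What "uniform reporting" buys you can be shown in a few lines. In this sketch every conformance test is just a named callable returning true or false, and the report has the same shape no matter which specification the tests target; all the names here are illustrative, not from any real W3C suite:

```python
# Minimal uniform conformance harness: tests are named callables, and
# the report format is identical across specifications.
def run_suite(spec_name, tests):
    """Run all tests, print a uniform report, return (passed, total)."""
    passed = 0
    for name, test in sorted(tests.items()):
        ok = test()
        passed += ok
        print(f"[{spec_name}] {name}: {'PASS' if ok else 'FAIL'}")
    print(f"[{spec_name}] {passed}/{len(tests)} checks passed")
    return passed, len(tests)

# Two trivial checks against a hypothetical implementation's behaviour.
impl = {"preserves-whitespace": True, "case-sensitive-names": False}
results = run_suite("hypothetical-spec", {
    "preserves-whitespace": lambda: impl["preserves-whitespace"],
    "case-sensitive-names": lambda: impl["case-sensitive-names"],
})
```

Because the report shape never varies, results from a weekend hack and from a commercial product can be compared line by line.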
There's been a discussion about forming a devoted Test Group within the W3C. Its outreach would be similar to what the W3C TAG is doing in terms of generality, but limited to test issues. Hopefully, I will be able to report back soon, as it is going to be discussed in the near future.
Everyone involved wants to write conformant implementations, for technical or marketing reasons. In order for conformance to mean anything, it needs to be defined. Bear in mind that conformance and certification are not the same thing: "certification" seems to be what marketing people hear when technical people say "conformance". It would be much easier to allocate resources if a case could be made that the extra effort of testing and, thus, conformance is worthwhile. Without making conformance meaningful or trying to enforce it, testing will be seen as an added bonus and will not be as central as it needs to be. Again, end users and consumers are on the losing end, since they won't be able to make sure the implementation they choose is conformant.
Some political issues
Describing a technical solution that claims to solve all problems is only half the truth; in most cases there are also political (or, at least, non-technical) issues involved. In relation to the topics of this article, these are
- Specification format normativity: the XML version of any specification could be used for far more than producing the published version of the specification. I've mentioned that it could serve as a container of test assertions. Achieving this stage of specification authoring maturity is not that easy, though, especially since there is a debate over whether it is the XML or the HTML version of any specification that is to be considered the normative one.
- Test authoring: Making mandatory the production of test suites for all W3C technologies is imperative. Producing the tests is not a trivial task. You need to be very careful, have lots of time, and some motivation for doing it well. It isn't clear that the companies writing the specifications have thought about the need to produce tests. In a situation where they are told they have to, process issues arise and the inevitable losers are consumers, since products that claim conformance have not been thoroughly tested.
- Specification interpretation: I've spent hours and hours reading and coming to grips with specifications. There seems to be no other tool to use than agreement for reaching a definitive conclusion. Conceivably, this could be simplified using a more structured approach to specification authoring, so that people don't spend time arguing over documents they have themselves written.
- Resources: Writing a specification is a tedious and resource-draining task. You have to allocate time, pay for travel and make weekly telephone calls. It's not evident that companies having invested in this will want to invest similar sums in making sure the thing works.
- Conservatism: Companies don't want to change their processes, especially if they've paid large sums of money to develop them. Nor do they want to work in different ways because someone else tells them to. So it's up to users to tell the bodies that specify technologies what they want in technologies.
- Might is right: There's a reason for things not moving forward: eternal battles between big actors. When this gets in the way of producing usable specifications and, in effect, software, it's a dangerous situation. Without any means of control the situation would be static; given some tools like the ones I propose (use cases, test frameworks and so on), it becomes much easier to push specifications in certain directions. After all, it's up to developers, managers and interested people in general to break the status quo.
Having coordinated the public framework around the DOM Test Suites, I've made sure that some of the ideas -- test automation, reference to the specifications, language neutral test descriptions, uniform test running environments, test schemata -- have been applied. This project is carried out in the public domain and all we lack is resources. It is a good example of how a test framework furthers the original motivations for writing an API, particularly interoperability.
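The "language-neutral test descriptions" idea deserves a concrete illustration. The DOM Test Suites express tests in an XML vocabulary that is transformed into Java and ECMAScript; the step vocabulary below is invented for illustration only, interpreted here by a small Python runner against Python's own DOM implementation:

```python
# Sketch of a language-neutral test: the test is pure data (a list of
# steps), and a per-language runner interprets it. The step names are
# hypothetical, not the real DOM Test Suites markup.
import xml.dom.minidom

TEST = [
    ("parse", "<doc><child/></doc>"),
    ("getElementsByTagName", "child"),
    ("assertLength", 1),
]

def run(test):
    """Interpret one language-neutral test against Python's minidom."""
    ctx = None
    for op, arg in test:
        if op == "parse":
            ctx = xml.dom.minidom.parseString(arg)
        elif op == "getElementsByTagName":
            ctx = ctx.getElementsByTagName(arg)
        elif op == "assertLength":
            assert ctx.length == arg, f"expected {arg}, got {ctx.length}"
    return True

print(run(TEST))  # -> True
```

Writing a second runner, say in Java against a different DOM implementation, would exercise exactly the same test data, which is what makes such descriptions useful for checking interoperability.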
There is a need to test conformance to W3C specifications. The industry and the public would benefit from having access to advanced test frameworks. I think developers would spend time helping to write test cases for particular specifications because that would mean exercising control in a constructive manner. With this sort of testing infrastructure in place, building applications around standards would take on a different, most positive and useful meaning.