News Standards: A Rising Tide of Commoditization
May 5, 2004
The News Standards Summit took place late last year (December, 2003) in Philadelphia. It examined in some detail a number of standards in the news industry. It led to an open forum on, as described in the agenda, "how to achieve hassle free news exchange". The goal of the meeting was, according to the summit web site, "to increase understanding and drive practical, productive convergence". The view, apparently, is that there are too many standards for representing news, and that this presents an obstacle to the aforementioned hassle-free exchange.
So what is hassle-free exchange, and how would you know if you had it? More importantly, perhaps, who would be the beneficiaries of such a reduction in hassle, and who would be the losers? In the complex world of news distribution there are a lot of models for how content gets from the original creator to a consumer -- the sometimes intricate delivery chain takes in aggregators, syndicators, archives and more. To examine who is causing hassle to whom, let's take a couple of examples: the aggregator who collects news from lots of different sources and the syndicator who sends news to lots of different destinations.
In his keynote presentation, Chet Ensign, the Director of Architecture and Development Services of Lexis Nexis, quoted a budgetary figure of 6-8 weeks to add a new source to the Lexis Nexis aggregation system. This is clearly a considerable hassle, one he believes news standards can ameliorate. Mr. Ensign was not explicit about the causes of this lengthy integration cycle, but from what he said, it is not so much that the existing news standards overlap; it is more some combination of the following:
- suppliers of news do not conform to the available standards;
- suppliers use the standards, but incorrectly or with a wide variation in style;
- the standards do not significantly simplify the job of implementing a new incoming feed.
There is obviously a significant difference between these alternatives, each with quite different implications for the standardization process.
Several speakers expressed frustration at their lack of success in getting customers to upgrade from legacy formats. The customers, they say, are happy receiving the data in the format in which it arrives today and have no interest in receiving it in any other, believing that such an "upgrade" would be all cost and no benefit.
There are quite a few reasons why maintaining legacy delivery formats could represent an undue cost and burden for the syndicator. However, upgrading the output streams to a common format would also involve cost, which would have to be recovered either through reduced operational overheads or through increased revenue from the target customers. Since the target customers say they are happy with the way things are, one has to question how likely it is that increased revenues could be achieved by causing customers hassle (i.e. cost) through a change in the delivery format.
What seems to be missing here is any communication to customers of the benefits that a new standard is supposed to bring. If new formats bring benefits to customers, then surely customers can be sold those benefits? If they do not, then what indeed is the point of investing in a new standard that does no more than the old one?
It is also worth asking why, beyond reducing costs, it would be to the benefit of either aggregators or syndicators to promote standardization. One could argue that the very reason such intermediaries exist is that they are supposed to add value by interposing between the source and the destination of news. One important piece of value that they contribute is format normalization.
Historically, part of the value of aggregators was that they provided network connectivity, dial-in modems and so on, to support on-line access to content. Hence it was said that the arrival of the Internet heralded the demise of the aggregator, since the Internet allowed sources of news and their potential customers to reach each other directly, without using aggregators' network services.
In fact, the announcement of the death of the aggregator was premature; although providing network connectivity to content was an important part of their value proposition it turned out that raw access to content, while necessary, is by no means sufficient for most potential customers. Among other things such customers need common access formats across all content sources; something that organizations like Lexis Nexis and other aggregators like Factiva do consummately well is collect news from a lot of disparate sources and make it retrievable with a common interface. To do this remains a challenge beyond the scope of most organizations, hence represents a significant value contribution.
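Format normalization of the kind aggregators perform can be pictured, in miniature, as mapping each supplier's idiosyncratic field names onto one common record shape. The sketch below is purely illustrative; the supplier names, field names, and schema are hypothetical, not drawn from any real aggregator's system.

```python
# Two hypothetical suppliers delivering the same story in different shapes.
SOURCE_A = {"headline": "Summit convenes", "body": "story text", "pubdate": "2003-12-01"}
SOURCE_B = {"title": "Summit convenes", "text": "story text", "date": "2003-12-01"}

# Per-supplier field maps: supplier's field name -> common schema field name.
FIELD_MAPS = {
    "supplier_a": {"headline": "title", "body": "text", "pubdate": "date"},
    "supplier_b": {"title": "title", "text": "text", "date": "date"},
}

def normalize(record, supplier):
    """Rename a supplier's fields to the aggregator's common schema."""
    mapping = FIELD_MAPS[supplier]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# Both suppliers' records collapse to the same normalized form.
common = normalize(SOURCE_A, "supplier_a")
```

The real difficulty, of course, lies not in the renaming but in maintaining one such mapping per source, which is part of why adding a new feed can take weeks rather than days.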
But now replace the difficulty of handling multiple feeds with feeds in a uniform format, and replace the multitude of formats customers want to receive their data in with the same single format, and what have you got? Well, disintermediation, perhaps. But haven't we heard all this before? This time, the claim is that rather than the Internet, it is news standards that herald the demise of the aggregator. Perhaps not, but the rising tide of commoditization that has demeaned the premium value of simply providing access to content will also demean the value of format normalization. Those whose value contribution depends on it will have to climb one rung higher on the value ladder to avoid their feet getting wet.
So where have we got to? The same standards that reduce the hassle for aggregators in handling their input feeds actually reduce the value of those aggregators' existing services. Those same standards, which the syndicators would like their customers to adopt, reduce the value of those syndicators and place the whole distribution chain in jeopardy by allowing at least a technical level of connection between producers and consumers.
The distribution chain is, of course, quite adept at reinventing itself, and parts of itself, to remain in business. And from its point of view, things don't look all that bad. There are a handful of news standards, invented by the industry itself. They have been adopted, to a limited degree, by participants in the few standardization groups that industry members belong to, but not widely thus far. There is no significant danger that widespread disintermediation is on the horizon as a result of those standards.
What about RSS though? Here is a great example of a standard that is easily implemented by both sources and consumers and is achieving significant uptake. Although RSS does not present a complete solution to news distribution (it just makes it easier to do simple things than earlier alternatives), it would be very unwise to assume that it will not cause massive changes to the industry.
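Part of what makes RSS so easy to implement on both sides is how little is required: a source emits a small XML document, and a consumer can read it with a few lines of standard-library code. As a minimal sketch (the feed content and URLs below are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed: one <channel> with metadata and <item> entries.
# This is roughly all a source needs to emit to participate in RSS syndication.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News Wire</title>
    <link>http://example.com/</link>
    <description>A hypothetical news source.</description>
    <item>
      <title>Standards summit convenes</title>
      <link>http://example.com/summit</link>
      <description>Industry debates hassle-free news exchange.</description>
    </item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (channel title, list of item titles) from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    channel = root.find("channel")
    return channel.findtext("title"), [
        item.findtext("title") for item in channel.findall("item")
    ]

channel_title, item_titles = parse_feed(FEED)
```

Compare that with the weeks-long integration cycle quoted for adding a feed to a traditional aggregation system, and the disruptive appeal becomes obvious.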
According to Clayton Christensen in The Innovator's Dilemma, one of the hallmarks of a disruptive technology is that while it does not initially match the incumbent technology on a feature-by-feature basis, it meets some other, hitherto unsatisfied, needs. Where the disruptive technology finds widespread use, as clearly RSS has, there is plenty of scope for it to grow to fulfill the needs that the established standards satisfy. On the other hand, it is very difficult for the incumbent to downscale to challenge the newcomer.
So what to do? If you believe, as David Megginson stated at the News Standards Summit, that RSS is here to stay and will develop into an even more widespread information dissemination mechanism, then it's time to take it seriously. RSS has some problems for large-scale information providers. In the early days, so did web technology. In particular, the Web came with no particular commercial model or means of generating a revenue stream. After an admittedly shaky start, money is now being made out of web presentation. And according to this view, just as web technology grew and adapted to overcome its initial technical limitations and the problems of fitting a commercial model around it, so will RSS.
But RSS is still just the basics. If customers and providers are to be persuaded to invest in new technology, then they must understand that they are getting more by doing so than they were getting before. If aggregators and syndicators are to stay above the rising tide, then they must see this as an opportunity not just to reduce costs, but to increase the value of their offerings. Standards organizations, too, must show that the costs of membership and of participation in their products and methods are in tune with their members' objectives, and produce more for their members than the organic and seemingly chaotic, but effective, approach espoused by the RSS communities.
There are plenty of decisions to be made along the way. Which standards organizations to participate in, and how? What does the organization hope to gain, or to avoid losing, and given the dynamic and evolving nature of events, how can it remain abreast of the plot? What are the necessary and desirable technical characteristics to look for in standards, and how to influence their development? These are hard questions, which are looked at in a subsequent paper. But while the problems are hard, they are better faced than ignored.
When facing a rising tide, it is better to get in a boat than to climb one further rung up a ladder.