
TAXI to the Future

March 14, 2001

Network Mapping and XML History


Since mid-1999, I've been working on network mapping and visualization; as a side-effect, I had to name it ("Visual Net") and found a company -- there's a public showcase for this stuff at map.net.

The Visual Net technology is cool and 3-D and cyberspacey; but from the viewpoint of a web developer the interesting part is the architecture, which is unlike that of traditional web applications. Architectures need names so you can talk about them: I've been calling this one TAXI, and I've spoken about it at several conferences, but this is the first time I've written about it.

There's not much new about TAXI. I'll claim that if you polled the original group of a dozen or so people, led by Jon Bosak, that defined XML 1.0, you'd find out that something like TAXI was what most of us had in mind. As we all know, XML has mostly been used in backend and middleware data interchange, not in front of the user the way its designers intended and the way TAXI does it. It's long past time for the TAXI model to catch on.

A Brief History of Application Architectures

Gen 1

Many years ago, when I was getting into this profession, dinosaurs stomped the earth; they were called mainframes, and they were at the center of so-called serious computing applications. The architecture of these Generation 1 applications is illustrated below.

In Gen 1, all the code and data lived on the mainframe (often in plain files, before the RDBMS took over the world). People who used this kind of system sat in front of a box, a block-mode terminal of the IBM 3270 family. Today there are still lots of mainframes talking to lots of what they think are 3270s but are actually PCs running emulators -- I see these applications every time I go to my bank.

A block-mode terminal worked something like this: the mainframe sent it a batch of data, and it displayed a screenful to the user; in most cases, this would be a form with fields you could fill in and tab back and forth between. When you were ready, you'd hit the Enter key, and the terminal would transmit what you'd entered back to the mainframe, which would do some computing and then send you another screenful.


Gen 1 had a lot of advantages. Most important, all the expensive code and data were in one place, in one back room with a locked door, managed by professionals. Of course, it also had some big disadvantages. Most important: all the power and control were in one place, in one back room with a locked door, controlled by professionals -- the kind of professional that appears regularly in Dilbert as "Mordac the Preventer of IT".

There were some secondary disadvantages, like the lack of graphics or a mouse; niceties like color and upper- and lowercase text were often seen as optional extras.

You might ask how the computer and terminals sent data back and forth, but you wouldn't get an answer; that was managed in a black box by extremely expensive "communications controllers".

Gen 2

Gen 1 ended with the more-or-less simultaneous arrival of the PC and the RDBMS. Generation 2 was the classic client-server application, illustrated below.

People quickly became addicted to having their own personal computers; they could print their own documents and do their own budgets without having to go through Mordac the Preventer. After a brief rearguard action, the profession rallied around and built systems where the majority of the business code ran on the PC, and the mainframe-database tandem served as a large, fast, reliable data store.

Gen 2 had the obvious advantages of bringing color, graphics, and local interactivity to end users. It had some pretty severe costs, though: the software, instead of having to run on just one tightly controlled box, had to be shipped all over the company and was at the mercy of whatever hare-brained (or not) user might have misconfigured his or her personal computer. The total cost of ownership (remember, this was before the days of sub-$1,000 computers) was pretty high, and the system had a lot of failure points. But there was no going back.

Gen 3

Another problem was that distributing all the business logic to the desktop didn't work very well. Some parts of it -- the collaborative part, where the work of different people had to be coordinated -- really needed to be on the server. The result was the rise of multi-tier application architectures, illustrated below.

This was a lot like pure client-server but with some of the application code, as well as the database, running on the server. Of course, multi-tier systems suffered from the same cost-of-ownership problems that had plagued pure client-server.

Both client-server and multi-tier applications were like old Gen 1 apps in that nobody ever really concerned themselves with what the protocol was between the client and the server; that was private application business, not something that ordinary customers needed to bother their pretty little heads about.

Gen X

Multi-tier (often called n-tier) application architectures were about the state of the art in the early 1990s when the Web came along and rewrote the rules. Let's refer to the web generation as "X", since it was the early '90s. The general architecture is illustrated below.

The crucial new thing is that the whole system is defined by what is sent back and forth between client and server, rather than what the client and server actually are. Specifically, client and server exchange messages using the HTTP protocol; from client to server, the content is a URL, and from server to client, the content is a lot of HTML-encoded text, mixed up with some multimedia in PNG or JPEG or WAV or whatever; HTTP took care of labeling the formats so the client knew what it was getting.
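
To make that concrete, here's a minimal sketch of one round trip, written in modern Python with example.com standing in for any web server: the client sends little more than a URL, and the response comes back labeled with its format.

    import urllib.request

    # One HTTP round trip: the request is basically just a URL ...
    with urllib.request.urlopen("http://www.example.com/") as response:
        # ... and the response labels its own format, so the client knows
        # whether it's getting HTML, a PNG, a JPEG, a WAV, or whatever.
        print(response.status)                       # e.g. 200
        print(response.headers.get("Content-Type"))  # e.g. "text/html; charset=UTF-8"
        body = response.read()                       # the HTML (or image bytes) to render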

The difference was magic: as long as the client was prepared to deal with HTML and a bit of multimedia, the server could become truly a black box; nobody cared how much code or how many layers were running on it. Even more important, as long as your desktop box was running a reasonably modern browser, nobody cared whether it was a PC or a Macintosh, or which version.

There was some initial resistance to the web revolution from Mordac and his gang, largely because web interfaces tended to be simple, have very few controls, and all look about the same. That removed their ability to express their personalities by writing VB applications with dozens of funny-shaped buttons and bizarre screen layouts that took weeks-long training courses to learn how to use. But the users voted with their feet for the browser, and we got to about where we are now.
