
Scrambling the Equations: Potential Trends in Networking

February 12, 2002

Andy Oram

I believe that changes in digital networking over the next few years will be more radical than a simple, linear evolution, but less radical than a paradigm shift. The basic variables we've been trained to manipulate--the issues of security, bandwidth, availability, and so on--will stay the same, which is why there won't be a paradigm shift for a while. However, the weights we've been assigning to these variables are about to change. We'll have to throw out our current equations and rearrange the variables into new ones.

The peer-to-peer movement confirms my evaluation. A year ago, a lot of people thought peer-to-peer was new. But anyone who seriously and rigorously investigated the issues it raised--such as the security, bandwidth, and availability I just mentioned--found that these issues have been around for a long time. The cutting-edge research used by P2P technologies in advanced areas like pseudonymity and distributed-identification systems was going on years before the term "P2P" was invented.

I recently had a chance to consider some potential trends in networking while attending a workshop called Collaborative Computing in Higher Education: Peer-to-Peer and Beyond, where I delivered the keynote speech. While most of the suggestions in this article are my own, I'll mention insights that I gleaned from various speeches. I have also put up a short Weblog about the conference, and the text of my keynote speech.

Note that the following are only potential trends. While I think it would be logical for them to emerge, and sometimes desirable, I have scant confidence in my role as prophet. Interventions by any number of powerful actors could shove events in a different direction.

A New Generation of Networked Filesystems

Most people thought of Napster, Freenet, and Gnutella as just systems for swapping files. But they herald a new approach to distributed filesystems whose commercial applications are already appearing. Over the next few years, you could well find your company migrating away from familiar distributed filesystems like NFS or Microsoft domains and toward a new generation that is far superior in robustness and flexibility.

The advances found in the new, networked filesystems include:

  • Use of previously wasted space on organizations' PCs instead of a central server.

  • Encryption, which permits files to be stored on arbitrary computers without danger of snooping or tampering. As a side benefit, encryption supports sophisticated access controls.

  • Replication, so that data remains available when people turn off their PCs. Replication also promotes locality of access.

  • Breaking up files, so that multiple pieces can be downloaded simultaneously for faster access.

  • Indexing and naming that are divorced from location.

Certainly, there are advantages to the current systems. Byte for byte, central storage is cheaper. For small documents, at least, retrieval will be faster without the rigmarole of indexing. Still, the newer systems will be appropriate for large organizations and those with worldwide operations. The latter type of organization benefits from locality of access and makes better use of replication because "the sun never sets" on such organizations. (Somebody's PC will always be turned on.)
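
To make that combination concrete, here is a minimal sketch, with a faked pool of peer storage and a toy stand-in for a real cipher, of how chunking, encryption, and location-independent naming fit together: a file is split into pieces, each piece is scrambled, and each scrambled piece is named by the hash of its contents rather than by the machine that holds it.

```python
# A sketch of a content-addressed, chunked file store in the spirit of the
# systems described above. The peer-level storage is faked with a dict, and
# the "encryption" is a toy XOR keystream standing in for a real cipher.

import hashlib

CHUNK_SIZE = 64 * 1024          # split files so pieces can be fetched in parallel
BLOCKS = {}                     # hypothetical pool of spare space on peers' PCs

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """Placeholder cipher: XOR with a hash-derived keystream (NOT real crypto)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def store_file(data: bytes, key: bytes) -> list[str]:
    """Chunk, encrypt, and store a file; return location-independent block names."""
    names = []
    for i in range(0, len(data), CHUNK_SIZE):
        cipher = toy_cipher(data[i:i + CHUNK_SIZE], key)
        name = hashlib.sha256(cipher).hexdigest()   # the name is the hash of the contents
        BLOCKS[name] = cipher                       # any peer could hold this block
        names.append(name)
    return names

def fetch_file(names: list[str], key: bytes) -> bytes:
    """Reassemble a file from its named blocks, wherever they happen to live."""
    return b"".join(toy_cipher(BLOCKS[n], key) for n in names)

manifest = store_file(b"quarterly report " * 10000, key=b"shared secret")
assert fetch_file(manifest, key=b"shared secret") == b"quarterly report " * 10000
```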

Devices Will Support Scripting

Current predictions about computing include expectations that people will attach devices to their appliances, their clothing, and even (God forbid) their skin. At the peer-to-peer workshop, Intel scientist David Barkai showed an interesting figure positing an "extended Internet" of the future that contains millions of standard computers, hundreds of millions of devices, and billions of sensors.

If devices and sensors are to be widely useful, people will need fine control. Few will be satisfied with a device that provides only a manufacturer's rigid notion of what's appropriate.

Thus, the explosion of devices will also spur a renaissance of scripting languages. Enthusiasts will either port existing languages to the devices (a task made easier by running a standard operating system on them) or, if Perl seems a bit bloated for a thermometer, develop new languages.
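
As a purely hypothetical illustration, the sketch below imagines the kind of hook such a runtime might expose on a thermostat: the device and its sensor call are invented, and the owner's short script, not the manufacturer, decides what counts as appropriate behavior.

```python
# A toy illustration of user-level scripting on a device: the manufacturer
# ships a minimal runtime with a couple of hooks, and the owner supplies the
# policy. The sensor-reading function here is made up for the example.

import random
import time

def read_temperature_celsius() -> float:
    """Stand-in for a real sensor driver."""
    return 18.0 + random.random() * 10.0

def user_policy(temp: float) -> str:
    """The owner's script: their notion of 'appropriate', not the vendor's."""
    if temp > 26.0:
        return "fan on"
    if temp < 19.0:
        return "heat on"
    return "idle"

for _ in range(3):
    reading = read_temperature_celsius()
    print(f"{reading:.1f} C -> {user_policy(reading)}")
    time.sleep(1)
```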

Security Will Focus on the Individual System More Than the Network

Though peer-to-peer systems have not yet led to security breaches (I'm not counting the Trojan horses that can arrive in P2P executables, a risk shared with any downloaded program), they legitimately raise concerns. If we are granting more importance to the individual end user, it is natural to harden that end user's computer. Furthermore, the trend toward collaboration involves people in many different groups on different networks, the kind of environment for which Groove, for instance, was designed.

You may work more closely with a fellow professional at one of your suppliers or vendors, or at another company within your industry, than with your own accounting department. You may even trust this external collaborator more than your own accounting department. (I don't mean to unfairly pick on accountants here; most are scrupulously honest.) Meanwhile, two speakers at the peer-to-peer conference (David Nicol and David Barkai) mentioned internal fraud as a primary source of security hazards. A firewall can't protect you against crooked coworkers.

Thus, modern business will move toward setting up small spaces for people who really need to know particular things. Firewalls will still be important for standard spoofing attacks, but they won't be as central to security plans as before.
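
Here is a minimal sketch of what such a "small space" check could look like, with invented workspace and user names: access turns on membership in a named group rather than on whether the request originates inside the corporate network.

```python
# A minimal sketch of "small spaces": access is decided by membership in a
# named workspace, not by whether the request comes from inside the firewall.
# The names and structure are illustrative, not any particular product's API.

SPACES = {
    "supplier-negotiation": {"alice@ourco.example", "bob@supplier.example"},
    "payroll":              {"carol@ourco.example"},
}

def may_read(user: str, space: str) -> bool:
    """Need-to-know check: only listed members of the space get in."""
    return user in SPACES.get(space, set())

assert may_read("bob@supplier.example", "supplier-negotiation")      # external collaborator: allowed
assert not may_read("bob@supplier.example", "payroll")               # same person, different space: denied
assert not may_read("mallory@ourco.example", "supplier-negotiation") # being inside the firewall is not enough
```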

Everyone Will Adopt Encryption and Digital Signatures

The growing focus on peer-to-peer, and on individual responsibility for security within small groups of individuals who know each other, fits perfectly with the "Web of Trust" approach used in informal PKI systems such as PGP (Pretty Good Privacy, a popular program used to encrypt and decrypt email over the Internet).

Various projections about how people will maintain online identities include fully traceable identities (where you are responsible in your real life for everything your online persona does), the opposite concept of complete anonymity, and a kind of balance called pseudonymity, where you can create multiple identities and each is responsible only for its own behavior.

I think people will increasingly opt for something different from all three systems: something more relaxed and natural.

Some application designers I've talked to suggest that online correspondents often want to know just one important thing about you. For instance, if you are negotiating on behalf of a client, the correspondents want some guarantee that the client really has appointed you as its representative. Once you furnish that, you can join their shared space with validated identities and encrypted communications. Rather than big, centralized, massively bureaucratic certificate authorities, a plethora of smaller organizations may develop that understand how to certify particular people for particular dealings.

Perhaps a hierarchy of certificate authorities will develop, with very general-purpose organizations offering careful accounting procedures at the top, and more informal organizations below. However, the more complex a security system gets, the more subject it is to breaches and abuses.
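
To show how lightweight the Web of Trust idea is, here is a toy version of the classic PGP validity rule, under which a key is accepted if it is vouched for by one fully trusted introducer or by at least two marginally trusted ones; the people and signatures below are invented for the example.

```python
# A toy version of the PGP "Web of Trust" validity rule: a key is accepted if
# it carries a signature from one fully trusted introducer, or from at least
# two marginally trusted ones. All of the data here is made up.

FULL, MARGINAL = "full", "marginal"

# How much I trust each person to vouch for others (my own judgment).
trust = {"alice": FULL, "bob": MARGINAL, "carol": MARGINAL, "dave": None}

# Who has signed whose key.
signatures = {
    "supplier-rep": {"bob", "carol"},   # two marginal introducers
    "new-contact":  {"dave"},           # signed only by someone I don't trust
    "old-friend":   {"alice"},          # one fully trusted introducer
}

def key_is_valid(key: str, fulls_needed: int = 1, marginals_needed: int = 2) -> bool:
    """Classic default: one complete introducer, or two marginal ones."""
    signers = signatures.get(key, set())
    fulls = sum(1 for s in signers if trust.get(s) == FULL)
    marginals = sum(1 for s in signers if trust.get(s) == MARGINAL)
    return fulls >= fulls_needed or marginals >= marginals_needed

for key in signatures:
    print(key, "->", "valid" if key_is_valid(key) else "unknown")
```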

So download PGP and start developing your Web of Trust today. (I admit I have been a laggard in this regard.)

Trust Violations Will Emerge As a New Category of Crime

Certificate authorities and digital signatures can't prevent every instance of masquerading or of false claims to authority. The Federal Trade Commission is already heavily involved in prosecuting fraud online; this is a new area they will have to tackle.

I think we'll see instances of people manipulating the online authentication systems described in the previous section. But I have confidence that these violations will be rare enough that most people will continue to use the systems with confidence, just as we now use banks and credit cards. Furthermore, authorities like the FTC will recognize and learn how to deal swiftly with the various categories of online trust violations.

DNS Will Be Augmented with Flexible Identification Systems

Some peer-to-peer proponents say that DNS will wither away, or at least prove irrelevant for peer-to-peer applications. These people point to its limitations: the cost of getting a name, the artificial limitations on the namespace, the requirement that a system be up all the time, and the legal shenanigans of ICANN and trademark holders.

But DNS remains a wonderfully adaptive system--and a sterling example of distributed computing--with too many advantages to discard. DNS-identified sites should be the core of new identification systems that serve the intermittently connected and mobile user. These systems will have to feature small footprints and near-zero costs (including the computing cost of a look-up), be replicated and widely distributed, and be protected against spoofing and snooping.
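
One way to picture such an augmentation, offered as a rough sketch rather than a concrete proposal: intermittently connected peers register short-lived entries in a replicated overlay directory (faked here with an in-memory table), and any name not found there falls back to ordinary DNS.

```python
# A sketch of augmenting rather than replacing DNS: stable sites resolve the
# ordinary way, while intermittently connected peers register a short-lived
# entry in a replicated overlay directory (faked here with a dict).

import socket
import time

OVERLAY = {}   # hypothetical replicated directory: name -> (address, expiry)

def register_peer(name: str, address: str, ttl_seconds: int = 300) -> None:
    """A laptop or phone announces where it can be reached right now."""
    OVERLAY[name] = (address, time.time() + ttl_seconds)

def resolve(name: str) -> str:
    """Check the overlay first, then fall back to plain DNS."""
    entry = OVERLAY.get(name)
    if entry and entry[1] > time.time():
        return entry[0]
    return socket.gethostbyname(name)     # DNS remains the stable core

register_peer("andys-laptop.example", "10.0.0.42")
print(resolve("andys-laptop.example"))    # served from the overlay
print(resolve("www.example.com"))         # served from DNS
```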

The Application Layer of the Internet Is Widening

The top two layers of the classic ISO seven-layer model are getting crowded. Web services in themselves include half a dozen protocols that interact in complicated ways, all theoretically on the top layer. A conversation I had with Ken Klingenstein, director of the Internet2 Middleware Initiative, revealed many interesting efforts there. Not much "middleware" fits comfortably in the ISO scheme.

The bottom layers of the Internet, while they evolve in fruitful ways, seem to have well-defined roles. The upper layers show more volcanic activity. At some point, it may be useful for the IETF or another organization to issue some conceptual papers so that the protocols on which innovators are working can interoperate and enhance each other.

One possible example I mentioned in my own speech is the addition of a new routing layer that would be aware of both the application and the costs of reaching various points in the network. Several P2P applications include their own specialized versions of such a routing layer, and it would seem worthwhile to extract and formalize the protocol.
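
As a rough sketch of what such a layer might compute, the example below runs a plain shortest-path search over an invented overlay whose link weights stand in for application-level costs such as latency or transit price.

```python
# A sketch of the kind of computation a shared routing layer might expose:
# cheapest paths over an overlay whose edge weights encode application-level
# costs (latency, transit price, battery, and so on). The graph is invented.

import heapq

GRAPH = {
    "home-pc":      {"campus-gw": 4, "cable-isp": 1},
    "cable-isp":    {"exchange": 3},
    "campus-gw":    {"exchange": 1},
    "exchange":     {"media-server": 2},
    "media-server": {},
}

def cheapest_path(src: str, dst: str) -> tuple[float, list[str]]:
    """Plain Dijkstra: returns (total cost, node list) for the cheapest route."""
    queue = [(0, src, [src])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in GRAPH[node].items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

print(cheapest_path("home-pc", "media-server"))
# (6, ['home-pc', 'cable-isp', 'exchange', 'media-server'])
```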

Downloads Will Go, Streaming Will Come

Despite the frenzy over KaZaA and other file-sharing systems, among both fans and foes, I really don't see much point to downloading large files. DVDs are cheap (even if inflated in price), legal, and easy to transport. I think more entertainment will move to the Internet, but in streaming form.

Proponents of downloading are enthusiastic about creating your own play list. But would you want to create your own play list of songs all day, every day? Wouldn't you prefer some site that offered a format you like, along with regular exposure to new material--something like a radio station, in effect?

So a song is not convenient to market or transport as a discrete entity (unless it's Mahler's Song of the Earth; playing time approximately one hour). Movies are more viable as discrete entities. But in the come-and-go atmosphere of Internet use, just as with television, streaming entertainment is more appropriate. That way, content producers won't worry as much about people copying and trading shows, particularly if they adopt the current model used by commercial TV: produce shows of such low quality that they have no value except as passing diversions.

Conclusion: A Call to Code

My ideas here are not meant as predictions. I don't write for business leaders and I don't tell people where to invest. Rather, I write in order to influence people who develop new technologies. I try to suggest areas where it may be beneficial to direct their programming skills. So have fun with these ideas, and make new worlds happen.