XML.com: XML From the Inside Out


Tuning AJAX

November 30, 2005

Unless you live under a rock, you've heard about and likely even used AJAX. Asynchronous JavaScript and XML is becoming an increasingly pervasive development approach, which means that people need to both understand how it works and consider it more seriously as an enterprise-level development tool. To that end, I will illustrate one method of benchmarking your AJAX applications, as well as point out some of the major performance pitfalls I have encountered while developing AJAX components and applications.

Web Browser Mechanics

When working with AJAX, it is paramount to understand that there are many different ways to update a web page, and that their performance varies between browsers. Equally important to understand is that no matter how you update a page, any change to its (X)HTML content forces the browser's parsing and rendering engine to update its internal representation of the page (recalculating the flow and layout) and then render the changes to the browser window. For complex pages or changes, this can take a considerable amount of time. MSDN's DHTML Performance Tips is a good overview of DHTML performance.
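One practical consequence is that it usually pays to batch DOM updates: build the new markup as a string and assign it to innerHTML once, rather than appending nodes one at a time inside a loop. As a sketch (buildRows() and the "results" table id are illustrative, not from any particular library):

```javascript
// Build the new markup in one pass, then touch the DOM once.
function buildRows(data) {
    var html = [];
    for (var i = 0; i < data.length; i++) {
        html.push("<tr><td>" + data[i] + "</td></tr>");
    }
    return html.join("");
}

// A single innerHTML assignment triggers one reflow, instead of one
// per appended row:
// document.getElementById("results").innerHTML = buildRows(rows);
```

The string-building step is cheap; it is the repeated DOM mutations (and the reflows they cause) that dominate, so collapsing them into one assignment is where the savings come from.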

Benchmarking Basics

In JavaScript the easiest way to benchmark one's code is by using the Date object. The Date object has a very handy method called getTime(), which returns the number of milliseconds that have passed since January 1, 1970. This is useful for doing any date arithmetic, like timing how quickly your code is running, and it forms the basis from which you can benchmark your AJAX applications. For example:

//    a Date object is fixed at construction, so create a new one
//    for each reading rather than reusing a single instance
var start = new Date().getTime();
//    do stuff here.
var end = new Date().getTime();
//    notify the user
alert(end - start);

The first nuance that developers generally run into here is that although getTime() returns milliseconds, either the operation is simply too fast to register (and maybe not worth worrying about then) or the browser's underlying timer has a much coarser resolution -- often 10 to 15 milliseconds, depending on the browser and operating system -- making anything under that threshold hard to measure. Both this, and the fact that the results tend to vary between runs, means that it is good practice to repeat your tests several times and compute an average and standard deviation.
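Computing those summary statistics takes only a few lines. A sketch of a helper that reduces an array of timings (such as the g_time array used later in this article) to a mean and standard deviation:

```javascript
// Summarize repeated timings: population mean and standard deviation.
function summarize(times) {
    var n = times.length, sum = 0, sumSq = 0;
    for (var i = 0; i < n; i++) {
        sum += times[i];
        sumSq += times[i] * times[i];
    }
    var mean = sum / n;
    // population variance: E[x^2] - (E[x])^2
    var stdDev = Math.sqrt(sumSq / n - mean * mean);
    return { mean: mean, stdDev: stdDev };
}
```

A large standard deviation relative to the mean is a sign that something else (garbage collection, background activity) contaminated the run, and that more iterations are needed.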

Another limitation of JavaScript benchmarking is that if tests take a considerable amount of time, the browser will usually prompt the user to abort the script. This can be circumvented to some extent by using the setTimeout() function to initiate each loop of code. Using setTimeout() also lets the browser update any changes to the user interface, such as debugging or progress information. An example using setTimeout() is shown here:

//    global variables
var g_time = [];
var g_iteration = 0;
var g_maxIterations = 10;

function DoTest() {
    var start = new Date().getTime();
    //    do stuff here.
    var end = new Date().getTime();
    //    save time
    g_time.push(end - start);
    g_iteration++;
    if (g_iteration < g_maxIterations) {
        //    yield so the browser can update the UI, then continue
        setTimeout(DoTest, 0);
    } else {
        //    all iterations done; report the results
        alert(g_time.join(", "));
    }
}

//    kick off the test
setTimeout(DoTest, 0);

Using getTime() is a good way to go about benchmarking during development or any exploratory work. However, when it comes to testing a product it can be very useful to apply something more substantial such as the Venkman Debugger for Mozilla-based browsers or one of the many available commercial profilers (Tito JavaScript Profiler or Whitefrost JavaScript Profiler).

Before AJAX, it was generally accepted that most tasks performed in JavaScript would be small enough that developers needn't really worry about performance issues. On the other hand, in today's world of AJAX maps and photo sharing this can be a critical mistake. Having said that, I don't expect everyone to run off and scour their JavaScript in search of potential optimizations that will shave a few milliseconds off their "yellow-fade." JavaScript is generally fast enough that you can build your program and after refactoring any obvious bottlenecks it should run fairly quickly.
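One example of an obvious bottleneck worth refactoring, especially in the JavaScript engines of this era, is building a large string with repeated += concatenation, which can force many intermediate copies. Collecting the pieces in an array and joining once is the classic fix (function names here are illustrative):

```javascript
// Naive approach: each += may allocate and copy a new string.
function slowBuild(n) {
    var s = "";
    for (var i = 0; i < n; i++) {
        s += "<li>" + i + "</li>";
    }
    return s;
}

// Classic fix: accumulate pieces, then join once at the end.
function fastBuild(n) {
    var parts = [];
    for (var i = 0; i < n; i++) {
        parts.push("<li>" + i + "</li>");
    }
    return parts.join("");
}
```

Both produce identical output; benchmarking them with the getTime() technique above is a good first exercise, since the gap only becomes visible at larger values of n.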
