Improving web application performance by parallelizing requests

Kris Geusebroek

For a web application I develop, we had a performance problem. After a small investigation we found out that it was related to the number of requests being made to the server.

The application runs in a browser (currently IE7), and browsers are generally limited to no more than 2 parallel requests to the same domain (this has improved a bit in later browser versions). In this post I will describe the quest for solutions.

A simplified description of the application will help to understand the problem better, I think. The application is used to track movements of cars around the country, so a lot of data is gathered every N seconds with the help of XMLHttpRequest. For every type of car a separate request is made to the server. Currently about 5 requests are issued every 5 seconds.
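The polling pattern described above can be sketched roughly like this. Note that the endpoint path and the car-type names here are my own invented examples, not the application's real API:

```javascript
// Hypothetical sketch of the polling setup: one request per car type,
// repeated every 5 seconds. Endpoint and type names are invented.
var CAR_TYPES = ['taxi', 'truck', 'van', 'bus', 'rental'];

function buildRequestUrls(baseUrl, carTypes) {
  var urls = [];
  for (var i = 0; i < carTypes.length; i++) {
    urls.push(baseUrl + '/positions?type=' + encodeURIComponent(carTypes[i]));
  }
  return urls;
}

// In the browser this would be driven by a timer, e.g.:
// setInterval(function () {
//   var urls = buildRequestUrls('http://maindomain.sample.org', CAR_TYPES);
//   // ... issue an XMLHttpRequest for each url ...
// }, 5000);
```

With 5 car types this produces 5 requests per tick, which is exactly where the browser's 2-connections-per-domain limit starts to hurt.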

There wouldn’t have been a noticeable problem if all the queries had performed well and every request had taken less than, say, 1 second. But one of the queries was performing very slowly, with a response time of over 20 seconds, and it was used in 2 of the 5 requests. Also, the app was built in such a way that when a new request was triggered, the current request was aborted. So the problem was that at least one of the 2 parallel requests was never able to deliver an answer and was holding up the other requests.

Using subdomains

So even after improving the query performance, there was a need to do more than 2 parallel requests at a time to speed things up further. For normal HTTP requests a widely used solution is to set up subdomains in your DNS server pointing to the same domain, which lets the browser believe the request is made to a different domain. This technique is used by Google Maps, for example, to fetch the map images from multiple domains for a faster user experience.

I wanted to implement the same kind of functionality for my XMLHttpRequests, so I went off and implemented a solution to select a subdomain from an array and send the request there. Unfortunately this didn’t work because of the security restrictions on cross-domain XMLHttpRequests. Even if I know better than the browser that this subdomain is essentially the same domain, the browser was restricting me here. That’s what you get for tricking the browser into doing more in parallel, right?
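The subdomain rotation I tried looked something like this (the hostnames are placeholders, not the application's real DNS entries). It works fine for images and scripts, but, as described, the same-origin policy blocks it for XMLHttpRequest:

```javascript
// Round-robin over subdomains that all resolve to the same server.
// These hostnames are placeholders for illustration only.
var SUBDOMAINS = ['a.sample.org', 'b.sample.org', 'c.sample.org'];
var nextSubdomain = 0;

function pickSubdomain() {
  var host = SUBDOMAINS[nextSubdomain % SUBDOMAINS.length];
  nextSubdomain++;
  return host;
}

// In the browser:
// var xhr = new XMLHttpRequest();
// xhr.open('POST', 'http://' + pickSubdomain() + '/positions');
// // -> blocked: cross-domain XMLHttpRequest violates the same-origin policy
```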

Searching a bit more, I found all kinds of workaround-ish solutions using frames and the like to do cross-domain XMLHttpRequests, but the most useful solution was limited to the newer generation of browsers (Firefox 3.5+, IE8, etc.).

Using response header Access-Control-Allow-Origin

Setting the response header Access-Control-Allow-Origin to the exact value (including http://) of the origin your web application is loaded from makes it possible to do cross-domain XMLHttpRequests. The easiest way I implemented this was to load the module headers_module (modules/mod_headers.so) in my Apache httpd.conf and add the following line to the configuration file:

Header always set Access-Control-Allow-Origin "http://maindomain.sample.org"

This made the browser first send an OPTIONS request to check whether the cross-domain call was allowed, before sending the actual data with the POST request.
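The resulting exchange looks roughly like this (hostnames and the request path are illustrative, and the response is trimmed to the relevant header):

```
OPTIONS /positions HTTP/1.1
Host: sub1.sample.org
Origin: http://maindomain.sample.org
Access-Control-Request-Method: POST

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://maindomain.sample.org

POST /positions HTTP/1.1
Host: sub1.sample.org
Origin: http://maindomain.sample.org
...
```

The browser compares the Access-Control-Allow-Origin value in the preflight response against the page's own origin; only on an exact match does it go ahead with the POST.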

And what about the older browsers?

But hey! I’m still running on IE7, right? Right, so back to the drawing board.

So currently I’m trying to combine as many of the requests as possible into one request and filter out the data on return. In the meantime I found that for IE you can influence the number of parallel requests made to the same domain by adjusting your registry settings. Since the app runs in a closed and controlled environment, this might be the way to go if my programming attempts are unsuccessful.
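A sketch of the combine-and-filter idea: the server bundles the results for every car type into one response, and the client splits it back out per type. The response shape here is my own assumption, not the application's actual wire format:

```javascript
// Hypothetical combined response: one payload with entries for every
// car type, split back into per-type lists on the client.
function splitByCarType(entries) {
  var perType = {};
  for (var i = 0; i < entries.length; i++) {
    var entry = entries[i];
    if (!perType[entry.type]) {
      perType[entry.type] = [];
    }
    perType[entry.type].push(entry);
  }
  return perType;
}

// Example (invented coordinates):
// splitByCarType([{type: 'taxi', lat: 52.1, lon: 4.3},
//                 {type: 'truck', lat: 51.9, lon: 4.5}])
// -> one list per car type, so a single request can feed all the views
```

As for the registry route: if I recall correctly, IE reads the DWORD values MaxConnectionsPerServer and MaxConnectionsPer1_0Server under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings to decide how many parallel connections it opens per host.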

Comments (8)

  1. Chris Conrey - Reply

    December 10, 2009 at 1:50 am

    Couldn't you also speed this up by optimizing the DB in a few ways to improve the flow from the browser? The Subdomains move works in most instances though as a short term gain.

  2. Hugo - Reply

    December 10, 2009 at 11:41 am

    Hi Kris,

    I realize I know too little about the application but here's my 2 cents worth:
It seems you have a good view of what future requests are going to be. This gives all kinds of caching possibilities.
    * Is it an option to serve the request without going to the database directly? Have some async process doing that for you for instance?
    * Maybe it's possible to have a special table in the database (memory table?) which is just for serving those requests. The data is, in another process, then distributed to the rest of the tables.

  3. Albert - Reply

    December 10, 2009 at 10:19 pm

    Hi Chris,

Couldn't server-push/COMET stuff reduce the amount of requests or am I missing the point here...

  4. Kris - Reply

    December 14, 2009 at 9:00 am

    Hi Albert, Hugo and Chris,

Of course the best thing would be to optimize the performance of the database, and that's the first thing we are going to do. But that still leaves the fact that we are doing parallel requests, and if the optimization is not that great, the synchronous way it works now will slow down the user experience.

So together with this performance boost of the database, I also did some searching for the options we have for making more requests at a time.

Also, the Comet/WebSocket stuff is a very interesting point of view. This kind of architecture is already being considered for the next release, where we want the user interface to respond to commands from another application. But we will certainly have a look at this earlier to help us with the overall user experience.

    Cheers Kris

  5. sasoon - Reply

    December 18, 2009 at 7:02 am

    instead of 5 requests every 5 secs, issue one request per 5 secs and retrieve all required data in one chunk.

  6. Website Monitoring Guy - Reply

    December 29, 2009 at 5:28 pm

    Why not use Firefox or Safari for your web application? They don't have 2 request limitations as far as I know.

  7. Kris - Reply

    December 30, 2009 at 1:18 pm

    Hi Website Monitoring Guy,

I would be so happy if it was possible to use Firefox or any other browser than IE6.
But these browsers have the same limitations too, only with different thresholds, and they are easier to configure.

    I believe IE8 has a threshold of 6 and Firefox a threshold of 8 in the latest versions.

  8. Ray - Reply

    January 17, 2010 at 3:51 pm

The limitation of 2 parallel requests is due to HTTP 1.1. If you can downgrade it to HTTP 1.0, you will have better parallelism.

Add a Comment