Optimizing Website Page Load Time

I'm working on optimizing the page load time for this website, particularly for the homepage, and the results so far are great!

Converting Dynamic Pages to Static
If this is possible for you, I highly recommend doing it. I converted 20 pages into static HTML, which is regenerated once per day. (I kept the original copies of the pages in PHP so that people who install this software can automatically customize them -- though on second thought, that wasn't completely necessary.)
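If you want a rough idea of how the regeneration can work, here is a minimal sketch of the kind of cron script I mean (the file names are placeholders, not my actual pages):

<?php
// regenerate_static.php -- run once per day from cron, for example:
//   0 3 * * * php /path/to/regenerate_static.php
$pages = array('homepage.php' => 'homepage.html');   // hypothetical list of dynamic pages
foreach ($pages as $source => $target) {
    ob_start();                                       // capture the dynamic page's output
    include $source;                                  // run the original PHP page
    file_put_contents($target, ob_get_clean());       // write the static copy to disk
}
?>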

Conditional Use of Caching
While I cannot always use a cached version of the homepage, since someone might log in, I can use a cached version if they aren't logged in. So I detect log-in status, and if it isn't present I return a cached version. This is especially useful because my site uses web services, so I avoid those calls entirely. If I took it a step further, I'd remove the log-in status box from the homepage -- then the page could always be static (and generated once per day with a cron job).
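The check itself only takes a few lines. Here is a simplified sketch of the idea (the cookie name and cache file are made up for illustration, not what my code actually uses):

<?php
// At the very top of the homepage: serve the cached copy to anyone who isn't logged in.
if (empty($_COOKIE['session_id']) && file_exists('cache/homepage.html')) {
    readfile('cache/homepage.html');   // static copy -- no web service calls needed
    exit;
}
// ...otherwise fall through and build the page dynamically as before.
?>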

I also conditionally cache the first two pages of my "Browse View" for all my objects, e.g. the calendar, resources, people, groups, etc.

Minimizing the Number of Objects
Browsers generally open only a couple of connections to a website at a time (typically two per domain). So if you reduce the number of objects, you reduce the number of requests queued behind those connections and the page loads faster. You also reduce packet overhead.

I reduced the number of objects by:
1) Replacing three iframes with CSS divs using the overflow:auto property. This works perfectly.

2) Reducing the number of images by reusing the same image. I use CSS to put an image in the background and simply change the text that appears in front of it. This reduces the number of images in my navigational tab bar from 10 to 2 (one is a different color to indicate where you are). I can also do this for the "Add Yourself" buttons (going from 5 images to only 1).

3) Merging four of the JavaScript files into one file. This saved 200-500ms. My navigational menu is SmartMenus, which actually uses three files. However, the nice guy who develops SmartMenus has created a version that combines the three files into one! (If you need to do the merging yourself, see the sketch after this list.)
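If your scripts don't come pre-combined, a tiny build script can do the merging. Something along these lines would work (the file names are placeholders):

<?php
// combine_js.php -- concatenate several scripts into a single file.
$scripts = array('menu.js', 'effects.js', 'validation.js', 'misc.js');
$combined = '';
foreach ($scripts as $file) {
    $combined .= file_get_contents($file) . ";\n";    // the ';' guards against a missing trailing semicolon
}
file_put_contents('combined.js', $combined);
?>

Then the pages reference only combined.js instead of four separate script tags.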

Using GZip
I enabled compression of web pages using PHP, by adding the following as the very first line of my PHP code (it has to run before any output is sent):
ob_start("ob_gzhandler");

This compressed my homepage from around 30 KB to 8 KB. However, the downside is that IE has stopped caching my homepage. I suspect Microsoft will eventually fix it, but since they didn't in version 7, it might take a while.

You can also compress CSS and JavaScript files, and there are other methods of compression besides gzipping from PHP. The most popular is Apache's mod_gzip (though mod_deflate may be a better choice if you're on Apache 2). I haven't had luck with compressing JavaScript or CSS yet.
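One approach I've seen (though, again, I haven't gotten it working on my setup) is to serve the stylesheet through a tiny PHP wrapper using the same ob_gzhandler trick, roughly like this (the file names are just examples):

<?php
// style.css.php -- serve the stylesheet gzipped, same trick as the homepage.
ob_start("ob_gzhandler");
header("Content-Type: text/css");
readfile("style.css");
?>

The pages would then link to style.css.php instead of style.css.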

Using Multiple Domains to Host Content
By default your browser only opens two connections to the same domain. So I tried spreading my content across multiple domains to increase speed (they were still all on the same shared server). I didn't notice an improvement.

Overall Impact
In general it looks like my homepage load time is falling from 3 seconds to 1.5 seconds (if you do a "page reload"), or from 3.6 seconds to 1.8 seconds if you have caching turned off. I'm doubling the speed! You can see the result by reloading my homepage.

Useful Resources
I'm using the "Load Time Analyzer" 1.5 extension for Firefox to measure load times. Hit the clear button and then "reload". Also the "Web Developer" toolbar is useful (ex. turn off caching, turn off images).

I've also used Microsoft's "Fiddler", which does the best job of showing you what the browser is actually doing (whether it loads cached versions of pages, the headers, etc.). It only works with Internet Explorer. Does anyone know of a Firefox extension that does this?


Good Article on this topic by a Google Engineer

Another useful resource is YSlow, a Firefox extension from Yahoo.

And a recently published book: High Performance Web Sites