
How to save Facebook 100 terabytes and 1.3 trillion HTTP requests

by Toby Somerville on November 2nd, 2010

At the time of writing, Facebook is one of the largest websites in the world and boasts over 500 million users. We wanted to see if we could squeeze any extra performance from it. For this example we looked at Facebook’s home page. We did not test the server-side code (obviously); we did, however, look at the code sent to the client. The optimisation we did does not include any changes to the actual code itself.

Summary

The table below shows a summary of the savings we made:

                     Before      After       Saving
HTTP Requests        16          11          5 (27%)
CSS                  45.5 kb     45.0 kb     127 bytes (compressed)
JavaScript           184 kb      183.6 kb    130 bytes (compressed)
HTML                 9,718 b     9,565 b     152 bytes (compressed)
Image compression    51.4 kb     51.4 kb     33 bytes

That totals 5 requests and 442 bytes saved. That may sound pretty insignificant, but on a site as popular as Facebook it adds up quickly. Scaled to the 260,000,000,000 page views Facebook receives a month (and assuming similar savings could be made throughout the site), it equates to over 100 terabytes of bandwidth and 1.3 trillion server requests saved per month.
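
For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope calculation in Python, using the per-page figures from the table above and the assumed 260 billion monthly page views:

    page_views_per_month = 260_000_000_000   # figure quoted above
    bytes_saved_per_view = 442               # compressed bytes saved per page load
    requests_saved_per_view = 5

    tb_saved = page_views_per_month * bytes_saved_per_view / 1e12
    requests_saved = page_views_per_month * requests_saved_per_view
    print(f"~{tb_saved:.0f} TB and {requests_saved / 1e12:.1f} trillion requests per month")
    # -> ~115 TB and 1.3 trillion requests per month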

The details

Initial impressions show that, as expected, a lot of work has already been done to optimise the page: the text-based resources all use GZIP compression and the files are served from a CDN (Content Delivery Network).
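
As an aside, whether a page comes back GZIP-compressed is easy to check from the response headers. A minimal sketch using only the Python standard library (the URL and User-Agent string are illustrative):

    import urllib.request

    req = urllib.request.Request(
        "http://www.facebook.com/",
        headers={"Accept-Encoding": "gzip", "User-Agent": "Mozilla/5.0"},
    )
    with urllib.request.urlopen(req) as resp:
        # A value of "gzip" means the HTML is compressed on the wire
        print("Content-Encoding:", resp.headers.get("Content-Encoding"))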

Headers

The page has been set up so that the browser must fetch a fresh copy every time it is requested by the client. This is something I find slightly surprising, given that the front page doesn’t change very often. I would have expected a far-future expiry date, or at least one a month in the future; instead the expiry date is set to 1 January 2000, a date in the past.
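
To illustrate, a far-future expiry is just a pair of response headers. The sketch below prints an example using a one-year lifetime (an arbitrary value of ours, not anything Facebook has configured):

    from datetime import datetime, timedelta, timezone
    from wsgiref.handlers import format_date_time

    # A far-future expiry, one year out (example value only):
    expires = datetime.now(timezone.utc) + timedelta(days=365)
    print("Cache-Control: public, max-age=31536000")
    print("Expires:", format_date_time(expires.timestamp()))

    # By contrast, an Expires date of 1 January 2000 is already in the past,
    # so the browser treats its cached copy as stale on every visit.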

Page Size

The total weight of the page is 127.5 kb (compressed) with 16 HTTP requests. On a second load with a primed cache there is only one HTTP request, with a total weight of 9.6 kb (with a far-future expiry date the page weight would be zero, as the whole page could be served from the client’s cache, assuming caching is turned on).

HTML

The page has an XHTML Strict doctype but does not validate, with 12 errors reported by the W3C Markup Validation Service.

Original files: 30,071 bytes (uncompressed), 9,718 bytes (compressed), roughly a 68% compression ratio.
After optimisation: 29,599 bytes (uncompressed), which equates to approximately 9,565 bytes compressed.
Saving: 152 bytes (compressed).
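
The compressed figures can be approximated by gzipping a saved copy of the markup. A rough sketch (the filename is hypothetical, and the exact byte counts depend on the compression level the server actually uses):

    import gzip

    with open("facebook_home.html", "rb") as f:   # hypothetical saved copy of the page
        raw = f.read()

    compressed = gzip.compress(raw, compresslevel=6)
    ratio = 1 - len(compressed) / len(raw)
    print(f"{len(raw):,} -> {len(compressed):,} bytes ({ratio:.0%} compression)")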

JavaScript

Original files: 184,001 bytes (uncompressed)
After Optimisation: 183,566 bytes (uncompressed)
Saving: 435 bytes (uncompressed), or roughly 130 bytes at ~70% compression.

CSS

Original files: 45,511 bytes (uncompressed)
After Optimisation: 45,004 bytes (uncompressed)
Saving: 507 bytes (uncompressed), or approximately 127 bytes compressed at a 75% compression ratio.
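
Both compressed-saving estimates are simply the uncompressed saving scaled by the assumed compression ratio:

    # Multiply the uncompressed saving by the fraction of bytes that
    # survives compression.
    js_saving  = 435 * 0.30    # ~70% compression leaves ~30% of the bytes
    css_saving = 507 * 0.25    # 75% compression leaves 25% of the bytes
    print(f"JavaScript: ~{js_saving:.0f} bytes, CSS: ~{css_saving:.0f} bytes")
    # -> JavaScript: ~130 bytes, CSS: ~127 bytes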

HTTP Requests: 16

Overall there are four JavaScript files, three stylesheets, seven images, one favicon and the page itself. Optimisation reduced the CSS and JavaScript down to one file each, a saving of 5 HTTP requests.
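
The combining itself needs nothing fancy. A sketch of the kind of concatenation step involved, with placeholder file names (a real build would also minify the result):

    # Fold several stylesheets (or scripts) into a single file,
    # cutting one HTTP request per file removed.
    stylesheets = ["base.css", "layout.css", "widgets.css"]

    with open("combined.css", "w", encoding="utf-8") as out:
        for name in stylesheets:
            with open(name, encoding="utf-8") as src:
                out.write(f"/* {name} */\n")   # mark where each file starts
                out.write(src.read())
                out.write("\n")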

Images/Media: 7

The images within the documents are well compressed. Only one image could be further compressed (without a loss of quality), saving 33 bytes.
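
For the curious, a lossless re-save pass of the kind that turns up these last few bytes might look like the sketch below, using the Pillow (PIL) imaging library and hypothetical paths:

    import os
    from PIL import Image   # assumes Pillow is installed

    src_dir, out_dir = "images", "images_optimised"   # hypothetical directories
    os.makedirs(out_dir, exist_ok=True)

    for name in os.listdir(src_dir):
        if name.lower().endswith(".png"):
            src = os.path.join(src_dir, name)
            dst = os.path.join(out_dir, name)
            # Re-encode the PNG with Pillow's lossless optimize pass
            Image.open(src).save(dst, optimize=True)
            print(name, os.path.getsize(src), "->", os.path.getsize(dst), "bytes")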

Conclusion

As you can see, even on a well-optimised site there is usually some scope to squeeze out more performance. We tried to err on the side of caution with the compression-ratio percentages (the uncompressed byte savings are exact). We also haven’t looked at optimising by changing the page code: we purely took what was there and optimised it, no monkey business. There is further scope to wring even more speed out of the page, but that is beyond what we wanted to show here. Over the coming posts we’ll be reviewing other popular sites to see if we can find similar performance gains.
