
Frontend Engineering

From MIKE2.0 Methodology




FrontEnd Engineering refers to an approach to improving web site performance by changing the way that the web server and the browser interact. Much of the primary research and documentation was carried out at Yahoo, and the basic rules are described on their Exceptional Performance web page.

In their research Yahoo found that on many web sites only 10-20% of the end user page load time is spent downloading the HTML page itself; the other 80-90% is spent making the HTTP requests that pull in the CSS, javascript and images referenced from the HTML. Therefore, when trying to improve the end user experience, start by focusing on the way the server delivers content to the browser, and worry less about the very complex back-end performance engineering that takes more effort and potentially delivers less benefit.

FrontEnd Engineering Rules

There are 14 rules:

1. Make Fewer HTTP Requests

This can have one of the greatest effects on page load time. Techniques to achieve this include using image maps for navigation bars rather than individual images, and CSS sprites for all those little icons and curved boxes.
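As an illustration of the sprite technique, the fragment below serves two 16x16 icons from one combined image so they cost a single HTTP request; the file name and coordinates are made up for the example.

```css
/* icons.png is a hypothetical image holding both icons side by side */
.icon-home,
.icon-search {
  background-image: url(/images/icons.png); /* one request covers every icon */
  background-repeat: no-repeat;
  width: 16px;
  height: 16px;
}
.icon-home   { background-position: 0 0; }      /* left-hand 16x16 region */
.icon-search { background-position: -16px 0; }  /* next 16x16 region along */
```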

2. Use a Content Delivery Network

For very large web sites, moving content closer to the end user can reduce response time.

3. Add an Expires Header

If the server doesn't explicitly tell the web browser when an individual piece of content (html, javascript, css, images etc.) becomes stale, there will typically be a request back to the server every time that element is included in a web page. Whilst this may only result in the server responding with a 304 (Not Modified), it still impacts performance. By putting an expiry date in the future the browser doesn't bother asking the server for the content at all. This not only improves browser performance but reduces bandwidth requirements and server load. However, it can cause some issues with regularly updated content because the visitor (or a proxy server) may have already cached a stale copy.
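A response carrying a far-future expiry looks something like this (the date and max-age shown are illustrative, not the site's actual settings):

```
HTTP/1.1 200 OK
Content-Type: text/css
Expires: Mon, 09 Jun 2008 10:00:00 GMT
Cache-Control: max-age=2419200
```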

4. Compress Content

Compressing the content using GZIP reduces the total weight of the page, and improves performance because there is less data to transfer across the network.

5. Put Stylesheets at the Top

The browser requests included elements such as CSS, images and javascript in the order that it comes across them in the HTML. By putting CSS at the top it is downloaded first, which allows the browser to progressively render the page.

6. Put Scripts at the Bottom

When a script is discovered in the HTML, the browser stops making any further requests to the server until the script has downloaded. This is because the javascript could rewrite the html and therefore change what needs to be downloaded. If there are a lot of images in the html and the script is included at the top of the page, the images won't start to load until the javascript has completely downloaded.
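Rules 5 and 6 together suggest a page skeleton like the one below (the file names are placeholders):

```html
<html>
  <head>
    <!-- stylesheets first, so the browser can render progressively -->
    <link rel="stylesheet" type="text/css" href="/skins/common/main.css">
  </head>
  <body>
    <img src="/images/logo.png" alt="logo">
    <!-- ...page content and images load without being blocked... -->
    <!-- scripts last, so they don't hold up the images above -->
    <script type="text/javascript" src="/skins/common/site.js"></script>
  </body>
</html>
```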

7. Avoid CSS Expressions

CSS expressions can be used to overcome some limitations in browsers. However, they are re-evaluated almost continuously because they are automatically tied to event handlers that fire when, for example, the visitor moves their mouse across the page. In many cases the same functionality can be achieved in javascript.
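For example, this IE-only expression (the selector and widths are invented) is re-run on events as frequent as mouse movement, where a one-off javascript resize handler would do the same job far more cheaply:

```css
/* evaluated on every qualifying event, potentially thousands of times */
#sidebar {
  width: expression(document.body.clientWidth > 800 ? "200px" : "100px");
}
```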

8. Make Javascript and CSS External

If you think your web visitors are going to view more than one page, putting the javascript and css into external files will improve overall performance. If the javascript or css is embedded in the html it is downloaded on every page view. When it is put into an external file the download time for the first page will be slightly higher, but after that the css and javascript will be in the browser cache and won't be downloaded again.

9. Reduce DNS Lookups

Every additional domain name referenced by a page costs a DNS lookup of between 20 and 120 milliseconds to find the IP address.

10. Minify Javascript

Minification of javascript refers to the removal of unneeded white space and comments, which reduces the download size. It does have the downside of making the javascript a lot harder to maintain and debug, so it's common to maintain both a minified and a debug version. To gain even more benefit the javascript can also be obfuscated; this not only removes white space and comments but also shortens variable and function names to reduce the size even further.
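The toy function below shows where the saving comes from; real minifiers such as JSMin handle strings, regular expressions and semicolons properly, so treat this purely as a sketch:

```javascript
// Naive minifier: strip comments, collapse whitespace. Not safe for
// production code (it would mangle '//' inside string literals).
function naiveMinify(src) {
  return src
    .replace(/\/\*[\s\S]*?\*\//g, '') // drop /* block comments */
    .replace(/\/\/[^\n]*/g, '')       // drop // line comments
    .replace(/\s+/g, ' ')             // collapse runs of whitespace
    .trim();
}

const source = `
// compute the total width of the navigation bar
function navWidth(items) {
  /* each item is 80px wide plus a 10px gap */
  return items * 80 + (items - 1) * 10;
}
`;
console.log(source.length, '->', naiveMinify(source).length);
```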

11. Avoid Redirects

Every time the server has to tell the browser that the item it has requested has moved, the result is an extra round trip between browser and server. The most common cause of this is a missing trailing slash on a URL: if the browser asks the server for the URL without the trailing slash, the server responds with a 301 redirect to the version with the slash.
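The exchange looks like this on the wire (host and path are illustrative):

```
GET /wiki HTTP/1.1
Host: www.example.com

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/wiki/
```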

12. Remove Duplicate Scripts

This impacts performance in two ways: the script is likely to be downloaded twice, and it will also be interpreted twice.

13. Configure ETags

Entity Tags (ETags) are a unique id sent to the browser for each piece of content, and are used as a signature on further requests to check whether the element has changed. This works well in a single web server environment because the browser and server can compare the signature and know that the piece of content hasn't changed. However, on both Apache and IIS the signature includes some server-specific details (such as the inode on Apache), so in a multiple web server environment, if the browser requests exactly the same piece of content but it is served from a different web server, it will carry a different ETag. The server will then think it's new content and send it back to the browser in full.
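On Apache the usual fix is either to drop ETags altogether or to build them only from attributes that are the same on every server in the farm; a sketch (check it against your own setup before applying):

```apache
# Build the ETag from modification time and size only, leaving out the
# per-server inode, so every server in the farm produces the same tag...
FileETag MTime Size

# ...or remove ETags entirely and rely on Expires/Last-Modified instead:
# FileETag None
```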

14. Cache AJAX

AJAX requests should be considered in exactly the same way as html: the response can be cached, provided the content isn't always changing.

Example of Applying the Rules

The web site has grown in an organic manner, with functionality added to support different features of the Wiki, bookmarking and blogging. As it has grown the weight of the page has grown as each wiki plug-in adds a bit more javascript or css. We're going to start applying the YSlow rules to see what impact they have on a MediaWiki install.

Before Starting

So we can see the progress at various stages, we'll start with a baseline graph of the page weight. As our guinea pig we'll use the Mike 2.0 Methodology page, which is the most popular page on the web site.
Screenshot from YSlow plugin of Mike 2.0 Methodology page stats on 12 May 2008

The left hand side of the screenshot shows the stats for somebody who has never visited the page before and doesn't have any of its elements in the cache. At 296Kb the full page weight is quite large, and on a broadband connection it will probably take about 4-6 seconds to download and render. The 57 HTTP requests mean there are also quite a lot of external stylesheets, javascript files and images being downloaded to make up the page.

The right hand table and pie chart demonstrate what happens when a visitor views the page a second time. The number of items the browser wants to check or download hasn't changed significantly at 56 (the creative commons licence image is served with an expires date in 2010, so the browser doesn't bother asking the server for it - see rule 3). The total page download is quite a bit smaller at 21.2Kb, because the web server has told the browser that most of the content is unchanged, serving an http status code of 304. You can see this most clearly in the css images: twenty three images were checked, none of them had changed, so the download size is 0.

YSlow also has a component that summarises how the page does in terms of applying each of the front end engineering rules. On the 12th May 2008 YSlow rated the Mike2 Methodology page as an F grade overall, with a total mark of 36 out of 100. For each of the rules, here's the grading table.

YSlow Grade
 1. Make Fewer HTTP Requests      F    10 external javascript files, 10 external stylesheets and 23 CSS background images.
 2. Use a CDN                     F    For the time being we're unlikely to benefit from a CDN, so we'll ignore this recommendation.
 3. Add an expires header         F    With the exception of one image file, none of the page elements has an expiry date.
 4. GZip components               F    There is some limited gzip'ing of the page, but still plenty of benefit from further configuration.
 5. Put CSS at the top            B    The majority of the external stylesheets are in the <head> section of the page; just one is buried in the html.
 6. Put javascript at the bottom  D    Mediawiki extensions are by default placed in the <head>. If anybody knows how to change this behaviour we'll move them to the footer.
 7. Avoid CSS expressions         B
 8. Make CSS/javascript external  n/a  There is a mixture on the page.
 9. Reduce DNS lookups            A    Most of the content is served from one domain, with 1 or 2 files served from other domains.
10. Minify JS                     F    None of the javascript is minified.
11. Avoid redirects               A
12. Remove duplicate scripts      A
13. Configure ETags               F    YSlow rates this as an F because it assumes that the content is coming from a server farm.

Applying the Rules

The first rules we're going to apply are rule 3 (adding expiry headers) and rule 4 (compressing content). These will be applied by configuring mod_deflate and mod_expires in Apache to compress any html, css and javascript. CSS and javascript will be set with an expiry date 28 days in the future. Additionally, images that are referenced from the css, i.e. related to the skin and not the content, will also have an expiry 28 days in the future.
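The configuration described above would look roughly like this in Apache 2.x (the MIME types and matching are assumptions, since the site's actual httpd.conf isn't published):

```apache
<IfModule mod_deflate.c>
  # compress html, css and javascript on the way out (rule 4)
  AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>

<IfModule mod_expires.c>
  ExpiresActive On
  # css, javascript and skin images become stale 28 days after first access (rule 3)
  ExpiresByType text/css                 "access plus 28 days"
  ExpiresByType application/x-javascript "access plus 28 days"
  ExpiresByType image/png                "access plus 28 days"
  ExpiresByType image/gif                "access plus 28 days"
</IfModule>
```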

Mike2 Methodology YSlow Screenshot after applying Rules 3 & 4

After applying these rules the most noticeable changes are:

  • the total page download is reduced by 97Kb for the first page view.
  • for subsequent page views the page download size hasn't changed much, but the number of server requests needed to confirm this has been reduced from 56 to 6.
  • the YSlow page mark has also improved to 57.

Using the Rules

The easiest way to start exploring the rules is to use the YSlow plugin for Firefox, which requires the Firebug plugin. You can also try AOL's open source Pagetest tool.

Finding Out More

Yahoo's Exceptional Performance pages, where the rules are documented in full, and the YSlow and Pagetest tools mentioned above are all useful places to start finding out more.
