Optimizing your code for speed (sustainability) – the big list

Here’s a big list of code optimizers I’ve collected for reducing the download size of web pages. The most interesting feature, for me, was the use of PHP scripts to dynamically shrink and/or compress the page with gzip or deflate before download. There’s more to optimization than choosing the right library!

The main areas a typical web designer/developer could optimize at a technical level include:

  1. Minify CSS and JavaScript files
  2. Optimize images
  3. Combine multiple JS and CSS files into one long document (on the server before download)
  4. Optimize for HTTP requests
  5. Optimize on the server
  6. Space out downloads in time
  7. Remove “one size fits all” from boilerplate, and tailor boilerplate to the specific project

Ask yourself:

  • How many of the list members below am I familiar with?
  • How many of these do we use in our current web design and development?

The problem with many of the programs listed below is that they require you to work at the command line. I personally like the command line, but the reality is that many designers these days don’t know a thing about it. How many Macintosh-based web designers have ever used their Terminal window for anything?

We could “demand” that they learn, but the reality at the team and workflow level is that they won’t. Even if they just design and hand it off to a programmer or website engineer, the design will be less than ideal for the web medium. For that reason, there are a huge number of medium and small sites out there that are not optimized. It’s a question of the end result, versus what people “should” do.

If such a naive developer wants to optimize, they are usually met with a curt message about getting on Ubuntu. In a sustainable design team, each member has to understand a significant part of what the other members do – to ensure that “sustainability” is not just local optimization. The web has art directors, designers, and programmers all trying to work together. Decisions made in design and art direction feed back into optimization, often in a negative way if the means used to optimize are cryptic.

So the long-term, industry-wide issue for Sustainable Virtual Design is: get everyone using these optimizing tools. It may take changes in interfaces, canned solutions incorporated into popular “boilerplate” downloads, or modules added to popular code editors. Getting more people to use these tools is important – it is not just an issue for site engineers at the 100 top websites by traffic.

With that said, here’s the list of low-level code optimizers and build tools. Some of these will find their way into http://www.greenboilerplate.com, but I haven’t figured out which ones, or how to automate them for “the rest of us” yet. Clearly, we need some kind of “build” step, to avoid just making a one-size-fits-all boilerplate like we see in HTML5 Boilerplate (Ant) or Blueprint (Ruby).

The trick is:

  • Making it easy enough that those other than hardcore developers actually use it
  • Making sure everyone on the design team can understand the optimization choices at a high level
  • Summing those choices into some sort of “energy star” rating for the website build

One might argue that we should “insist” that web design teams hire coders who can apply this stuff perfectly – but we need to look at the larger system. We want to maximize the amount of optimization across the entire web. Pushing out boilerplate that allows some optimization is better than a situation where there are a few highly optimized sites and a large number of total “bloat” websites.

General Optimization Theory

Stats on optimization over the history of the web – excellent!

Advanced mobile optimization

How to make your mobile website fast – Meetup Group

Pre-Render the DOM for lousy browsers (Progressive Enhancement)

JSDOM – pre-renders on the server (DOM1) using Node.js

Pre-compilers – JavaScript

JSMake Preprocessor (loads files, gets dependency order correct)

Juicer – figures out which files depend on each other and merges them together, reducing the number of HTTP requests per page view and thus improving performance

CoffeeScript – makes JS look like a standard class-based language. This is “iffy”, since an obsession with making JavaScript look like Java probably contributes to over-elaborate libraries and coding constructs. We may not be adding virtual-function overhead, but the amount of code needed to express something in class-based OOP versus JS prototyping may be significant.

Pre-compilers – CSS

These tools let us write CSS using branching, which makes for more efficient programming. The results are then compiled out to plain CSS, reducing duplicate rules and clumsy style cascades. However, these compiler-like systems, as far as I can tell on discussion boards, don’t try to do what a “real” compiler would do – they don’t try to rewrite selectors into simpler and faster versions.

For example, div.myclass is slower than .myclass, since selectors are evaluated RIGHT TO LEFT in CSS. But SASS and LESS don’t try to do these optimizations (as far as I can tell). However, their use of conditionals can streamline and organize the CSS development process.

SASS – dynamic stylesheet language for CSS, compiles to CSS

LESS – dynamic stylesheet language for CSS, compiles to CSS
Optimizing by using Small JS libraries

Too many developers these days automatically grab a big, one-size-fits-all library, often loading additional modules on top of the default. This sort of thing has caused CMS systems like Joomla and WordPress to get big and clumsy, and is probably the reason the average amount of JavaScript code per page on the Internet has doubled in the last couple of years. Design teams should confirm that they really need the functionality of a large library. Often, large libraries have much smaller ancestors. If you just need DOM selection by styles, jQuery’s ancestor Sizzle is only about 2K, versus at least 30K for minified jQuery.

Microjs Libraries – tiny libraries that do the work of big frameworks

General site builders

These command-line tools are familiar to those who spend their days looking at a terminal window on their Ubuntu workstation. There’s lots of potential for these tools to optimize websites and web apps, but the restriction to the command line automatically locks out a large number of people who work daily in HTML, CSS, and JavaScript but didn’t grow up on Unix. Hopefully, a way can be found for these tools to be more widely used. I bet that most people using HTML5 Boilerplate don’t even realize that custom builds are possible using Ant (I sure didn’t).




Sprockets (Ruby)

General code Minify

This is pretty techy stuff, but site developers and sophisticated designers should look at these sites to understand the state of Web Performance Optimization (WPO) and how they can incorporate similar ideas into their projects.

Google “Let’s Make the Web Faster”

Web Performance Best Practices

Google Minify – PHP5 script that combines, minifies, and caches JavaScript and CSS files

W3Compiler (commercial, Windows)

Compare CSS shrink tools (CSSTidy won)

JavaScript Compressor

Clean CSS – CSS formatter and optimizer


Media Minify

There’s no excuse for this one. Even Art Directors terrified of one line of code popping up on their Illustrator display should know and follow best practice for optimization. Why Art Direction? The reason is that sustainability requires Systems Theory thinking – you’re not allowed to hide in your office doing one thing. Instead, you have to understand your neighbors and the context of your work in the larger project. A visual designer who ignores code is likely to make designs that are hard to implement in code. If they then force the issue with a programmer or front-end developer, the site becomes more complex and bloated simply to represent their myopic vision. Art Directors, we find you guilty!

Image Optimizer

Yahoo Smush-It!

Bjango – test Photoshop files on iOS

Subsetter – reduce the file size of webfont files (removes unneeded glyphs)

Sprite Generators

SpriteMe – make CSS sprites

CSS Sprite generator

CSS Sprite generator

JavaScript Minify

These days, most JS libraries are compressed to reduce file size. Custom libraries should be compressed as well, leading to three stages of web development – assets, development, and a production “burn” that doesn’t get edited. One place where there is work to do is JS inline in HTML – it should be compressed, but often isn’t.

Google JS Closure Compiler

Older JSMin (may not be the most efficient)

Dojo Shrinksafe

JavaScript Minify online

Uglify.js Online

Uglify.js download


JavaScript Minify in Visual Studio

Visual studio – JS Minify

Visual Studio – minify with T4 Template in Visual Studio
JavaScript and CSS squished together

YUI Compressor for JS

Smaller – Mac OSX front-end for YUI Compressor
JavaScript and CSS – dynamic online minify

reducisaurus – a web service for minifying JavaScript and CSS

Minifier – CSS and JS combiner (merges them into one minified file)

Content Delivery Network (CDN)-based minify and optimize

These tools have visitors download files like jQuery from the Google or Yahoo! network. There are several reasons to do this. For one, if everyone uses the CDN, chances are that when someone gets to your page, they’ve already got jQuery cached.

Google Page Speed Service
Not to be confused with the performance monitor Google Page Speed. This is a content delivery network that (1) fetches scripts from your website and stores them at Google, (2) rewrites them for efficiency, and (3) transparently serves the scripts from the Google domain when people go to YOUR web page. To do it, you have to make some adjustments to your DNS records, in particular pointing a DNS CNAME entry to ghs.google.com. Not everyone can do this – you have to request the service from Google.

Yottaa
This seems to be a commercial version of the same kind of service. You add a CNAME entry to the DNS records for your website, and Yottaa caches and handles the delivery of your scripts.



Deferred loading in JavaScript
When you load a web page, your JavaScript gets parsed and executed before rendering can finish. Putting scripts at the bottom of the page helps, but does not completely eliminate the problem.

Use DOM methods to load JavaScript
Here, you add a ‘bootstrap’ function inline in the <head> of your HTML page. The function uses document.createElement() to add the script dynamically to your web page. Then, you add an addEventListener and/or window.attachEvent for the “onload” or “load” event. I suppose you could do it with the older window.onload = function() method as well. Then, when the onload event fires, you run the inline script. Simple. You can load the script anytime you need it. A lot of the cross-site widgets from a few years ago used this method.
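A minimal sketch of that bootstrap pattern, with the script URL as a placeholder (this is the generic technique, not any specific widget’s code):

```javascript
// Inject a script tag only after the page has loaded, so parsing the
// script never blocks the initial render. The URL is a placeholder.
function loadScriptOnLoad(src) {
  function inject() {
    var s = document.createElement('script');
    s.src = src;
    document.getElementsByTagName('head')[0].appendChild(s);
  }
  if (window.addEventListener) {
    window.addEventListener('load', inject, false);
  } else if (window.attachEvent) {   // older IE
    window.attachEvent('onload', inject);
  } else {
    window.onload = inject;          // last-resort fallback
  }
}
```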

You can also load the script dynamically with document.createElement, and set its .async property to true. That makes the script download in parallel without blocking the parser – it executes as soon as it arrives, rather than in document order. In that case, you’re just making sure the script download doesn’t block the page render.
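The async variant is even shorter; again, the URL is just a placeholder:

```javascript
// Inject a script with the async flag set: the browser fetches it in
// parallel and runs it as soon as it arrives, without blocking parsing.
function loadScriptAsync(src) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true; // don't block the parser while downloading
  document.getElementsByTagName('head')[0].appendChild(s);
}
```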


Head.js – load scripts in parallel, defer execution, add CSS classes for old browsers and screen sizes
This one’s an interesting find. It seems to do a lot of what Modernizr does, but you can just use the script loader, which seems to be the main point. It can add a few useful classes for old browsers and (interestingly) screen sizes. It might be an alternative to CSS Media Queries, but I’m guessing the latter is much faster.

Lots of feature detection, plus a script-loading module (which I find a little cryptic). It would be interesting to compare its performance to head.js above.

PHP Compilers (who knew?)

PHP compiler – converts PHP source into an optimized executable. Also allows concatenation of PHP files. The resulting executables run under Unix.

HipHop PHP – converts PHP source into Unix executables (used by Facebook)

There are other PHP compilers that build .exe files, but they all seem to run on Windows only.

Reduce HTTP Requests in Visual Studio

Reducing HTTP requests is one of the most important things you can do to decrease your web download’s footprint, according to Steve Souders, author of the great “High Performance Websites” (O’Reilly). Combining files is a start, but more can be done, as is shown by these links.

Visual Studio – reduce HTTP Requests

Request Reduce – reduces HTTP Requests
Compress HTTP responses with gzip or deflate

HTTP compression

Apache HTTP compression – good discussion of tradeoff of compression vs. cpu cycles

Setting up compression in the .htaccess file – “…But…On the other hand, if your Website has a lot of traffic, deflate is actually a better option because it requires much less CPU to compress files…”

GZip versus deflate on HTTP compression

ZipRoxy – compress HTTP traffic
Minify at PHP server level

According to Souders, most of the work (~85%) should be done at the client-side level  – HTML, CSS, and JavaScript. However, there is room for optimization at the server level. In addition, there are PHP-based utilities that help you shrink the client-side download.

Google PHP Minify

Google jsshrink – Minify JavaScript via PHP

CSS Crush – PHP minify, also adds vendor prefixes

CSSTidy – minify CSS files

JSMin – PHP and JS Minify

Using JSMin

Reduce HTTP page requests with downloadable PHP script (doesn’t minify)

A simple, downloadable script to Minify CSS and JS

Using PHP to merge and minify CSS and JavaScript

Using PHP to Minify and optimize – big library

PHP flush() statement
It allows you to send your partially ready HTML response to the browser so that the browser can start fetching components while your backend is busy with the rest of the HTML page. The benefit is mainly seen on busy backends or light frontends.
Server-side browser sniffing with PHP

WURFL browser detection (Commercial server-side sniffing)
General optimization after development

In the long term, sustainability will increase when we have common techniques for monitoring the carbon footprint of websites. We can try to enhance energy efficiency and download times, but the proof that our methods mattered is in the actual download. The links below describe tools that allow this kind of monitoring. Over the long run, use of such tools will allow us to develop a “sustainability scorecard” for particular optimization techniques – which ones work best, which ones have “crosstalk” with others, which are in widespread use, and so on.

Once again, web designers who have ignored these tools need to incorporate them into the design process at an early stage. Designing a site and then trying to optimize it is like designing a Hummer, then demanding that engineers make it get 50 miles to the gallon. Hybrid designer/programmers and communication between designer and programmer are essential for Sustainable Web Design.

Console object (in all modern browsers, allows analysis of the download process)

  • Google Chrome – right-click to “inspect element”
  • Internet Explorer – press F12
  • Firefox – download Firebug plugin
  • Safari – go to “Preferences”, toggle “developer tools” on, and use the menu that appears
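The raw numbers behind those consoles are also available to your own scripts via the Navigation Timing API (performance.timing), so a page can report its own download milestones. A minimal sketch – the breakdown chosen here is my own, not any particular tool’s:

```javascript
// Summarize page-load phases from Navigation Timing data.
// With no argument, reads the browser's live performance.timing object;
// a timing object can also be passed in explicitly.
function pageLoadStats(t) {
  t = t || performance.timing;
  return {
    dns: t.domainLookupEnd - t.domainLookupStart,  // DNS lookup
    connect: t.connectEnd - t.connectStart,        // TCP connect
    response: t.responseEnd - t.requestStart,      // request + response
    total: t.loadEventEnd - t.navigationStart      // whole page load
  };
}
```

All values are in milliseconds; call it after the load event so loadEventEnd is populated.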

HTTP Watch (very comprehensive, free and commercial versions)

Browserscope – crowdsourced browser performance, plus HTML5 readiness. Run the test on your own browser and submit the data to Browserscope. The crowdsourcing makes its browser performance stats very accurate.
Page download times (as a whole, and for individual page components)

Yahoo! YSlow

Yahoo! YSlow for Node.js

YSlow Rule base

Google Page Speed

Pingdom Tools

WebWait (online)


JSPerf – benchmark scripts

HTTP Compression (make sure your server is gzipping or deflating your content)

Page download monitored from multiple locations

UPtrends (Commercial, monitoring from multiple locations)

ShowSlow – ongoing monitoring of website performance
Test HTTP request for errors

HTTP, loads, redirects

Run HTTP load tests
Test for HTTP Compression being invoked

Compression check

Verify HTTP Compression is working

Cross-Browser Testing

Sauce Labs – cloud-based automated browser testing

Flash/AS3 Optimization

Our old friend Flash, currently “down for the count”, but still very popular, especially for commercial banner ads. Due to its widespread use, optimizing Flash is important, even if you aren’t part of the emotional debate about whether Flash is an “energy hog”.

Minimizing CPU cycles stolen by Adobe Flash
