There’s a lot of concern about web page efficiency these days, and some technical breakthroughs (e.g. HTML5) are being partly justified by the degree to which they improve resource utilization on the web. But is the drive for efficiency paradoxically resulting in increased resource use?
Enter Jevons’ Law. Concern about sustainability and energy use, even “peak energy”, is not new. In 1864, William Stanley Jevons published a book on the depletion of coal (http://oll.libertyfund.org/index.php?option=com_staticxt&staticfile=show.php%3Ftitle=317&Itemid=27) – a sort of ‘peak coal’ theory, which in fact did predict the year of maximum coal use in the UK. In it, Jevons noticed that major increases in the efficiency of coal-burning steam engines did not result in reduced coal use. Rather, the efficiency increase lowered the price of ‘steam energy’, and more companies and individuals used it. The end result was a net increase in coal use, rather than a decrease.
In other words, increases in energy/resource efficiency result in increased depletion of the same energy resources.
In an article on The Oil Drum, Lionel Orford considered a formal definition of “Jevons’ Law”, or Jevons’ Paradox. Summarized, it is:
“…as the efficiency of a type of machinery is improved, it becomes profitable for many more customers and feasible to apply it to new applications. This results in rapid growth of the number of machines in use and consequently, an increase in fuel consumption overall…”
According to Orford, Jevons’ Law/Paradox is holding up quite well late in the Oil Age. For example, modern fan jet engines are much more efficient than the original turbojets used on commercial airliners. However, this increase in efficiency has brought greater, rather than lesser, air traffic, and the result has been an increase – not a decrease, or even slower growth – in consumption of jet fuel.
A similar case can be observed in personal computers. As “Moore’s Law” has operated, there has been an increase in computer speed – and a reduction in the energy needed to process each bit of information. The result was not lowered electricity consumption. Instead, personal computers are everywhere, their hyper-efficient CPUs grinding away and burning up a significant fraction of the world’s total electricity.
There are lots of tools out there for reducing download times of web pages. Google Code has a group, Let’s Make the Web Faster, that features a discussion group on the topic, and tools that can be installed in Chrome.
- Let’s Make the Web Faster – http://code.google.com/speed/index.html
- Google’s (free) caching service to increase speed – http://code.google.com/speed/pss/
- Yahoo! YSlow – http://developer.yahoo.com/yslow/
- Yahoo! Exceptional Performance Team – http://tech.groups.yahoo.com/group/exceptional-performance/
- Browsermob – https://browsermob.com/website-monitoring (uses browser activity, not just HTTP requests)
- Pingdom – http://www.pingdom.com
- Royal Pingdom Blog – http://royal.pingdom.com/
- ShowSlow – http://www.showslow.com/ (combines YSlow, Page Speed and dynaTrace)
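At bottom, every tool in the list above is measuring the same two quantities: bytes transferred and time to load. A minimal sketch of that measurement, using only the Python standard library and a throwaway in-process server standing in for a real site (the 100 KB page body is invented for illustration):

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A throwaway local page standing in for a real site.
PAGE = b"<html><body>" + b"x" * 100_000 + b"</body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):  # silence per-request logging
        pass

def measure(url):
    """Return (seconds, bytes) for a full download of url."""
    start = time.perf_counter()
    body = urlopen(url).read()
    return time.perf_counter() - start, len(body)

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

elapsed, size = measure(f"http://127.0.0.1:{server.server_port}/")
print(f"{size} bytes in {elapsed:.4f} s")
server.shutdown()
```

The hosted services go much further – waterfall charts, per-asset breakdowns, browser-level rendering time – but the raw numbers they aggregate start here.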
The best recent (2010+) historical site seems to be HTTP Archive at http://www.httparchive.org, built from Webpagetest (http://www.webpagetest.org/). According to a summary article on Pingdom, new technology has resulted in a roughly 25% increase in the size of web downloads.
According to an article on the Google Code site (http://code.google.com/speed/articles/web-metrics.html), the average page is now about 1/3 of a megabyte, and uses about 10 HTTP requests to load. In just a few years, the web will be super-sized like SUVs and Costco shoppers!
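That “10 HTTP requests” figure counts the HTML document itself plus the assets it references – images, scripts, stylesheets. A rough way to estimate the count for a page you already have as text, using only Python’s standard library (the markup below is invented for illustration):

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Count the extra HTTP requests a page implies: images,
    scripts, and stylesheets referenced from the HTML."""

    def __init__(self):
        super().__init__()
        self.requests = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and "src" in attrs:
            self.requests += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.requests += 1

# A made-up page snippet for illustration.
html = """
<html><head>
  <link rel="stylesheet" href="style.css">
  <script src="app.js"></script>
</head><body>
  <img src="a.png"><img src="b.png">
</body></html>
"""

counter = RequestCounter()
counter.feed(html)
# 1 request for the HTML itself, plus one per referenced asset
print(f"approx. HTTP requests: {1 + counter.requests}")  # prints: approx. HTTP requests: 5
```

This undercounts real pages, of course – CSS background images, fonts, Ajax calls, and ads all add requests that never appear as tags in the HTML – which is exactly why the averages keep climbing.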
Here’s a link to “Web Pages are Getting More Bloated” – http://royal.pingdom.com/2011/11/21/web-pages-getting-bloated-here-is-why/
Historical information going further back would be interesting – the web is far older than these archives reach. Is this growth a result of efficiency? Did faster bandwidth and more efficient clients and servers trigger greater, rather than less, energy and resource use on the web?
A related issue is the perceived speed of a web page versus the object download time. In an article at uie.com (http://www.uie.com/articles/download_time/), another efficiency paradox was discovered when users rated the ‘speed’ of a website. Out of 10 major sites, the fastest to download (about.com) was rated slowest by users. The 10-second rule often cited by web developers and UX designers did not hold – slow sites were rated fast if users felt they accomplished their tasks.
In other words, great UX might use expensive resources on the web, yet create a ‘feeling’ that the web was fast. This, in turn, would encourage UX designers to create high-energy pages that optimize apparent UX.
There’s a possible law here: UX sustainability DOES NOT EQUAL energy sustainability.
However, “lean UX” as a design principle within a development team (e.g. using paper prototypes in agile development instead of a waterfall path that cranks out fully realized prototypes) could easily result in greater sustainability of the web design process itself – and that benefit should be independent of Jevons’ Paradox.