HTML 5 APIs (and other) technology relevant to web sustainability

For the past several months I’ve been developing Green Boilerplate, which has a placeholder site online. The idea is to create a default template for building sites and apps that helps designers and developers embody the principles of Sustainable Virtual Design. Since sustainability is a meta-design framework (similar to Inclusive Design), Green Boilerplate has several components:

1. General principles of sustainability, adapted for the web. These include some general principles adapted from sustainability frameworks in other design professions, as well as a “long list” of specific sustainability principles that can be used as guides during development of a site. These principles can go into mission statements or descriptions of projects.

2. Deliverables based on general sustainability principles, including a checklist similar to Jakob Nielsen’s usability heuristics. These provide a format for communicating principles of Sustainable Virtual Design to clients.

3. Code and markup enhancing sustainability, including a “bootstrap” client-server system. This collection of code libraries provides feature-based polyfills using a client+server strategy, reducing network traffic and CPU cycles. The goal is to let smaller design shops (those without a Site Engineer or Web Performance Optimization (WPO) expert) apply the most important WPO techniques described by the WPO teams at Yahoo and Google.

In exploring the third option, I realized that several of the new APIs associated with HTML5 and the new era of advanced web browsers and servers help create more sustainable websites. I’ve listed the ones of special interest below, with a discussion of why they’re relevant to web sustainability.

First, the basic principles. I list them to remind us that sustainability is more than Web Performance Optimization, or WPO.

1. Make meaningful products – Build websites that have real value, not fashion or tech tricks

2. Easy design rollback – Use an iterative or Agile design workflow

3. Source renewable materials – Switch to a “green” webhost

4. Design products to work in the future – Implement classic design strategies

5. Design with the user in mind – Create effective User Experience (UX)

6. Ensure democratic access – Build accessible, responsive websites; consider the Web Index score for the ISP’s and client’s country

7. Interchangeable parts – Apply standards-based design

8. Minimize energy and resource consumption – Web Performance Optimization (WPO)

9. Don’t corrupt the virtual system – Search Engine Optimization (SEO), the Web Index (Web Foundation)

Next, client and server hardware

Mobile vs. Desktop – The push to mobiles increases sustainability. Mobiles consume a fraction of the power of a desktop. In addition, they are already optimized to “hibernate” and save energy by default. In contrast, many desktops run 24/7 at full power (100-300 watts). If you’re on a mobile, that automatically ups your sustainability score.
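The difference is easy to put in numbers. A quick back-of-the-envelope sketch, using illustrative wattages drawn from the ranges above (the exact figures vary by device and are assumptions here):

```javascript
// Rough energy comparison for a browsing session.
// Wattages are illustrative assumptions: a desktop in the
// 100-300 W range vs. a phone drawing only a few watts.
function sessionEnergyWh(watts, hours) {
  return watts * hours; // watt-hours consumed over the session
}

const desktopWh = sessionEnergyWh(200, 1); // one hour on a 200 W desktop
const phoneWh = sessionEnergyWh(2, 1);     // one hour on a 2 W phone

console.log(`desktop uses ${desktopWh / phoneWh}x the energy of a phone`);
```

Even allowing for wide error bars on the assumed wattages, the mobile comes out roughly two orders of magnitude ahead.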

Cloud rendering enabled – This is an interesting one. Server bits are generally “greener” than those consumed on the client, since many server farms now incorporate “green” techniques. In contrast, the local utility is unlikely to do so. Also, if the server can cache a rendering and send it to many clients, it is more efficient than having thousands or millions of clients go through the full rendering process. This draws on the fact that cloud servers are more energy efficient, as reported by Google in Google Apps: Energy Efficiency in the Cloud.

Cloud computing compared to regular hosting – the energy-efficiency advantages of the cloud over traditional hosting

Browsers like Opera Mini (as opposed to full Opera and Opera Mobile) and Amazon’s Silk browser on the Kindle Fire might get a positive score for this. However, it will be necessary to demonstrate that the server is actually doing a more efficient job before scoring this “up” for sustainability.

Server hardware and OS – PHP and other server-parsed languages often give access to server variables, including features of server hardware and operating system. This information could be used, in theory, to help compute the overall carbon footprint of delivering a web page.

ISP – Knowing the ISP would be extremely valuable. Some ISPs (e.g. AISO.NET) have highly sustainable practices in their web hosting, while others (as detailed in a recent article in The New York Times) are very inefficient. At present, services that let you discover features of the ISP behind a user’s browser or a website’s server are used mostly by those trying to reduce spam or hacking. In the future, these same services might allow calculating the efficiency of a web request based on the green-ness of the ISP and the network distance between client and server.

Network speed and quality – A slow network is definitely a minus for sustainability: the network hardware stays powered for the entire transfer regardless of the size of the downloaded page, so a slow connection burns more energy per page. We can get an idea of network quality and speed by monitoring access to ISPs and compiling the results in a database. Some services (e.g. MaxMind’s GeoIP service) provide information about network speed with their server-side geolocation. More sophisticated results come from Ookla, which has a commercial database of ISP speeds and line quality. They also offer free and commercial versions of their speed-measuring servers, which could be used to document the quality of your site’s connection to its audience. Finally, they supply a snapshot of world Internet connection speeds on their Net Index site:

Ookla Net Index – a world map of Internet speeds by country, ranging from very fast connections in Japan and Northern Europe to slow connections in Africa and India

This information could be combined, if the data on ISPs is extensive enough, to generate a rough carbon footprint contributed by the network.
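A minimal sketch of such a combination, stressing that every constant here is a hypothetical placeholder rather than a measured value:

```javascript
// Sketch: combine page weight, connection speed, and an assumed
// network energy intensity into a rough footprint estimate.
// All coefficients are hypothetical placeholders, not real data.
function transferFootprintGrams(pageKB, kbps, wattsWhileActive, gramsCO2PerWh) {
  const seconds = (pageKB * 8) / kbps;            // transfer time on the wire
  const wattHours = wattsWhileActive * (seconds / 3600);
  return wattHours * gramsCO2PerWh;               // grams of CO2 for this transfer
}

// The same 500 KB page over a slow vs. fast link: the slow link
// keeps the hardware powered longer, so its footprint is larger.
const slow = transferFootprintGrams(500, 500, 10, 500);
const fast = transferFootprintGrams(500, 10000, 10, 500);
console.log(slow > fast); // slower network, bigger footprint
```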

At present, figuring out ISP and network features requires “sniffing” the user-agent of the browser, querying the server (using utilities like PHP’s php_uname() function), or using a server-side geolocation service like MaxMind’s GeoIP, mentioned above. There may be some value in sniffing the equivalent strings for servers as well.

Web Index – This ranking, developed by the World Wide Web Foundation, scores countries on the quality of their web access along both technical and social dimensions. The Web Index includes several factors beyond efficiency, such as democratic access to the Internet and the kinds of services the Internet offers in specific countries. This maps directly onto the access factors of Sustainable Virtual Design. Here’s a shot from their interactive tool showing the quality of web access worldwide:

Web Index interactive tool, showing the quality of web access for each country in the world: better access is blue, lower access is red. Africa has the largest number of countries with a low Web Index.

If we know the ISP supplying the user’s browser and the web server, we can compute a Web Index score for the sustainability of the page load. So getting more freely available data about ISP green-ness, as well as the true locations of web server and browser, is very important.

Browser rendering engine software – Rendering engines for different browsers include WebKit (Safari, Chrome, and many mobiles), Presto (Opera), Gecko (Firefox), and Trident (Internet Explorer). Older mobiles often use proprietary Java-based (and clunky) rendering. These rendering engines have been getting more efficient over time, but the gains may be overcome (Jevons’ Paradox) by the rise in page complexity.
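A boilerplate that scores engines needs to identify them, which today means user-agent sniffing. A minimal sketch (real-world UA strings are messier; Chrome, for instance, reports “like Gecko”, so the checks run most-specific-first):

```javascript
// Minimal user-agent sniff for the rendering engine families
// discussed above. Order matters: WebKit browsers claim
// "like Gecko", so WebKit must be tested before Gecko.
function detectEngine(ua) {
  if (/Presto/.test(ua)) return 'Presto';        // Opera
  if (/Trident|MSIE/.test(ua)) return 'Trident'; // Internet Explorer
  if (/WebKit/.test(ua)) return 'WebKit';        // Safari, Chrome, many mobiles
  if (/Gecko\//.test(ua)) return 'Gecko';        // Firefox ("Gecko/<date>")
  return 'unknown';
}

console.log(detectEngine('Mozilla/5.0 (compatible; MSIE 9.0; Trident/5.0)')); // Trident
```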

Microsoft has claimed (and run studies to support it) that Internet Explorer 9 and 10 are more energy-efficient at rendering than Chrome or Firefox. Their results seem compelling, and one would expect IE to be optimized for Windows environments in a way that a third-party browser would not be. So IE (gasp) might get an “up” for sustainability on newer platforms: detect a new version of the Trident rendering engine, score “up” for sustainability.

However, we have to balance this against the millions of copies of old Trident versions in IE6-8. Unlike other browser vendors, Microsoft hasn’t required upgrades, so there are lots of old rendering engines dropping into inefficient “quirks mode” when they slam into pages not designed for their quirkiness. If rendering takes longer and is more complex (as you would expect when the browser is trying to guess how to render your page), it wastes power. If you’re running an old version of IE, that scores “down” for sustainability. Here’s a link to recent stats from The Next Web:

The Next Web, Sept. 2012 browser stats

What you see here is that IE6 still has a significant market share. Rumors of its demise are greatly exaggerated, so developers and designers trying to supply “democratic web access” as part of a Sustainable Virtual Design strategy must provide support. This is even more important for IE8. It’s likely we aren’t even seeing the true picture there, since large numbers of IE8 users operate hidden behind corporate firewalls and on intranets. The typical designer/developer answer (“screw ’em”) can’t apply to sustainable thinking. Deal.

Newer JavaScript engines are very fast (and therefore efficient) relative to even the recent past. It wasn’t long ago that JavaScript ran 10 or even 100 times slower than it does on the best interpreters available today (Google’s V8, Mozilla’s SpiderMonkey, Opera’s Carakan). Efficient code can make these new engines run even faster. IE’s JavaScript performance used to be terrible, but the new Chakra engine in IE9 and IE10 is comparable to (though still slower than) the other browsers’.

This is a double whammy on sustainability if you’re trying to polyfill an old browser using JavaScript – the polyfill libraries will run slowly (and therefore consume more power) on the older JavaScript interpreters, making rendering even more inefficient than might be expected. So all old browsers continue to get a “down” for sustainability. New browsers are an “up”, though there are significant differences between them.

Base64 encoding of images – Base64 encoding allows you to copy pixels directly into markup or CSS files, instead of linking to an external image file. This is an important feature to figure out on the server side (which would require browser sniffing in a sustainable boilerplate). If you can do inline Base64 encoding of images, particularly small ones, you can greatly reduce the number of HTTP requests needed to define a complex web page. Support for inline Base64 encoding is definitely an “up” for sustainability. Base64 CSS background-image styles are even more significant than those in <img> tags, since these days more and more images appear in CSS instead of HTML.
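The encoding step itself is trivial. A sketch in Node (in PHP the equivalent would be base64_encode() on the file contents); the “GIF89a” bytes here are just the file header, standing in for a real tiny image:

```javascript
// Turn a small image's bytes into a data: URI that can be
// inlined in markup or a CSS background-image declaration,
// saving an HTTP request.
function toDataUri(bytes, mimeType) {
  return `data:${mimeType};base64,${Buffer.from(bytes).toString('base64')}`;
}

// The 6-byte "GIF89a" header, standing in for a full tiny image:
const gifBytes = [0x47, 0x49, 0x46, 0x38, 0x39, 0x61];
console.log(toDataUri(gifBytes, 'image/gif')); // data:image/gif;base64,R0lGODlh
```

Note the trade-off: Base64 inflates the byte count by about a third, so it pays off mainly for small images where the per-request overhead dominates.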

Battery/Battery Level API – One of the new APIs proposed for browsers would allow JavaScript to detect the battery level of the device. This way, a script could look at window.navigator.battery and serve simpler web pages when the battery is very low. If it is sensitive enough, measuring battery level can provide some idea of how much juice a browser or web app is consuming. Considering that many apps suck vampire power, this could be significant. In addition, the lack of a battery indicates that the computer is on utility power, a downside for sustainability. One missing feature: “page heal” events.
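A sketch of how a page might act on this, assuming the draft navigator.battery object with its level and charging properties (the 20% threshold and the page-variant names are my own illustrative choices):

```javascript
// Decide which page variant to serve based on battery state.
// A pure decision function (testable anywhere), plus a guarded
// hook into the proposed navigator.battery object, which only
// exists in browsers implementing the draft spec.
function pageVariantFor(level, charging) {
  // Illustrative 20% threshold: low and discharging means a lighter page.
  if (!charging && level < 0.2) return 'lite';
  return 'full';
}

// Browser-only wiring; skipped when run outside a browser.
if (typeof navigator !== 'undefined' && navigator.battery) {
  const b = navigator.battery;
  console.log('serve the', pageVariantFor(b.level, b.charging), 'page');
}
```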

Speech Input API – This feature, supporting speaking rather than typing, enhances usability for people with disabilities. It also favors mobiles, where the keyboard is small or hard to reach, over larger computers (though I don’t know if I want people surfing the web in their cars by barking out orders to the browser).

SVG support (including inline and CSS) – Scalable Vector Graphics (SVG) have been around for a while, but only recently did most browsers support inline rendering of SVG graphics. Using SVG is valuable, since the image is resolution-independent – you send the same number of bits to a standard display, a Retina display, a 1″ cellphone screen, and a 65″ flatscreen TV. This is a huge savings, and SVG is definitely better than creating a set of low- and high-resolution bitmaps for your user interface. SVG inline in CSS shares the same advantages. If we think about the kind of images that get put into CSS, they are typically large, low-complexity objects; a finely detailed background image would steal from, rather than enhance, the web page. There is a double “up” if you can put vector graphics into the background, instead of sending large background images or (gasp) slicing up images in Dreamweaver and making a circa-1999 web page. I know many of you thought that practice was long extinct, but what Graphic Design students are often taught about the web frequently includes a Dreamweaver-sliced, 1500-pixel-wide image lifted straight out of Photoshop.
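As a sketch, here is how a build step could turn an inline SVG shape into a CSS background value; the circle is a stand-in for the kind of large, low-complexity graphic that suits CSS backgrounds, and the same few hundred bytes serve every screen resolution:

```javascript
// Build a resolution-independent CSS background-image value
// from an inline SVG string, using a URI-encoded data: URI.
const svg = '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">' +
            '<circle cx="50" cy="50" r="40" fill="#3a7"/></svg>';

const cssValue = `url("data:image/svg+xml,${encodeURIComponent(svg)}")`;

// Usable as: background-image: <cssValue>;
console.log(cssValue.slice(0, 30)); // url("data:image/svg+xml,%3Csvg
```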

Webfonts support – Webfonts enable “typographic” interfaces, since the designer finally has real control over how text will look on the page. In addition, “dingbat” fonts allow symbols and icons to be incorporated as text. Since webfonts are typically just outline vector graphics, a typographic interface will have the same savings as an SVG-laden page – provided the designer and developer optimize the downloaded font files by stripping out unneeded glyphs.

Webfont glyphs used as background images – We can make background images in <div> and other layer elements on a web page, but at present you can’t put a font glyph, styled as a visual element, into a CSS background-image declaration. Something for the future?

Efficient low-level protocols – SPDY is a protocol designed to reduce latency on the web, in particular to cut page load time roughly in half. It does this with a few compression tricks (e.g. HTTP headers) and some preloading (e.g. server hints about what to download). Support for this or similar protocols, if widespread, is a plus for sustainability. It is a minus if it converts the Internet into a “walled garden” where some users enjoy inherently faster service than others.

There are more aspects of new browser APIs, as well as Internet service APIs that could end up in a sustainable web boilerplate. As I discover them, I’ll feature them here.
