Sustainability and copyright, JavaScript’s dirty dozen, virtual worlds reboot


Some commentary on a few Sustainable Virtual Design topics I’ve been thinking of recently…

Sustainability and Intellectual Property

One “big picture” area where sustainability and current practice in the design industry may conflict is intellectual property. I realized this when reviewing some ideas on how to handle webfonts for improved WPO (Web Performance Optimization) scores.

In theory, we can see where a conflict might arise. One of the mantras of sustainability in all professions and disciplines is “reduce, reuse, recycle.” The second term implies using resources more than once, and sharing them.

So, consider the current scene with webfonts.

Here’s a good article on how you can optimize webfont delivery:

http://www.igvita.com/2012/09/12/web-fonts-performance-making-pretty-fast/

The article recommends techniques like subsetting fonts to strip out unused glyphs, serving each browser the format it handles best (WOFF, EOT, or SVG), and making sure font files are compressed and cached.

But here’s the problem: many of these techniques may violate the copyright, and your rights to use the font (or really, the typeface) intellectual property. In most cases, the license on a font does not allow you to modify it – and that includes stripping out “unneeded” glyphs to improve efficiency. In other words, squeezing greater efficiency out of a website might result in breaking copyright law. Webfonts issued under “webby” licenses like the SIL Open Font License or GNU-style licenses don’t have this restriction. Other licenses don’t allow the sort of conversion to alternate formats performed by Font Squirrel – WOFF is generally OK, but formats like EOT or SVG are not. Individual fonts may also have additional EULAs (End User License Agreements) that you have to follow.

This is part of a larger issue with intellectual property. “Old” media worked in the physical world, where copies were more difficult to generate than they are in cyberspace. Often, copyright was secured by forcing the user to buy something physical. If I buy a music CD, I don’t buy the songs – their copyrights stay with the rights holders. Instead, I just buy some plastic, plus limited rights to use the musical IP.

So, there have been lots of attempts by copyright holders to make the virtual work like the physical. For example, I might download a movie, then have a temporary viewing window expire. I then have to re-download the bits. It should be possible (on consumer electronics, at least) to cache the video and just change permissions on the data, but at present this doesn’t happen. Instead, the network must re-transfer gigabytes of information. This is because locally cached, copy-protected media is possible in principle, but in practice that kind of copy protection hasn’t worked well at all.

Even without the computer angle, IP can increase energy use, since the goal of IP is to get paid for one’s work. Many copyright rules relate to how many people can view or experience media. So, I might have the right to show a movie to a few friends, but running it at a party would be out – I would have to pay more. This isn’t bad in itself, unless that payment results in more copies of the IP being created. In that case, it increases consumption, and the goal of most “green” thinking is to reduce consumption when possible.

An interesting aspect is that current law wouldn’t stop me from reproducing a typeface by hand, scanning it, and using the result. The font (the software file) is protected, but copying the typeface (the letterform design) is not. I might end up with an Arial-like font, but not Arial itself. Is that what protects all the Arial knockoffs on dafont.com? Now, the net effect of lots of royalty-free knockoffs of protected fonts is an increase in downloads of the copycats. In that case, we could say that IP causes a net increase in resource use. Oh, and a “dingbat” font with a custom non-character drawing might be treated differently…

So, font designers in the future might want to consider a license clause allowing people to strip glyphs. Since a font stripped of glyphs and then shared with others would rightly be unacceptable to the designer, this might be best done via Content Delivery Networks (CDNs) like Google Web Fonts.

Google Web Fonts can restrict the glyphs actually downloaded – its text= parameter lets you request just the 7 specific characters you need for your brand name in a mini font download. As far as I know, you can’t do this with the Adobe Typekit font CDN, which counts page views to determine how much you are charged on a recurring basis for a font.
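
For example, a page could pull down just the glyphs for a wordmark – a minimal sketch, where the font family (“Open Sans”) and brand string are placeholders for your own choices:

```javascript
// Load only the glyphs needed for a brand name from the Google Web Fonts
// CDN, using the API's "text" parameter. "Open Sans" and "Plyojump" are
// placeholders -- substitute your own font family and string.
var brand = 'Plyojump';
var link = document.createElement('link');
link.rel = 'stylesheet';
link.href = 'https://fonts.googleapis.com/css?family=Open+Sans&text=' +
            encodeURIComponent(brand);
document.getElementsByTagName('head')[0].appendChild(link);
```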

Finally, letting a single company develop a monopoly in webfont delivery would actually help sustainability. If every instance of Arial came from a single source, all your web apps would use that single cached copy. There would be only one download. Multiple vendors ensure multiple downloads, possibly of the same font. The more font CDNs you have, the less you benefit from caching, and sustainability drops.

All these confusing features of font licensing and use could potentially impact the Sustainable Web. If licensing results in more downloads and rendering events, it makes the web less sustainable. Protection of IP must be balanced against the carbon footprint of the Internet.

It’s likely that as sustainability moves into web and game design, there will be more conflicts. IP implies ownership and control, while core sustainability implies the opposite – sharing resources to reduce consumption.

JavaScript’s Dirty Dozen – are they really that dirty?

WPO company Yottaa has a great post on the JavaScript downloads that are slowing down the web. The following table shows the most common JavaScript files, and their impact on page download time and page size:

File Name                      | % of Web Sites | Size (Bytes) | Time to Last Byte (msec) | Samples with >1 sec Time to Last Byte (%)
jquery.js                      | 19.7%          | 65,939       | 1,039                    | 40.9%
ga.js (Google Analytics)       | 66.7%          | 14,888       | 253                      | 8.6%
plusone.js (Google +1)         | 9.8%           | 7,036        | 823                      | 26.7%
swfobject.js (Flash)           | 11.5%          | 6,815        | 410                      | 7.2%
all.js (Facebook)              | 12.2%          | 60,698       | 363                      | 6.4%
jquery-ui.min.js               | 3.8%           | 30,200       | 495                      | 18.2%
p.json (AddThis)               | 5.0%           | 236          | 506                      | 7.1%
widgets.js (Twitter)           | 9.5%           | 23,061       | 231                      | 3.5%
quant.js (QuantServe)          | 7.1%           | 2,320        | 220                      | 2.5%
conversion.js (Google AdWords) | 4.3%           | 2,467        | 197                      | 3.0%
count.json (Twitter)           | 5.1%           | 497          | 311                      | 1.7%
show_ads.js (Google AdSense)   | 4.0%           | 6,272        | 196                      | 1.8%
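
One way to read the table: weight each script’s delay by how widely it’s deployed. A quick sketch of that arithmetic, using the table’s own numbers for the top four scripts:

```javascript
// Weight each script's time-to-last-byte by the share of sites carrying
// it, using figures from the Yottaa table above. The product is a rough
// "aggregate delay" score: msec of delay contributed per average page.
var scripts = [
  { name: 'jquery.js',                share: 0.197, msec: 1039 },
  { name: 'ga.js (Google Analytics)', share: 0.667, msec: 253  },
  { name: 'plusone.js (Google +1)',   share: 0.098, msec: 823  },
  { name: 'all.js (Facebook)',        share: 0.122, msec: 363  }
];

scripts.forEach(function (s) {
  console.log(s.name + ': ' + (s.share * s.msec).toFixed(0) + ' msec');
});
// jquery.js: 205, ga.js: 169, plusone.js: 81, all.js: 44 --
// jQuery and Analytics dominate, but only jQuery buys the user anything.
```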

The big culprit (as I had suspected) is Google Analytics. jQuery is creating a user interface, but Google Analytics is just reporting site use.

One could argue that jQuery is needed for a great UI, but Analytics seems more valuable to site operators than to site users.

One counter to this is that analyzing your website traffic can help you improve your site. In other words, if you feed insights from Google Analytics back into a site redesign, the energy and resources consumed by Analytics are worth it – you get a more sustainable website. But speed isn’t the only thing. The Yottaa article just looks at “time to last byte” for different locations, but you can also consider the nature of your audience when evaluating the sustainability of a particular site feature.

According to sustainability frameworks, “inclusive design” is a must – you should try to enable the maximum number of people to access information they can use. So, under-served users who you “enable” via polyfills or site redesign count as a sustainability plus.

If you take a look at your GA reports, you’ll see the detailed locations that requests to your website originated from. For example, I looked at my plyojump.com data for Africa, and found that in Nigeria, local counties were identified, as well as cities of origin. This sort of data could be combined with the Web Index to create useful “people first” metrics.

In other words, if your site serves useful information to countries with a low Web Index score, you are contributing to sustainability as a whole. This can also outweigh slow delivery performance to those locations (unless your site is hampering access to other sites over slow networks). This is the Social Sustainability component of Sustainable Virtual Design.

The Web Index from the World Wide Web Foundation

The Web Index gives scores between zero and 100. One possibility for “weighting” your traffic for its sustainability is to multiply your visitors from each country by (100 – Web Index)/100 for that country. This would give a big boost if your site supports users in Yemen or Nepal, which have extremely low Web Index scores. Doing the computation requires knowing each user’s country of origin, which is exactly what Google Analytics gives you.
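
A minimal sketch of that weighting, assuming you’ve already pulled per-country visit counts out of a Google Analytics geo report – the visit counts and index scores below are made up for illustration, not official figures:

```javascript
// Weight each country's visits by (100 - Web Index) / 100, so traffic
// from low-Web-Index countries counts for more. Visit counts are
// hypothetical; index scores are illustrative, not official figures.
var webIndex = { Sweden: 100, USA: 97, Brazil: 60, Nepal: 5, Yemen: 3 };
var visits   = { Sweden: 500, USA: 2000, Brazil: 300, Nepal: 40, Yemen: 25 };

var weighted = 0;
for (var country in visits) {
  weighted += visits[country] * (100 - webIndex[country]) / 100;
}
console.log('Sustainability-weighted visits: ' + weighted.toFixed(1));
// 40 visits from Nepal count as 38; 2,000 from the USA count as just 60.
```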

Convergence in JavaScript library efficiency

Another feature of the Yottaa article is a great chart showing market share for JavaScript libraries versus download speed. There is an astonishing convergence of download time to a single value as a library becomes more popular:

JavaScript download time versus library popularity, from Yottaa (at low market share, download times vary widely; high-market-share libraries converge to a few hundred milliseconds).

The Yottaa article says this is due to better coding efficiency, but that may not be the whole story. It is also possible that JS libraries with low market share are solving just one problem, or too many. The popular libraries become kitchen sinks of the most valued methods, and end up being about the same size – hence the same download time. It is entirely possible that a low-market-share library with a fast download could substitute in many cases for a larger, popular library.

We may be seeing developer inertia instead of efficient code. Many of the most common libraries (e.g. jQuery and Modernizr) incorporate other libraries (e.g. Sizzle) into their body. So we may be seeing the speed of common sub-libraries, rather than real convergence to a limit of efficiency in production code.

Simple things hybrid designer/developers can do in 2013

In past posts I’ve focused on things freelancers and smaller web dev shops can do to improve sustainability. The reason is that large sites have the resources to hire WPO companies and employ dedicated site and network managers, which lets them improve sustainability by means not available to smaller groups. WPO company Strangeloop recently posted a bunch of great tips on improving performance. Here is the lowest-hanging of their “low hanging fruit.”

A post by Laura Swanson on “cleaning up” basic HTML, CSS, and images:

http://dyn.com/how-we-improved-page-speed-by-cleaning-css-html-and-images/

A great post on optimizing social networking buttons on your pages:

http://www.w3-edge.com/weblog/2011/02/optimize-social-media-button-performance/

Something as simple as inconsistent case in your ids and class names can slow down older versions of Internet Explorer:

http://apmblog.compuware.com/2011/09/13/how-case-sensitivity-for-id-and-classname-can-kill-your-page-load-time/

Check to make sure your server is actually GZipping everything:

http://gzipwtf.com/
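
If you’d rather check from your own machine, here’s a minimal Node.js probe (the host and path are placeholders – point it at your own pages and assets):

```javascript
// Ask for a resource with gzip allowed, then report whether the server
// actually compressed the response. Host and path are placeholders.
var http = require('http');

http.get({
  host: 'www.example.com',
  path: '/css/main.css',
  headers: { 'Accept-Encoding': 'gzip, deflate' }
}, function (res) {
  console.log('Content-Encoding: ' +
              (res.headers['content-encoding'] || 'none -- not compressed!'));
  res.resume(); // we only need the headers, so discard the body
});
```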

W3 Total Cache (great for self-hosted WordPress):

http://wordpress.org/extend/plugins/w3-total-cache/

A re-post of Steve Souders’ Performance Golden Rule – that 80% or more of end-user response time is spent on the front end, rather than the server side. This article is one of the reasons my Green Boilerplate project tries to put more work onto the server.

http://www.stevesouders.com/blog/2012/02/10/the-performance-golden-rule/

Here’s Microsoft’s great new site for supporting the ecosystem of older versions of Internet Explorer, as well as moving the web forward to new versions of IE:

http://www.modern.ie/virtualization-tools

Virtual Worlds Reboot

A few years ago, many people were convinced that the web was about to go 3D, and that virtual worlds like Second Life were the future. My own opinion was (and still is) that these interfaces won’t catch on until the post-Millennial generation of “Plurals” (the first true digital natives; the oldest is about 8 years old now) becomes old enough to move out of Club Penguin. The question of virtual worlds is significant for sustainability. The energy needed to run both the clients and servers for a Massively Multiplayer Online Game (MMOG) is much, much higher than that needed for websites. Various estimates place the energy needed to run a single avatar in a virtual world as comparable to the energy used by a real person in a rapidly industrializing country (e.g. Brazil).
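
To make that claim concrete, here’s a back-of-envelope version of the estimate – every number below is an assumption, loosely patterned on Nicholas Carr’s well-known 2006 Second Life calculation, not measured data:

```javascript
// Rough, illustrative estimate of energy per avatar. All figures are
// assumptions for the sketch, not measurements.
var servers        = 4000;   // simulator servers in the data center
var wattsPerServer = 300;    // per server, including cooling overhead
var avatarsOnline  = 12500;  // average concurrent avatars
var clientWatts    = 120;    // the player's own PC while logged in

// Continuous draw attributable to one avatar, server side plus client side
var wattsPerAvatar = (servers * wattsPerServer) / avatarsOnline + clientWatts;

// Energy per avatar-year, in kWh
var kWhPerYear = wattsPerAvatar * 24 * 365 / 1000;

console.log(wattsPerAvatar.toFixed(0) + ' W -> ' +
            kWhPerYear.toFixed(0) + ' kWh/year');
// ~216 W -> ~1,892 kWh/year, in the neighborhood of Brazil's per-capita
// electricity use (roughly 2,000 kWh/year)
```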

So it is interesting that Linden Lab, the creator of Second Life, is launching a new service called Dio. Compared to the old virtual worlds, it adopts a more “social network” strategy for MMOG virtual life.

https://www.dio.com/

The big take-home: the world isn’t 3D in the old sense of a virtual world, cyberpunk virtual reality, etc. This is all to the good – 3D spaces may be an example of “metaphorical” or “skeuomorphic” design that doesn’t work beyond games. Remember Google Lively? Similar idea, in 3D. Dio is worth watching as a low-energy alternative to MMOGs, midway between them and standard social network websites.
