Maintainable JavaScript and user-sniffing

While I'm currently buried in developing a “swap green virtual ingredients” database, I thought I would add a couple of interesting links. The first is for JavaScript, by blogger Tim Corey:

Following good coding practices for JavaScript is important for web sustainability, both because it is easy to write very bad JS code and because of the sheer volume of JavaScript on the web. In fact, some people seem to think of JavaScript as the “web’s assembly language”: we’re seeing translation libraries, and even a belief that JS will replace markup skills. If not plain JS, then something like CoffeeScript. There’s even a move to convert C++ libraries to JavaScript with tools like Emscripten. In this environment, poorly written JS is a problem.

The CodeProject article is useful because it can be used by non-codebeasts, in other words, the designer-developer hybrids who create so many of the smaller websites out there. It works through a set of excellent best practices, abstracted from more encyclopedic treatments like Douglas Crockford’s JavaScript: The Good Parts and Nicholas Zakas’ Maintainable JavaScript. Those books take a while to read and absorb, but almost anyone who has to write <script> in their daily work can apply the basic steps here.
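To give a flavor of what these guidelines look like in practice, here is a minimal sketch of a few of the usual recommendations: strict mode, a single namespace object instead of scattered globals, strict equality, and descriptive names. The `recipeApp` object and its functions are invented for this example, not taken from the article.

```javascript
"use strict";

// One namespace object instead of polluting the global scope (hypothetical name).
var recipeApp = {
  // Strict equality (===) avoids type-coercion surprises like "" == 0.
  isEmpty: function (value) {
    return value === null || value === undefined || value === "";
  },

  // Descriptive names and early returns keep functions readable.
  formatIngredient: function (name, grams) {
    if (this.isEmpty(name)) {
      return "unknown ingredient";
    }
    return name + " (" + grams + " g)";
  }
};

console.log(recipeApp.formatIngredient("basil", 10)); // "basil (10 g)"
```

None of this is exotic; the point of the best-practice lists is exactly that these small habits are cheap to adopt and pay off as scripts grow.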

From code maintainability I now move to the non-technical features of Sustainable Virtual Design, in this case User Experience, or UX. Modern web design applies a user-centric approach: we design with the user in mind, rather than to our preconceived notions. And the information we have on users is becoming more detailed every day, as shown by this LA Times article:

Popular discussion:

Here’s a link to the original PNAS research study.

The researchers looked for several years at the content of Facebook “like” postings. They used “like button” information alone and tried to predict user demographics. The results are astonishing: even without “explicit” likes, one can often predict sexual orientation (~90%!), or whether you smoke or not (based on which bands you listen to). Unless there is a sudden move to extreme privacy, this information will become a de facto standard for understanding users in a few years. Right now, sites often try to control content delivery based on the user’s clicktrail through the site. With Facebook providing logins for an ever-increasing number of websites, it seems likely that this practice will develop into full-blown “user sniffing.” Content will be ever more tailored to the user, reducing their searching, shortening online time, and thereby indirectly lowering the carbon footprint of the web.
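The basic idea behind this kind of prediction can be caricatured very simply. This is not the study’s actual model (the PNAS paper uses proper statistical methods on real data); it is just a toy illustration, with invented band names and invented weights, of how individually innocent “likes” can add up to a confident guess about a trait.

```javascript
"use strict";

// Invented weights: how strongly each "like" correlates with a trait.
// In a real model these would be learned from data, not hand-picked.
var SMOKER_WEIGHTS = { bandA: 0.6, bandB: -0.4, bandC: 0.3 };

// Sum the weights of everything the user has liked.
function predictScore(likes, weights) {
  return likes.reduce(function (sum, like) {
    return sum + (weights[like] || 0);
  }, 0);
}

var score = predictScore(["bandA", "bandC"], SMOKER_WEIGHTS);
console.log(score > 0.5 ? "likely" : "unlikely"); // "likely" (score is 0.9)
```

The unsettling part of the research is that this crude-sounding approach, scaled up to thousands of likes and millions of users, works remarkably well.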

This rosy picture does have a few flaws, however. First, the issue of “virtual stereotyping” is likely to raise its head. While we can compute the features of most users most of the time, the “edge users” will be mis-identified, and we all know how stereotypes can harm in the real world. Second, we can ask, from a Sustainable Virtual Design perspective, whether we really want to tailor content this tightly. True, in the short run it makes the average user more efficient, and therefore lowers their consumption of Internet resources. But there may be unforeseen long-term consequences of putting people in ever-tighter “walled gardens” of self-selected content. We can imagine individuals in a hyper-networked world surrounded only by the things they want, contributing to polarization and misuse of other, non-web resources.

So, I won’t be surprised if we someday have “user sniffing” APIs out there for our apps. The question is how we use them to be more sustainable.
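No such API exists today, but if one did, the sustainability and stereotyping concerns above suggest at least one design rule: only act on high-confidence predictions, and fall back to generic content otherwise. Here is a purely hypothetical sketch; the prediction object shape and the threshold are my own assumptions.

```javascript
"use strict";

// Decide which content variant to serve, given a (hypothetical) trait
// prediction of the form { segment: string, confidence: 0..1 }.
function chooseContent(prediction) {
  // Serving the generic page beats serving a wrong stereotype,
  // so ignore low-confidence guesses entirely.
  var CONFIDENCE_THRESHOLD = 0.9;
  if (!prediction || prediction.confidence < CONFIDENCE_THRESHOLD) {
    return "generic";
  }
  return prediction.segment;
}

console.log(chooseContent({ segment: "green-cooking", confidence: 0.95 })); // "green-cooking"
console.log(chooseContent({ segment: "green-cooking", confidence: 0.4 }));  // "generic"
console.log(chooseContent(null));                                           // "generic"
```

Gating on confidence doesn’t eliminate virtual stereotyping for the “edge users,” but it keeps a mis-identification from silently narrowing what they see.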


