Slow Sites, Pixel-Fitting, Vector Graphics, and Sustainability “Hacking”


A reminder (from the highly efficient blog of Dan Luu) of how bad the experience of modern websites is for people with slow connections:

https://danluu.com/web-bloat/

In the article, he hits several of the problems with sustainability in modern web dev:

  1. Too much bloat, not media but gobs of client-side “framework to end all frameworks” code, which doesn’t play nicely with the high latency and packet loss typical of many parts of the world outside the US and Europe
  2. Frameworks that don’t account for slow page loads (e.g. OAuth, which blows up on slow connections)
  3. Frameworks that blow up the page size (e.g. Panda, the one the author mentions using to build a simple HTML table)
  4. Dependencies that crash the page if the timing isn’t right
  5. Code that makes it impossible to navigate content if it fails to load
  6. Build tooling that forces polyfill loads (e.g. Promise) on all browsers, even those that don’t need them (see the sketch after this list)
  7. Blocking code that has to load before the rest of the page loads (e.g. Google AMP’s own code!)
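
Point 6 is avoidable with feature detection: load the polyfill only when the browser actually lacks the feature. A minimal sketch in TypeScript, assuming a hypothetical CDN URL for the polyfill:

```ts
// Minimal sketch: fetch a Promise polyfill only for browsers that need it,
// instead of shipping it to every visitor in the main bundle.
// The polyfill URL below is illustrative, not a recommendation.
function loadScript(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
}

if (typeof (window as any).Promise === "undefined") {
  // Only the older browsers that lack Promise pay this download cost.
  loadScript("https://cdn.example.com/promise-polyfill.min.js");
}
```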

This rush to big blobs of JavaScript in modern web pages is partly due to developers ignoring another feature of web sustainability: progressive enhancement and support for “edge” users. Too often, so-called optimization targets an “average” user, ignoring the real distribution of bandwidth. And this in turn is happening because of the increasing reliance on machine-generated code instead of hand-tooling, e.g. the huge CSS class names frameworks generate to avoid name collisions.
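
To make progressive enhancement concrete, here is a hedged sketch (the data-enhance attribute and #content selector are my own illustrative names, not anything from the article): the server-rendered links work on their own, and the script, if it ever arrives, upgrades them to in-page fetches.

```ts
// Sketch of progressive enhancement: the page ships plain, working links;
// this script, when it loads, upgrades them to partial-page fetches.
// If the script never arrives on a slow connection, navigation still works.
document.querySelectorAll<HTMLAnchorElement>("a[data-enhance]").forEach((link) => {
  link.addEventListener("click", async (event) => {
    event.preventDefault();
    try {
      const response = await fetch(link.href);
      document.querySelector("#content")!.innerHTML = await response.text();
    } catch {
      // Fall back to a full-page navigation if the fetch fails.
      window.location.href = link.href;
    }
  });
});
```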

As the author says, you may get a high reported bandwidth on cable, but that cable signal is being split into tiny slices for all the apartment dwellers in the building. The competition also generates greater latency and packet loss.

Here’s a table the author generated showing realistic bandwidth in many parts of the world:

[Image: the author’s “badloads” results table]

Ironically, the table-generation tool, Panda, produced a bloated table, much larger than it needed to be.
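
By contrast, a hand-tooled generator can emit only the markup the data actually needs. A minimal sketch, with a made-up row shape:

```ts
// Minimal sketch: build a lean HTML table straight from the data,
// with no wrapper divs or framework class names. The Row shape is illustrative.
type Row = { connection: string; loadTime: string };

function renderTable(rows: Row[]): string {
  const body = rows
    .map((r) => `<tr><td>${r.connection}</td><td>${r.loadTime}</td></tr>`)
    .join("");
  return (
    "<table><thead><tr><th>Connection</th><th>Load time</th></tr></thead>" +
    `<tbody>${body}</tbody></table>`
  );
}

// Usage: console.log(renderTable(rows)) for whatever measurements you have.
```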

A related, low-bandwidth link on the web’s obesity crisis:

http://idlewords.com/talks/website_obesity.htm

On to Optimizing Media…

Smashing Magazine directed me to an outstanding article on the practice of “pixel-fitting” – adjusting rasterized vector graphics so the real vector edges lie on pixel boundaries.

Hand-tooling produces better results than typical machine anti-aliasing.

https://dcurt.is/pixel-fitting?mc_cid=4c0f7aac40&mc_eid=08a92aaf2d

One technique commonly overlooked is using the Direct Selection Tool in Adobe Photoshop to “nudge” vector shapes and paths so that their main lines lie on pixel boundaries generated during rasterization:

In Photoshop, you should use the Direct Selection Tool to select the vector edges and then nudge them slowly until they fit perfectly to the edges of physical pixels. This only really works to sharpen straight lines and to define the furthest edges of rounded paths.

Tutorial:

http://simplephotoshop.com/photoshop_tools/direct_selectionf.htm
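
The same principle can be approximated in code when the vector art ships as SVG rather than a Photoshop rasterization. This is my own hedged sketch, not the article’s workflow: snap straight-line coordinates so a one-pixel stroke sits on a physical pixel instead of being anti-aliased across two.

```ts
// Sketch: snap SVG line coordinates to half-pixel centers so odd-width
// strokes land crisply on device pixels instead of blurring across two.
function snapToPixel(value: number, strokeWidth = 1): number {
  // Odd stroke widths render sharpest when centered on x.5 coordinates.
  const offset = strokeWidth % 2 === 0 ? 0 : 0.5;
  return Math.round(value - offset) + offset;
}

document.querySelectorAll<SVGLineElement>("svg line").forEach((line) => {
  for (const attr of ["x1", "y1", "x2", "y2"]) {
    const current = parseFloat(line.getAttribute(attr) ?? "0");
    line.setAttribute(attr, String(snapToPixel(current)));
  }
});
```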

Now, these methods of hand-tooling rasterized vector diagrams to “pixel perfect” display in turn suggest a strategy for those doing identity design for the web. In short, you (re)design your vector logo so that, at the most common display sizes used, the rasterized image is pixel-fitted. This is in the spirit of an earlier post suggesting that the carbon footprint of an image should be data reflected back to visual designers in an agile workflow.
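
As a rough illustration of reflecting that data back, here is a hedged sketch that turns an image’s transfer size into an estimated per-view footprint. The energy and grid-intensity coefficients are placeholder assumptions, not authoritative figures.

```ts
// Hedged sketch: estimate a per-view carbon figure for an image asset so it
// can be surfaced in design reviews. Both coefficients are assumptions that
// should be replaced with whatever model your team trusts.
const KWH_PER_GB = 0.5;        // assumed network + device energy per gigabyte
const GRAMS_CO2_PER_KWH = 475; // assumed grid carbon intensity

function gramsCO2PerView(imageBytes: number): number {
  const gigabytes = imageBytes / 1e9;
  return gigabytes * KWH_PER_GB * GRAMS_CO2_PER_KWH;
}

// e.g. a 300 KB logo raster:
console.log(gramsCO2PerView(300_000).toFixed(4), "g CO2 per view");
```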

Both code and media can be optimized by keeping the right mix of “hand tooling” versus automated production. More on that in the next post.
