Nature, the august scientific journal published for well over a century, just carried an article (a survey/review of multiple sources) that would have been difficult to imagine even a few years ago, on the effect of consumer affluence (read: increased consumption) on the environment.
The article has two big points, which will cause some controversy:
- Affluence is the result of induced demand for more and more “stuff” by advanced societies in the West
- Consumers can choose not to consume, but their attempt to keep consuming via “green” choices generally fails, due to their lack of control over the manufacturing process.
Even more remarkably, the article says that “market forces” won’t work – in other words, no Green New Deal will cause the economy to “grow” while making things sustainable. In fact, the article uses Marxist arguments about production to imply that no capitalist system could ever reduce consumption on its own.
And, most importantly for this blog, the article says that “virtualization” – converting physical products and services to virtual ones – often doesn’t make things more sustainable. We can’t simply “grow” by replacing physical services with more and more virtual ones.
“While digitalisation is already a key driving force in societal transformation, it has so far led to more consumption and inequality and remained coupled with the indirect use of energy and materials, therefore sustaining resource-intensive and greenhouse-gas growth patterns at the macro-economic level.”
What does this mean for Sustainable Virtual Design? Two things:
- Minimizing energy consumption, in absolute and relative terms, is critical for virtual products, services, websites, apps, and networks.
- UX design of virtual products should allow users to accomplish more with less. By “less” we mean things like lower-powered hardware, slower Internet, 2D instead of 3D.
In other words, you actually have to make “less” – you can’t just call a virtual product “less”. Fortunately, this is in sync with Inclusive and Accessible design theory. However, it is NOT in sync with adding more and more CPU cycles to accomplish the same thing.
Case in point: Amazon just announced it had developed an “Augmented Reality” app, the “Distance Assistant,” to improve social distancing for its employees.
So, what does this app use?
- 3D computations
- Augmented reality computations
- Artificial intelligence
In practice, Distance Assistant consists of a 50-inch monitor, a camera, and a computer combined into a standalone unit that can be deployed anywhere in Amazon’s warehouses where a power outlet is available. It creates a “magic-mirror-like tool that helps associates see their physical distancing from others” by painting a circle around their feet, as you can see in the video above. A green circle shows they are at a safe distance from others, whereas a red circle signals they are too close and should take action to go green.
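At its core, the decision the system makes is trivial: is anyone closer than a threshold? Here is a minimal sketch of that logic, assuming the AI/camera pipeline has already produced floor positions for each person. The function name, the 2-metre threshold, and the input format are all my assumptions, not Amazon’s actual implementation:

```python
# Hypothetical sketch of the "red circle / green circle" decision.
# Assumes an upstream vision pipeline supplies (x, y) floor positions
# in metres; the 2.0 m threshold is an assumed social-distancing rule.
import math

SAFE_DISTANCE_M = 2.0  # assumed threshold

def circle_colors(positions):
    """Return a display color for each detected person's circle:
    'red' if anyone else is within the safe distance, else 'green'."""
    colors = []
    for i, (xi, yi) in enumerate(positions):
        too_close = any(
            math.hypot(xi - xj, yi - yj) < SAFE_DISTANCE_M
            for j, (xj, yj) in enumerate(positions)
            if j != i
        )
        colors.append("red" if too_close else "green")
    return colors
```

The point of spelling this out is how little computation the *decision* itself needs – a few pairwise distance checks – compared with the 3D rendering, AR overlay, and cloud AI wrapped around it.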
Let’s talk about how HORRIBLE this is, from a UX and Sustainable Design perspective…
- Lots of electricity and CPU cycles making things look pretty
- Power to “social distance” taken away from the employees and handed to the managers running the installation
Problem #1 “Make it Pretty”
In practice, you could create an extremely simple bit of electronics that employees would clip to their clothing. It would just emit a subtle beep, a chirp, or an LED flash whenever you were too close to someone else. With such a device, you decide to move – it increases your understanding (“too close”) and lets you choose for yourself.
The cost of designing and building such a product would be very low – you could probably build it into an earbud.
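To make the comparison concrete, here is a sketch of what a wearable proximity alert could look like, using the standard log-distance path-loss model to estimate range from radio signal strength (the kind of reading a cheap BLE tag provides). The constants, function names, and the RSSI-based approach are all assumptions for illustration; a real device would calibrate them:

```python
# Hypothetical sketch of a clip-on proximity alert.
# All names and constants are assumptions; a real tag would be calibrated.

TX_POWER_DBM = -59    # assumed RSSI measured at 1 m from another tag
PATH_LOSS_N = 2.0     # assumed free-space path-loss exponent
SAFE_DISTANCE_M = 2.0 # assumed social-distancing threshold

def estimated_distance_m(rssi_dbm):
    """Rough distance estimate from a received signal strength reading,
    using the log-distance path-loss model."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_N))

def should_alert(rssi_readings):
    """True if any nearby tag appears closer than the safe distance -
    the cue for a quiet beep or vibration to the wearer alone."""
    return any(estimated_distance_m(r) < SAFE_DISTANCE_M for r in rssi_readings)
```

A few lines of threshold logic on a microcontroller, informing only the wearer – no big screen, no cloud AI, no audience.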
But what did Amazon do? They built a high-end 3D space, using software that needs thousands of times the CPU cycles of a simple proximity detector, a big-screen monitor, Artificial Intelligence requiring cloud computing, etc., etc. A waste.
Problem #2 “Remove Agency and Power from Your Employees”
In most scenes in the video, a manager of some sort is shown by the screen, watching or guarding it. Their purpose can only be to monitor employees via the AR display, and call them out for failure as they pass down the hallway.
The manager can watch the big screen showing the “ideal” workplace (for management, that is). Social distancing is accomplished by a manager telling them what yellow lines to walk on.
This is a beautiful example of “cool tech” subverting a design problem which could easily be solved from (1) a user-centric perspective, and (2) a sustainability perspective. Instead a “tech-utopian” perspective was applied, which almost always reduces the power of people at the bottom, in favor of the top.
Power migrates from user choice to a manager yelling at employees, this time about their body motions. This is the future?
In the video, the screen is sometimes turned to users. This is better than nothing, but means you make people conform by group shaming. Other people watch you fail the test, and they can be far away from you, and uninvolved. They just get to judge.
In contrast, a simple, personally-worn proximity detector would tell people privately, not relay info to a manager or everyone else in the vicinity. People would have information and the ability to act, without group pressure.
After all, most employees will want to “do the right thing.” This system assumes they need to be disciplined.
It would be comforting to conclude that reducing affluence and consumption would make these kinds of high-tech, inefficient, anti-UX solutions less likely. But the example here points in the opposite direction.