Power, Power, Power for Computers and Surveillance


A couple of examples today, both showing why the amount of power needed to support computing is on track to become a major part of overall energy consumption.

Articles across the techie world are discussing the latest Chinese supercomputers, running in the “exa” (exaflops) range. What’s interesting is the power they consume.

China Has Already Reached Exascale – On Two Separate Systems (nextplatform.com)

The most interesting part of the above article is the phrase “35 megawatt sweet spot.” In other words, a system must be drawing power in that range to sustain exaflops performance. That’s a lot of power. If we imagine a typical US home drawing about 3 kW, 35 MW works out to more than ten thousand homes. Another way to think of it – a few dozen wind turbines of the size typically built on land, once you allow for how much of the time they actually generate. For one supercomputing system.
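A quick back-of-envelope sketch of those comparisons (the 3 kW household draw, the 2.5 MW turbine rating, and the 35% capacity factor are my rough assumptions, not figures from the article):

```python
# Back-of-envelope check on the 35 MW figure.
# Assumed inputs (not from the article): ~3 kW average draw per US home,
# ~2.5 MW nameplate per land-based turbine, ~35% capacity factor.

system_mw = 35.0

home_kw = 3.0
homes = system_mw * 1000 / home_kw
print(f"~{homes:,.0f} homes")                              # -> ~11,667 homes

turbine_mw = 2.5
capacity_factor = 0.35
turbines = system_mw / (turbine_mw * capacity_factor)
print(f"~{turbines:.0f} land-based turbines, on average")  # -> ~40 turbines
```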

Equally interesting – the power needed to create and “train” so-called “deep learning” networks for “AI,” as described in this article.

Deep Learning’s Computational Cost

https://spectrum.ieee.org/deep-learning-computational-cost

A basic feature of modern neural networks is that they need to be “trained.” Most non-experts imagine the “AI” as a rules-based system – someone coded in a set of rules that the AI follows in making its decisions. In reality, the programmers build the network and then use a “learning set” to set the weights in the network. Those training cycles take a huge amount of power, and each further reduction in error demands disproportionately more of it. On the article’s extrapolation, it will not be long before training a single network takes the power of a large city for a month(!)
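A minimal sketch of what “training” means, using a toy one-weight-vector network in Python/NumPy (entirely hypothetical data, nothing from the article): the weights start out random, and every pass over the learning set spends compute nudging them toward lower error – which is exactly the cost that explodes at modern scale.

```python
import numpy as np

# Toy "learning set": inputs x and labels y (hypothetical data).
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = (x @ true_w > 0).astype(float)

# The "network" here is just one weight vector; training sets these weights.
w = rng.normal(size=4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Every epoch costs compute; the error rate falls as more compute is spent.
for epoch in range(501):
    pred = sigmoid(x @ w)
    grad = x.T @ (pred - y) / len(y)   # gradient of the cross-entropy loss
    w -= 0.5 * grad                    # nudge the weights toward lower error
    if epoch % 100 == 0:
        error = np.mean((pred > 0.5) != y)
        print(f"epoch {epoch}: error rate {error:.1%}")
```

Scale that same loop up to billions of weights and billions of images, and the city-scale energy figures described in the Spectrum article are where the power goes.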

So, to train the kind of high-performance image-recognizing network envisioned for the near future – not here yet – you would soon need the power consumed by large cities with millions of people. And if the kinds of images out in the world change, you have to re-train the AI to handle the new ones.

During operation, the AIs use a significant amount of power to do each recognition operation. This article discusses energy use by real deep learning systems, rather than the extrapolation above:

It takes a lot of energy for machines to learn – here’s why AI is so power-hungry (theconversation.com)
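For a sense of how such per-operation figures get estimated, here is an illustrative calculation; the FLOP count, joules-per-FLOP, and overhead factor are assumptions for the sketch, not measurements from either article:

```python
# Illustrative per-recognition energy estimate (all figures are assumptions).
flops_per_inference = 1e10    # assumed: a mid-sized image-recognition model
joules_per_flop = 5e-12       # assumed: a few picojoules per FLOP on an accelerator
overhead_factor = 3.0         # assumed: memory traffic, host CPU, cooling

energy_j = flops_per_inference * joules_per_flop * overhead_factor
print(f"~{energy_j:.2f} J per recognition")                      # -> ~0.15 J

recognitions_per_day = 1e9    # assumed fleet-wide volume
kwh_per_day = energy_j * recognitions_per_day / 3.6e6
print(f"~{kwh_per_day:.0f} kWh/day for a billion recognitions")  # -> ~42 kWh
```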

Some energy measurement tools for the operation of the AI (not the training stage, which is where the energy listed above is used):

AI industry, obsessed with speed, is loathe to consider the energy cost in latest MLPerf benchmark | ZDNet
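As a concrete example of what that kind of measurement looks like in practice, here is a minimal sketch that samples GPU board power through NVIDIA’s NVML bindings (the `pynvml` package) while a model runs. It assumes an NVIDIA GPU with drivers installed; the workload being measured is whatever you run alongside it.

```python
import time
import pynvml  # pip install pynvml (NVIDIA NVML bindings)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the machine

window_s = 10.0                                    # sample while the model runs
samples = []
t_end = time.time() + window_s
while time.time() < t_end:
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # board power in mW
    samples.append(milliwatts / 1000.0)
    time.sleep(0.1)

pynvml.nvmlShutdown()

avg_watts = sum(samples) / len(samples)
energy_joules = avg_watts * window_s               # W x s = J over the window
print(f"average draw ~{avg_watts:.0f} W, ~{energy_joules:.0f} J in {window_s:.0f} s")
```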

Design in general, and UX in particular, anticipate more and more “AI” and background computing. The same computing is also needed for the kind of extreme user tracking that new business models depend on (e.g. Surveillance Capitalism). And a move like Facebook’s pivot to the so-called “metaverse” is power-hungry in the extreme. The computation needed for realistic VR (even if it is done in the cloud) is going to be orders of magnitude greater than for the current web-based Internet.

It’s clear from these two examples that we are going to hit hard limits on this pretty soon, long before any technological “singularity” can manifest itself.
