Electric nonsense
I recently read a story on a local TV station’s website that made me want to cringe. The story was about “phantom energy drain,” a TV-friendly way of saying that most of our electronic devices draw small amounts of power even while supposedly turned off. That’s true, and some people want to minimize that drain.
Back in the late 2000s, actor Ed Begley, Jr. starred in two versions of a TV show in which he scoured his house to locate and disable all those little energy users: the LEDs on audio and video equipment that stay lit when the device is off or in standby mode, or the ones on chargers that glow when a device is fully charged (or still charging). Microscopically tiny current drains that are essentially meaningless in reality.
Once Begley discovered these little annoyances, he’d find ways to disable them. In one episode he even added a switch that cut all power to the entire house, shutting down everything. I can’t imagine doing anything so extreme, if only because I’d have to reset all the clocks in the house (except for a couple that have built-in battery backups).
The real problem
I’ve written before about why I run my computers 24×7, but the short version is that turning things off and on actually shortens their little electronic lives. They are not designed to be unplugged. Some devices, such as plasma-screen TVs, draw a bit of current while off to keep their components from getting cold, which helps prevent breakdowns due to thermal shock when they’re turned back on. Unplugging those devices, as the TV station’s article suggests, actually harms them and causes them to fail sooner.
That’s important because, according to recent studies, it can take as much as three times as much energy to create those devices as they use in their entire working lifetime. Other studies cite numbers around double. Either way, 67% to 75% of the total energy used by a computer is consumed by its construction (double works out to 2/3, or 67%; triple works out to 3/4, or 75%). That ratio will only grow as our computers become more efficient.
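The arithmetic behind those percentages is easy to check. Here’s a minimal sketch (the two-to-three-times ratios come from the studies cited above; the function name is mine):

```python
def manufacturing_fraction(ratio: float) -> float:
    """Fraction of total lifetime energy consumed by manufacturing,
    given that manufacturing takes `ratio` times the operating energy.

    Total lifetime energy = manufacturing + operating
                          = ratio + 1  (in units of operating energy)
    """
    return ratio / (ratio + 1)

print(f"{manufacturing_fraction(2):.0%}")  # double the operating energy -> 67%
print(f"{manufacturing_fraction(3):.0%}")  # triple the operating energy -> 75%
```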
Leaving electronic devices turned on, computers in particular, lengthens their lives and reduces the number of new ones that must be built as replacements. That, in turn, avoids wasting the large share of energy that went into creating them in the first place.
Solutions
There are solutions for reducing the amount of energy used by many types of devices, including our computers. The first is simply to design them to be more efficient.
Older computers use more energy than newer ones. Intel, AMD, Nvidia, and other chip makers are working hard to reduce the energy consumption of their chips, which improves overall efficiency for the computers and other devices that use them.
I see that on my own network. I use uninterruptible power supplies (UPS), aka battery backup units, on almost all my devices; the exceptions are laptops, which already have built-in batteries and so don’t need one. Those UPS units display their current load, so I can tell how much power each system is using.
For example, my 20-year-old Dell, which has a Pentium 4 CPU presenting 2 logical processors, uses about 85 watts under minimal compute load. One of my somewhat more recent systems, a home-built tower that’s about 6 years old with 8 cores (16 logical processors), consumes about 31 watts. An even newer system, 3 years old with a 12-core processor (20 logical processors), uses 27 watts under similar load conditions.
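To put those wattages in perspective, here’s a quick sketch converting a constant draw into annual energy use. The wattages are the UPS readings above; 24×7 operation is assumed, and the labels are mine:

```python
def annual_kwh(watts: float) -> float:
    """Annual energy use in kWh for a device drawing a constant wattage 24x7."""
    return watts * 24 * 365 / 1000

systems = [
    ("Pentium 4 Dell (~20 years old)", 85),
    ("8-core tower (~6 years old)", 31),
    ("12-core system (~3 years old)", 27),
]

# The old Dell uses roughly three times the energy of the newer machines.
for name, watts in systems:
    print(f"{name}: {annual_kwh(watts):.0f} kWh/year")
```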
Another solution is to make the tool chain of factories that create our computers more efficient. That includes everything from the mines that produce the raw materials, to the factories that assemble the components into the final product.
So when you see stories about unplugging devices to “save energy,” remember that’s a narrow view of what saving energy means. The complete view covers a device’s entire lifetime, including how much energy it took to create in the first place and how efficiently it runs.