As a software engineer, if there’s one thing I don’t worry about in my finished product, it’s how much energy it’ll use calculating the final result. That is, I take my electricity for granted, as well as my customers’.
And it’s funny, because I’m well aware that our calculations are becoming more and more involved, although not in my area (I’m fairly mundane as far as programmers go). Two examples are climate forecasting and crypto-currency calculations, both of which are consuming so much power that it’s becoming a concern for the future.
But once again I’m surprised at the cost of computing, in a schadenfreude sort of way, in this fascinating report by Michael Le Page in NewScientist (13 October 2018, paywall):
It isn’t widely appreciated how incredibly energy hungry AI is. If you ran AlphaGo non-stop for a year, the electricity alone would cost about £6000. That doesn’t matter for one-off events, like an epic Go showdown. But it does matter if billions of people want their smartphones to be truly smart, or have their cars drive themselves.
Many potential uses of AI simply won’t become widespread if they require too much energy. On the flip side, if the uses are so desirable or profitable that people don’t care about the costs, it could lead to a surge in electricity consumption and make it even harder to limit further warming of the planet.
AI consumes so much energy because the technique behind these recent breakthroughs, deep learning, involves performing ever more computations on ever more data. “The models are getting deeper and getting wider, and they use more and more energy to compute,” says Max Welling of the University of Amsterdam, the Netherlands. …
Take self-driving cars. These require all sorts of extra systems, from cameras to radar, that use power and also increase weight and drag, further increasing energy use. But the single largest consumer of energy besides the engine is the processor running the AI.
According to a study out earlier this year, self-driving cars could use up to 20 per cent more energy than conventional cars. That is a big issue for a battery-powered car, limiting its range and increasing running costs. What’s more, the study assumes the AI processor consumes about 200 watts, even though current prototypes consume in excess of 2000 watts.
For taxi companies using AI to directly replace human drivers, the savings in wages would probably far outweigh the higher energy costs. But for ordinary car owners this would be a major issue.
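To put that last pair of numbers in perspective, here’s a rough calculation of my own; the cruising power is my assumption about a typical battery-electric car, not a figure from the study:

```python
# Rough numbers: the cruising draw is assumed, the processor figures are
# taken from the article.
cruise_kw = 15.0        # assumed drivetrain draw for an EV at motorway speed
ai_study_kw = 0.2       # the study's assumption: about 200 W
ai_prototype_kw = 2.0   # current prototypes: in excess of 2000 W

for label, ai_kw in [("study's 200 W", ai_study_kw), ("prototype's 2000 W", ai_prototype_kw)]:
    extra = ai_kw / cruise_kw * 100
    print(f"{label}: ~{extra:.0f}% more energy just for the AI processor")
```

At 2000 watts the processor alone adds energy use in the low double-digit percentages, before you count the cameras, radar, extra weight, and drag; the study’s “up to 20 per cent” figure starts to look plausible.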
Wow! It brings up a host of questions, doesn’t it? Of course, Le Page notes the industry is frenziedly trying to get around this energy consumption problem through such tricks as reducing precision where it isn’t needed and inventing dedicated hardware, analogous to GPUs (Graphics Processing Units); I’ll sketch the precision trick after the next quote. I note, purely because I can say something vaguely relevant, this:
There are more revolutionary designs in the works, too. Shunting data back and forth between the memory and processor wastes a great deal of energy, says [Avishek Biswas of the Massachusetts Institute of Technology]. So he has developed a chip intended for smartphones that slashes energy use by around 95 per cent by carrying out key operations in the memory itself.
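As for the reduced-precision trick I mentioned above, here’s a minimal sketch in Python with NumPy, my own toy example rather than anything from the article or from real AI chips: storing and moving the numbers as 16-bit floats instead of 32-bit halves the memory traffic, and for a lot of workloads the answers barely change.

```python
import numpy as np

# Toy "model weights" and input; the sizes and values are made up.
rng = np.random.default_rng(42)
weights = rng.standard_normal((1000, 1000)).astype(np.float32)
x = rng.standard_normal(1000).astype(np.float32)

# Reduced precision: half the bytes to store and to shuttle around.
weights_half = weights.astype(np.float16)

full = weights @ x
reduced = weights_half.astype(np.float32) @ x

print(f"float32 weights: {weights.nbytes:,} bytes")
print(f"float16 weights: {weights_half.nbytes:,} bytes")
print(f"largest difference in the result: {np.max(np.abs(full - reduced)):.4f}")
```

Dedicated AI hardware pushes the same idea further, doing its arithmetic in 16-bit, 8-bit, or even narrower formats directly in silicon.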
That bit about doing the work in memory triggers a memory in me. Back when I was learning Mythryl and the functional programming paradigm (which is not related to C functions, but rather to the notion of functions in mathematics, meaning the same inputs to a Mythryl function always result in the same outputs, and there are No Side Effects), the late Jeff Prothero (aka Cynbe ru Taren), chief programmer, administrator, flunky, and janitor on the Mythryl project, mentioned to me that thread programming in Mythryl should be far, far more efficient than in other computing languages, simply because of the way data is naturally handled in functional programming languages. It’s all about data labels rather than data variables, so there is no copying of global data to and from processors as the data changes. Because it doesn’t.
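For what it’s worth, here’s a minimal sketch of that idea, in Python rather than Mythryl (my own illustration, not Prothero’s code): because the data is built once and never mutated, any number of threads can read the very same object in place, with no locks and no per-thread copies.

```python
from concurrent.futures import ThreadPoolExecutor

# An immutable value: bound once, never modified afterwards.
READINGS = tuple(range(1_000_000))

def total_above(data, threshold):
    """A pure function: same inputs, same output, no side effects."""
    return sum(x for x in data if x > threshold)

if __name__ == "__main__":
    thresholds = [100, 1_000, 10_000, 100_000]
    # Every worker receives a reference to the same tuple; nothing is copied
    # and nothing is locked, because nothing ever writes to it.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(total_above, [READINGS] * len(thresholds), thresholds))
    print(results)
```

In a language like Mythryl, that style isn’t a discipline you impose on yourself; it’s simply how the language works.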
I see no mention in the article of actually using better computing languages. An oversight by the author? Or by the industry?
Anyways, back to my thoughts on the article, as the above was nothing more than a self-important digression on my part. As the article notes, human brains still far out-perform computers per joule consumed:
Artificial intelligence breakthroughs have become a regular occurrence in recent years. One of the most impressive achievements so far was in 2016, when Google DeepMind’s AlphaGo AI beat champion Lee Sedol at one of the world’s most complex games, Go.
The feat made headlines around the world as an example of machines besting humans, but in some sense it wasn’t a fair fight. Sedol’s brain would have been consuming around 20 watts of power, with only a fraction of that being used for the game itself. By contrast, AlphaGo was using some 5000 watts.
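Those two figures also square with the £6000-a-year claim quoted earlier. A quick back-of-the-envelope check, where the electricity price is my own assumption (roughly UK retail at the time), not a number from the article:

```python
# Figures from the article, plus one assumed electricity price.
alphago_watts = 5000            # the article's figure for AlphaGo
brain_watts = 20                # the article's figure for Lee Sedol's brain
hours_per_year = 24 * 365
price_per_kwh = 0.14            # assumed ~£0.14/kWh; not from the article

kwh_per_year = alphago_watts / 1000 * hours_per_year    # ~43,800 kWh
cost_per_year = kwh_per_year * price_per_kwh             # ~£6,100, close to the quoted £6000
power_ratio = alphago_watts / brain_watts                # 250 times the brain's draw

print(f"~{kwh_per_year:,.0f} kWh/year, ~£{cost_per_year:,.0f}/year, {power_ratio:.0f}x a human brain")
```

So the machine was drawing roughly 250 times the power of the brain it was playing against, and that’s without counting the energy it took to train it in the first place.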
That 250-fold disparity suggests one of two possible explanations. First, our artificial-intelligence designs suck. I don’t give a lot of credence to this conclusion, because humans self-evidently come equipped with intelligence-specific hardware. The brain is specifically constructed, in some large fraction, to be intelligent. That it’s evolved rather than designed doesn’t matter; there are areas of the brain dedicated to intelligence. So perhaps the gap says less about our algorithms than about the general-purpose chips we run them on, and through clever hardware construction we can build more energy-efficient AI.
But that does lead to an alternative, highly controversial, and not yet supported conclusion:
Only quantum computers can hope to be as efficient as human brains because human brains work using quantum effects.
Yeah, I’m not going to be providing proof for that one. I understand from some pop-sci articles (so take it as you will) that there are serious scientists researching this possibility, and, given the abilities of a biological organ consuming only 20 watts, there is a certain inclination to wonder if it could be true.
But regardless of whether or not it’s true, it’s always hard to say self-driving cars are the future when you realize that there is a surplus of brains, housed in convenient transport modules and equipped with working limbs, that can just drive the bloody things themselves.
Self-driving cars, even if achieved, and contra Kevin Drum, may end up sitting next to the 3-D televisions in the discard bin at Best Buy. The spinoffs of that effort may be more interesting than the final product. We already know how to drive cars.