{"id":18055,"date":"2018-10-22T06:57:15","date_gmt":"2018-10-22T11:57:15","guid":{"rendered":"http:\/\/huewhite.com\/umb\/?p=18055"},"modified":"2018-10-22T06:57:15","modified_gmt":"2018-10-22T11:57:15","slug":"its-sucking-down-what","status":"publish","type":"post","link":"https:\/\/huewhite.com\/umb\/2018\/10\/22\/its-sucking-down-what\/","title":{"rendered":"It&#8217;s Sucking Down What?"},"content":{"rendered":"<p>As a software engineer, if there&#8217;s one thing I don&#8217;t worry about in my finished product, it&#8217;s <em>how much energy it&#8217;ll use calculating the final result<\/em>. That is, I take my electricity for granted, as well as my customers&#8217;.<\/p>\n<p>And it&#8217;s funny, because I&#8217;ve been made aware of the fact that our calculations are becoming more and more involved, although not in my area (I&#8217;m fairly mundane as far as programmers go). Two examples are <a href=\"https:\/\/huewhite.com\/umb\/2016\/04\/16\/engaging-hard-problems\/\" target=\"_blank\" rel=\"noopener\">climate forecasting<\/a> and <a href=\"https:\/\/huewhite.com\/umb\/2018\/06\/29\/currency-always-has-costs-ctd-7\/\" target=\"_blank\" rel=\"noopener\">crypto-currency calculations<\/a>, where both are consuming so much power that it&#8217;s becoming a concern for the future.<\/p>\n<p>But once again I&#8217;m surprised at the cost of computing, in a <em>schadenfreude<\/em> sort of way, in this fascinating <a href=\"https:\/\/www.newscientist.com\/article\/mg24031992-100-ais-dirty-secret-energy-guzzling-machines-may-fuel-global-warming\/\" target=\"_blank\" rel=\"noopener\">report<\/a> by Michael Le Page in <em><strong>NewScientist<\/strong><\/em> (13 October 2018, paywall):<\/p>\n<blockquote><p>It isn\u2019t widely appreciated how incredibly energy hungry AI is. If you ran AlphaGo non-stop for a year, the electricity alone would cost about \u00a36000. That doesn\u2019t matter for one-off events, like an epic Go showdown. 
But it does matter if billions of people want their smartphones to be truly smart, or have their cars drive themselves.<\/p>\n<p>Many potential uses of AI simply won\u2019t become widespread if they require too much energy. On the flip side, if the uses are so desirable or profitable that people don\u2019t care about the costs, it could lead to a surge in electricity consumption and make it even harder to limit further warming of the planet.<\/p>\n<p>AI consumes so much energy because the technique behind these recent breakthroughs, deep learning, involves performing ever more computations on ever more data. \u201cThe models are getting deeper and getting wider, and they use more and more energy to compute,\u201d says Max Welling of the University of Amsterdam, the Netherlands. &#8230;<\/p>\n<p>Take self-driving cars. These require all sorts of extra systems, from cameras to radar, that use power and also increase weight and drag, further increasing energy use. But the single largest consumer of energy besides the engine is the processor running the AI.<\/p>\n<p>According to a study out earlier this year,\u00a0<a href=\"https:\/\/doi.org\/10.1021\/acs.est.7b04576\">self-driving cars<\/a>\u00a0could use up to 20 per cent more energy than conventional cars. That is a big issue for a battery-powered car, limiting its range and increasing running costs. What\u2019s more, the study assumes the AI processor consumes about 200 watts, even though current prototypes consume in excess of 2000 watts.<\/p>\n<p>For taxi companies using AI to\u00a0<a href=\"https:\/\/www.newscientist.com\/article\/mg22630151-700-ai-interns-software-already-taking-jobs-from-humans\/\">directly replace human drivers<\/a>, the savings in wages would probably far outweigh the higher energy costs. But for ordinary car owners this would be a major issue.<\/p><\/blockquote>\n<p>Wow! It brings up a host of questions, doesn&#8217;t it? 
Of course, Le Page notes the industry is frenziedly trying to get around this energy consumption problem through such tricks as reducing precision where it&#8217;s not necessary and the invention of dedicated hardware, analogous to GPUs (Graphics Processing Units). I note, purely because I can say something vaguely relevant, this:<\/p>\n<blockquote><p>There are more revolutionary designs in the works, too. Shunting data back and forth between the memory and processor wastes a great deal of energy, says [Avishek Biswas of the Massachusetts Institute of Technology]. So he has developed a chip intended for smartphones that\u00a0<a href=\"http:\/\/news.mit.edu\/2018\/chip-neural-networks-battery-powered-devices-0214\">slashes energy<\/a>\u00a0use by around 95 per cent by carrying out key operations in the memory itself.<\/p><\/blockquote>\n<p>Which triggers a memory in me. Back when I was learning <a href=\"https:\/\/mythryl.org\" target=\"_blank\" rel=\"noopener\"><em><strong>Mythryl<\/strong><\/em><\/a> and the <a href=\"http:\/\/en.wikipedia.org\/wiki\/Functional_programming\" target=\"_blank\" rel=\"noopener\">functional programming paradigm<\/a> (which is not related to C functions, but rather to the notion of functions in mathematics, meaning the same inputs to a <em><strong>Mythryl<\/strong><\/em> function always result in the same outputs, and there are <strong>No Side Effects<\/strong>), the late Jeff Prothero (aka Cynbe ru Taren), chief programmer, administrator, flunky, and janitor on the <em><strong>Mythryl<\/strong><\/em> project, mentioned to me that thread programming in <em><strong>Mythryl<\/strong><\/em> should be, because of the way data is naturally handled in functional programming languages, far, far more efficient than in other computing languages. That&#8217;s because it&#8217;s all about <em>data labels<\/em> rather than <em>data variables<\/em>, so there is no copying global data to and from processors as the data changed. 
Because it didn&#8217;t.<\/p>\n<p>I see no mention in the article of actually using better computing languages. An oversight by the author? Or by the industry?<\/p>\n<p>Anyways, back to my thoughts on the article, as the above was nothing more than a self-important digression on my part. As the article notes, human brains still far outperform computers per joule consumed:<\/p>\n<blockquote><p>ARTIFICIAL intelligence breakthroughs have become a regular occurrence in recent years. One of the most impressive achievements so far was in 2016, when Google DeepMind\u2019s AlphaGo AI beat champion Lee Sedol at one of the world\u2019s most complex games, Go.<\/p>\n<p>The feat made headlines around the world as an example of machines besting humans, but in some sense it wasn\u2019t a fair fight. Sedol\u2019s brain would have been consuming around 20 watts of power, with only a fraction of that being used for the game itself. By contrast, AlphaGo was using some 5000 watts.<\/p><\/blockquote>\n<p>This suggests one of two possible explanations. First, <em>our artificial-intelligence designs suck<\/em>. I don&#8217;t give a lot of credence to this conclusion, because it seems self-evident that humans come equipped with intelligence-specific hardware. The brain is specifically constructed, in some large fraction, to be intelligent. That it&#8217;s evolved rather than designed doesn&#8217;t matter; there are areas of the brain dedicated to intelligence. So perhaps, through clever hardware construction, we can build more energy-efficient AI.<\/p>\n<p>But that does lead to an alternative, highly controversial, and not yet supported conclusion:<\/p>\n<p><em>Only quantum computers can hope to be as efficient as human brains, because human brains work using quantum effects.<\/em><\/p>\n<p>Yeah, I&#8217;m not going to be providing proof for that one. 
I understand from some pop-sci articles (so take it as you will) there are some high-level scientists who are researching this possibility, and, given the abilities of a biological organ consuming only 20 watts, there is a certain inclination to wonder if this could be true.<\/p>\n<p>But regardless of whether or not it&#8217;s true, it&#8217;s always hard to say <em>self-driving cars are the future<\/em> when you realize that there is a surplus of <em>brains<\/em>, housed in convenient transport modules and equipped with working limbs, that can just drive the bloody things themselves.<\/p>\n<p>Self-driving cars, even if achieved, and\u00a0<a href=\"https:\/\/www.motherjones.com\/kevin-drum\/2018\/10\/three-cheers-for-driverless-buses\/\" target=\"_blank\" rel=\"noopener\">contra Kevin Drum<\/a>, may end up sitting next to the 3-D televisions in the discard bin at <em><strong>Best Buy<\/strong><\/em>. The spinoffs of that effort may be more interesting than the final product. We already know how to drive cars.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>As a software engineer, if there&#8217;s one thing I don&#8217;t worry about in my finished product, it&#8217;s how much energy it&#8217;ll use calculating the final result. That is, I take my electricity for granted, as well as my customers&#8217;. 
And it&#8217;s funny, because I&#8217;ve been made aware of the fact \u2026 <a class=\"continue-reading-link\" href=\"https:\/\/huewhite.com\/umb\/2018\/10\/22\/its-sucking-down-what\/\"> Continue reading <span class=\"meta-nav\">&rarr; <\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-18055","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/18055","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/comments?post=18055"}],"version-history":[{"count":2,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/18055\/revisions"}],"predecessor-version":[{"id":18059,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/18055\/revisions\/18059"}],"wp:attachment":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/media?parent=18055"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/categories?post=18055"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/tags?post=18055"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}