{"id":2208,"date":"2015-09-22T22:53:07","date_gmt":"2015-09-23T03:53:07","guid":{"rendered":"http:\/\/huewhite.com\/umb\/?p=2208"},"modified":"2015-09-22T22:53:07","modified_gmt":"2015-09-23T03:53:07","slug":"the-future-of-smart-robots-ctd-4","status":"publish","type":"post","link":"https:\/\/huewhite.com\/umb\/2015\/09\/22\/the-future-of-smart-robots-ctd-4\/","title":{"rendered":"The Future of Smart Robots, Ctd"},"content":{"rendered":"<p>Anders Sandberg gets it.\u00a0 That is, that there are <a href=\"https:\/\/huewhite.com\/umb\/2015\/03\/19\/the-future-of-smart-robots\/\" target=\"_blank\">ethical questions<\/a> arising from the attempt to create an artificial intelligence.\u00a0 He <a href=\"https:\/\/www.newscientist.com\/article\/mg22730380-400-can-software-suffer-death-and-pain-in-digital-brains\/\" target=\"_blank\">writes<\/a> in <em><strong>NewScientist<\/strong><\/em> (12 September 2015, paywall):<\/p>\n<blockquote><p>It is the third problem that really interests me. <a href=\"http:\/\/www.tandfonline.com\/doi\/full\/10.1080\/0952813X.2014.895113#.Velih_lVhHw\">Would emulations feel pain?<\/a> Do we have to care for them like we do for animals or humans involved in medical research?<\/p><\/blockquote>\n<p>Exactly.\u00a0 If you achieve your goal &#8211; creating an artificial intelligence &#8211; then is it ethical to deactivate the program and turn off the hardware at the end of the day?\u00a0 Does the fact that we created that intelligence &#8211; depending on how you define <em>create<\/em>, as it&#8217;s very much a team enterprise &#8211; also give us the right to inflict pain upon it and end its existence?<\/p>\n<p>The answer may technically be <em>YES<\/em>, but it would be a measure of our maturity and intelligence to realize that causing anguish to a living, thinking being &#8211; one that may feel and think on our level &#8211; is a moral hazard.\u00a0 Anders agrees:<\/p>\n<blockquote><p>My suggestion is that it is 
better to be safe than sorry: assume that any emulated system could have the same mental properties as the organism or biological system it is based on, and treat it accordingly. If your simulation just produces neural noise, you have a good reason to assume there is nothing in there to care about. But if you make an emulated mouse that behaves like a real one, you should treat it like you would treat a lab mouse.<\/p><\/blockquote>\n<p>He then continues to even more interesting questions, ones that may be unique to emulations:<\/p>\n<blockquote><p>What about euthanasia? Living organisms die permanently, and death means the loss of their only chance at being alive. But an emulated brain could be restored from a backup: Lab Rat 1.0 would awake in the same way no matter how many copies had been tested in the past. The only thing lost when restoring it would be the memories of the previous experiment. There may still be pleasures and pains that count. In some ethical views, running a million supremely happy rat simulations in the background might be a \u201cmoral offset\u201d for doing something painful to one.<\/p><\/blockquote>\n<p>Maybe.\u00a0 But if this copy of the AI is aware of its imminent extinction, and that awareness causes it anguish, is that a problem?<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Anders Sandberg gets it.\u00a0 That is, that there are ethical questions arising from the attempt to create an artificial intelligence.\u00a0 He writes in NewScientist (12 September 2015, paywall): It is the third problem that really interests me. Would emulations feel pain? 
Do we have to care for them like we \u2026 <a class=\"continue-reading-link\" href=\"https:\/\/huewhite.com\/umb\/2015\/09\/22\/the-future-of-smart-robots-ctd-4\/\"> Continue reading <span class=\"meta-nav\">&rarr; <\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-2208","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/2208","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/comments?post=2208"}],"version-history":[{"count":4,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/2208\/revisions"}],"predecessor-version":[{"id":2212,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/2208\/revisions\/2212"}],"wp:attachment":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/media?parent=2208"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/categories?post=2208"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/tags?post=2208"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}