{"id":30066,"date":"2020-08-10T08:17:10","date_gmt":"2020-08-10T13:17:10","guid":{"rendered":"http:\/\/huewhite.com\/umb\/?p=30066"},"modified":"2020-08-10T08:17:10","modified_gmt":"2020-08-10T13:17:10","slug":"even-the-algorithms","status":"publish","type":"post","link":"https:\/\/huewhite.com\/umb\/2020\/08\/10\/even-the-algorithms\/","title":{"rendered":"Even The Algorithms"},"content":{"rendered":"<p>You&#8217;ve probably heard about this, but I&#8217;ll mention it anyways, <a href=\"https:\/\/www.newscientist.com\/article\/2246202-uber-and-lyft-pricing-algorithms-charge-more-in-non-white-areas\/\" target=\"_blank\" rel=\"noopener noreferrer\">from<\/a> <strong><em>NewScientist<\/em><\/strong>:<\/p>\n<blockquote><p>The algorithms that ride-hailing companies, such as Uber and Lyft, use to determine fares appear to create a\u00a0<a href=\"http:\/\/newscientist.com\/article\/2219284-uk-launched-passport-photo-checker-it-knew-would-fail-with-dark-skin\/\">racial bias<\/a>.<\/p>\n<p>By analysing transport and census data in Chicago, Aylin Caliskan and Akshat Pandey at The George Washington University in Washington DC have found that ride-hailing companies charge a higher price per mile for a trip if the pick-up point or destination is a neighbourhood with a higher proportion of\u00a0<a href=\"http:\/\/newscientist.com\/article\/2161028-face-recognition-software-is-perfect-if-youre-a-white-man\/\">ethnic minority<\/a>\u00a0residents than for those with predominantly white residents.<\/p>\n<p>\u201cBasically, if you\u2019re going to a neighbourhood where there\u2019s a large African-American population, you\u2019re going to pay a higher fare price for your ride,\u201d says Caliskan.<\/p><\/blockquote>\n<p><em><strong>Uber<\/strong><\/em> &amp; <em><strong>Lyft<\/strong><\/em> are not happy:<\/p>\n<blockquote><p>\u201cWe recognise that systemic biases are deeply rooted in society, and appreciate studies like this that look to understand where technology can 
unintentionally discriminate,\u201d said a Lyft spokesperson. \u201cThere are many factors that go into pricing \u2013 time of day, trip purposes, and more \u2013 and it doesn\u2019t appear that this study takes these into account. We are eager to review the full results when they are published to help us continue to prioritise equity in our technology.\u201d<\/p>\n<p><span lang=\"en-AU\">\u201cUber does not condone discrimination on our platform in any form, whether through algorithms or decisions made by our users,\u201d said an Uber spokesperson<\/span><span lang=\"en-AU\">. \u201cWe commend studies that try to better understand the impact of dynamic pricing so as to better serve communities more equitably. It\u2019s important not to equate correlation for causation and there may be a number of relevant factors that weren\u2019t taken into account for this analysis, such as correlations with land-use\/neighborhood patterns, trip purposes, time of day, and other effects.\u201d<\/span><\/p><\/blockquote>\n<p>I wonder if we&#8217;ll be seeing their proprietary algorithms and databases (the latter may be more important than the algorithms) stripped of their protected status. Or perhaps the courts will be appointing &#8220;special masters&#8221; to study the systems and determine why they&#8217;re discriminatory.<\/p>\n<p>And then see the companies ordered to &#8220;make them right.&#8221;<\/p>\n<p>And then see the companies cheat on the test, much like Volkswagen did a few years back on emissions tests.<\/p>\n<p>Or am I too cynical?<\/p>\n","protected":false},"excerpt":{"rendered":"<p>You&#8217;ve probably heard about this, but I&#8217;ll mention it anyways, from NewScientist: The algorithms that ride-hailing companies, such as Uber and Lyft, use to determine fares appear to create a\u00a0racial bias. 
By analysing transport and census data in Chicago, Aylin Caliskan and Akshat Pandey at The George Washington University in \u2026 <a class=\"continue-reading-link\" href=\"https:\/\/huewhite.com\/umb\/2020\/08\/10\/even-the-algorithms\/\"> Continue reading <span class=\"meta-nav\">&rarr; <\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-30066","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/30066","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/comments?post=30066"}],"version-history":[{"count":2,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/30066\/revisions"}],"predecessor-version":[{"id":30073,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/posts\/30066\/revisions\/30073"}],"wp:attachment":[{"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/media?parent=30066"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/categories?post=30066"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/huewhite.com\/umb\/wp-json\/wp\/v2\/tags?post=30066"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}