This Hole Looks Deep

Law professor Robert Chesney has a new phrase and an urgent warning about our shared computer future – it's gonna suck. But didn't we get this warning before? Here he is on Lawfare:

“We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg. As Julian Sanchez tweeted, “The prospect of any Internet rando being able to swap anyone’s face into porn is incredibly creepy. But my first thought is that we have not even scratched the surface of how bad ‘fake news’ is going to get.” Indeed.

Recent events amply demonstrate that false claims—even preposterous ones—can be peddled with unprecedented success today thanks to a combination of social media ubiquity and virality, cognitive biases, filter bubbles, and group polarization. The resulting harms are significant for individuals, businesses, and democracy. Belated recognition of the problem has spurred a variety of efforts to address this most recent illustration of truth decay, and at first blush there seems to be reason for optimism. Alas, the problem may soon take a significant turn for the worse thanks to deep fakes.

Get used to hearing that phrase. It refers to digital manipulation of sound, images, or video to impersonate someone or make it appear that a person did something—and to do so in a manner that is increasingly realistic, to the point that the unaided observer cannot detect the fake. Think of it as a destructive variation of the Turing test: imitation designed to mislead and deceive rather than to emulate and iterate.

Deep fakes. An example:

Fueled by artificial intelligence, digital impersonation is on the rise. Machine-learning algorithms (often neural networks) combined with facial-mapping software enable the cheap and easy fabrication of content that hijacks one’s identity—voice, face, body. Deep fake technology inserts individuals’ faces into videos without their permission. The result is “believable videos of people doing and saying things they never did.”
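
For the technically curious, the face-swap trick behind the original "deepfakes" code is an autoencoder with a shared encoder and one decoder per identity: train on each person's own faces, then feed person A's face through person B's decoder. Here's a rough PyTorch sketch of that architecture – all layer sizes and names are my own illustrative assumptions, not anyone's production code:

    import torch
    import torch.nn as nn

    # Shared encoder: learns a person-independent representation
    # (pose, expression, lighting) of a 64x64 face crop.
    encoder = nn.Sequential(
        nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),   # 64x64 -> 32x32
        nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),  # 32x32 -> 16x16
        nn.Flatten(),
        nn.Linear(64 * 16 * 16, 512),
    )

    def make_decoder() -> nn.Sequential:
        # One decoder per identity: learns to render that person's face.
        return nn.Sequential(
            nn.Linear(512, 64 * 16 * 16), nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    decoder_a, decoder_b = make_decoder(), make_decoder()

    # Training reconstructs each person's own faces (A->A, B->B).
    # The swap happens at inference: encode A, decode with B's decoder.
    face_of_a = torch.rand(1, 3, 64, 64)    # stand-in for a real face crop
    fake_b = decoder_b(encoder(face_of_a))  # B's likeness, A's expression
    print(fake_b.shape)                     # torch.Size([1, 3, 64, 64])

Because one encoder serves both identities, it's forced to learn pose and expression in a person-independent way – which is exactly what makes the swap believable.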

Maybe we'll soon be seeing film of Ted Cruz's father assassinating JFK – as candidate Trump once claimed he did. Or, for that matter, of Obama assassinating JFK – keeping in mind he was all of two years old at the time.

Or maybe it’ll be your face captured by security cameras during that midnight bank robbery. You end up in jail for five years.

Removing the technology seems unlikely – the cat is out of the bag, and sufficient computing power and the necessary algorithms are already widely available. Unless we're willing to give up computers, or at least this kind of processing – which I think unlikely in the extreme – we're facing a new sort of society, one in which constant tracking and recording may be necessary simply to protect one's privacy: one of those apparent contradictions fraught with peril.

Chesney indicates that detection technology exists but isn't keeping pace with the forgers:

Unfortunately, it is not clear that the defense is keeping pace for now. An arms race to fortify the technology is on, but Dartmouth professor Hany Farid, the pioneer of PhotoDNA (a technology that identifies and blocks child pornography), warns: “We’re decades away from having forensic technology that … [could] conclusively tell a real from a fake. If you really want to fool the system you will start building into the deepfake ways to break the forensic system.” This suggests the need for an increase—perhaps a vast increase—in the resources being devoted to the development of such technologies.

The only thought I'd have on the subject is that we need to detect any change to a recording between its initial creation and its viewing. Quantum-encrypted communications depend on quantum entanglement to detect when a channel has been compromised. I don't imagine it's really practical, but if an initial recording could be linked to something such that disturbing the recording broke the link, that might make it possible to detect forgeries. Perhaps some clever physicist could push that thought along.
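
Short of new physics, there's already a classical stand-in for that "link": have the camera sign a cryptographic hash of the footage the moment it's captured, so any later alteration breaks the signature check. A minimal sketch in Python, using the third-party cryptography library – the device key, helper names, and stand-in footage are my own illustrative assumptions:

    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    # At capture time: the recording device hashes the raw footage and
    # signs the digest with a key ideally held in tamper-resistant hardware.
    def sign_recording(video_bytes: bytes, device_key: Ed25519PrivateKey) -> bytes:
        digest = hashlib.sha256(video_bytes).digest()
        return device_key.sign(digest)

    # At viewing time: anyone with the device's public key can check
    # whether the footage still matches what was originally captured.
    def is_unmodified(video_bytes: bytes, signature: bytes,
                      public_key: Ed25519PublicKey) -> bool:
        digest = hashlib.sha256(video_bytes).digest()
        try:
            public_key.verify(signature, digest)
            return True
        except InvalidSignature:
            return False

    # Demo with hypothetical footage.
    key = Ed25519PrivateKey.generate()
    footage = b"raw video bytes"
    sig = sign_recording(footage, key)
    print(is_unmodified(footage, sig, key.public_key()))                # True
    print(is_unmodified(footage + b"tampered", sig, key.public_key()))  # False

The obvious limitation: this only proves the bytes haven't changed since signing, not that the camera recorded reality in the first place – and the whole scheme rests on keeping the signing key out of the forger's hands.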

And that previous warning? Wag The Dog (1997), where we actually see how the public could be manipulated using computer-generated images.
