This Hole Looks Deep, Ctd

Remember deepfakes, the anticipated production of video fakes that are difficult or impossible to detect? New Scientist (1 September 2018, paywall) reports they’re here:

A FAKE is only as good as it looks. And while forging a counterfeit handbag or watch takes time and effort, churning out fake videos has become surprisingly easy.

A new system can turn a few simple animated line drawings into realistic fake clips in high definition. The software is open source, meaning it is available to anyone – and it has reignited concerns that such tools could be used to warp our perception of the world. …

The resulting footage can be produced at 2K resolution and looks startlingly lifelike. Examples the team has produced include street scenes, and people talking or performing dance moves (arxiv.org/abs/1808.06601).

“It is sort of stunning, the progress that has been made,” says digital forensics expert Hany Farid at Dartmouth College in New Hampshire.

This type of video has become known as a deepfake, and fake videos of world leaders, such as Donald Trump and Theresa May, have been created using similar techniques. A community dedicated to creating faked pornography videos containing famous actors has sprung up too.

In case you don’t think this is a big deal, even prior to the development of this system, people have died because of the distribution of fake videos:

Fake videos have been implicated in the deaths of more than 20 people in India. This started after a video clip showing two men on a motorcycle snatching children on the streets went viral on WhatsApp.

The video was originally a public service announcement in Pakistan to raise awareness of human trafficking. However, it was edited to remove the message at the end. The clip was thought real, and widely spread WhatsApp messages pointed the finger at organ thieves disguised as beggars, which sparked public outrage leading to mob killings.

Essentially, assassinations, metaphorical and real, can now be arranged by the malign simply through distribution of a video of something that never happened, because they can depend on the mob mentality to complete the job.

That will be true until society decides in a collective manner — no doubt only after enough people are dead or ruined — to no longer trust a video. Electronic recordings of the visual aspect of reality are now transitioning from somewhat trustworthy to not trustworthy at all.

Impacts? I count the following:

  • Courts will try to accept only those recordings whose provenance is known and trusted, as they do to some extent already, but I suspect even that standard will grudgingly erode as more and more courts are fooled by technologies such as this.
  • The continued growth of an art form in which real people are placed in fictional situations. This is already happening, but as more and more artists become involved, it will evolve into who-knows-what.
  • A drop in the sales of real cameras as current and potential customers become disgusted by the entire phenomenon of recording reality.
  • The use of this technology to attack the very authenticity of someone’s identity through the production of suicide videos depicting the deaths of people who have not died. Though it might be regarded as a nuisance crime, at its most basic it’s an assassination of someone’s life, akin to today’s identity theft. The addition of difficult-to-identify bodies which may correspond in some way to the victim will make the situation especially difficult — and could spawn an industry in which people are actually murdered in order to supply bodies for the virtually murdered.

Will we, as a society, turn completely away from recordings of the visual aspect of reality? Or will we find a technological solution that saves this world-wide custom? I look forward to finding out.

WaPo’s Thomas Kent has a few thoughts on the subject, wrapping up with this:

Unless the dangers of fake video receive broad public attention now, the public will be caught unaware when truly convincing fakes appear, perhaps with disastrous results.

Finally, in publicizing the dangers, media need to avoid a tone of hopelessness — “Soon we may never know what is real and what isn’t.” Quality media outlets need to emphasize how carefully they vet video. They should make sure their ethics codes and verification procedures adequately address the dangers. Otherwise, audiences will doubt any video — including legitimate and important footage that media outlets gather in their own breaking-news coverage and investigative work.

Kent still holds out hope of salvaging the video recording, but he does acknowledge the possibility of its being completely lost.

And just for your delectation: [embedded video]

It’s gonna get worse, folks.


About Hue White

Former BBS operator; software engineer; cat lackey.
