Mona Lisa And Nancy Pelosi: The Implications Of Deepfakes

An altered video of Nancy Pelosi that appeared last week brought deepfakes back into the limelight. But how can we defend against this technology as it improves so quickly? (Image credit: CBS News)

Mona Lisa smiles

In a five-minute YouTube video, the Moscow-based Samsung team runs through the capabilities of their new facial mapping model. These include mapping someone’s face onto another video source using as little as a single input image, which makes it possible to animate the faces of oil paintings with some success, “despite the large domain gap between paintings and YouTube videos,” as the video’s narrator puts it.

The system is first meta-trained on a huge bank of talking-head videos, so that it learns general rules about how faces move before it ever sees the target person. To map a new face or create a realistic avatar, an “embedder” network measures parameters such as the size and location of the eyes, nose and mouth in the input image and converts them into an identity vector. A “generator” network then combines that vector with the “facial landmarks” extracted from each frame of the target video, synthesising new frames that keep the target’s pose and range of expression but wear the input face. Finally, a “discriminator” network scores how realistic each synthesised frame looks and how well it matches the landmarks, and the generator is trained against it until the superimposed face becomes convincing.
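To make that division of labour concrete, here is a minimal PyTorch sketch of how the three networks fit together. It is an illustration of the general embedder/generator/discriminator idea, not the Samsung team’s actual architecture: the layer sizes, the 68-point landmark representation and the flattened frames are all assumptions made for brevity.

```python
# Minimal sketch of the three-network layout described above (illustrative only).
import torch
import torch.nn as nn

EMBED_DIM = 512          # assumed size of the identity embedding vector
LANDMARK_DIM = 68 * 2    # assumed 68 (x, y) facial landmarks per frame
IMG_PIXELS = 3 * 64 * 64 # small, flattened stand-in for a video frame

class Embedder(nn.Module):
    """Maps one or more reference images of a person to a single identity vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(IMG_PIXELS, 1024), nn.ReLU(),
                                 nn.Linear(1024, EMBED_DIM))
    def forward(self, ref_images):            # (N, IMG_PIXELS)
        return self.net(ref_images).mean(0)   # average over the reference images

class Generator(nn.Module):
    """Synthesises a frame from target-video landmarks plus the identity vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LANDMARK_DIM + EMBED_DIM, 1024), nn.ReLU(),
                                 nn.Linear(1024, IMG_PIXELS), nn.Tanh())
    def forward(self, landmarks, identity):
        return self.net(torch.cat([landmarks, identity], dim=-1))

class Discriminator(nn.Module):
    """Scores how realistic a frame looks and how well it matches the landmarks."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(IMG_PIXELS + LANDMARK_DIM, 1024), nn.ReLU(),
                                 nn.Linear(1024, 1))
    def forward(self, frame, landmarks):
        return self.net(torch.cat([frame, landmarks], dim=-1))

# One illustrative forward pass on random stand-in data.
embedder, generator, discriminator = Embedder(), Generator(), Discriminator()
ref_images = torch.randn(8, IMG_PIXELS)       # a few reference images of the person
target_landmarks = torch.randn(LANDMARK_DIM)  # pose/expression taken from the target video
real_frame = torch.randn(IMG_PIXELS)          # the matching real frame

identity = embedder(ref_images)
fake_frame = generator(target_landmarks, identity)
score_fake = discriminator(fake_frame, target_landmarks)
score_real = discriminator(real_frame, target_landmarks)
print(score_fake.item(), score_real.item())
```

Meta-training this setup across many identities is what allows it to adapt to a new face, or even a painted one, from a handful of images rather than hours of footage.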

The Pelosi predicament

Disturbingly accurate deepfakes built by individuals are also appearing online. These are created either by training AI models from scratch or by combining AI tools with off-the-shelf video editing software. One YouTube channel, Ctrl Shift Face, has slowly been accruing views on its deepfake videos (3 million in total). In the first and most impressive clip, actor and comedian Bill Hader’s face morphs seamlessly into Al Pacino and Arnold Schwarzenegger as he does vocal impressions of the actors. While the capabilities of official AI research projects are improving rapidly, there also seem to be a number of self-taught individuals developing their own techniques, whether for entertainment (as with Ctrl Shift Face) or for more malicious purposes. This is a worrying trend: as news stories become harder to verify, visual media can spread on social media sites with little official regulation or verification, and more sinister practices such as revenge porn can be used to blackmail, slander or shame individuals.
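A common approach in hobbyist face-swap software is a shared encoder with one decoder per identity: both people’s faces are compressed through the same encoder, and the swap is performed by decoding one person’s frames with the other person’s decoder. The sketch below illustrates that idea in PyTorch; the layer sizes, face-crop format and training step are illustrative assumptions, not the code behind any particular channel’s videos.

```python
# Minimal sketch of the shared-encoder, two-decoder face-swap idea (illustrative only).
import torch
import torch.nn as nn

FACE_PIXELS = 3 * 64 * 64  # assumed size of an aligned, flattened face crop
LATENT = 256               # assumed size of the shared latent code

shared_encoder = nn.Sequential(nn.Linear(FACE_PIXELS, 1024), nn.ReLU(),
                               nn.Linear(1024, LATENT))
decoder_a = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(),
                          nn.Linear(1024, FACE_PIXELS), nn.Tanh())
decoder_b = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(),
                          nn.Linear(1024, FACE_PIXELS), nn.Tanh())

def train_step(faces, decoder, optimizer):
    """Train the shared encoder and one decoder to reconstruct one person's faces."""
    recon = decoder(shared_encoder(faces))
    loss = nn.functional.mse_loss(recon, faces)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Train each decoder on its own person's face crops (random stand-in data here).
opt_a = torch.optim.Adam(list(shared_encoder.parameters()) + list(decoder_a.parameters()), lr=1e-4)
opt_b = torch.optim.Adam(list(shared_encoder.parameters()) + list(decoder_b.parameters()), lr=1e-4)
train_step(torch.randn(16, FACE_PIXELS), decoder_a, opt_a)
train_step(torch.randn(16, FACE_PIXELS), decoder_b, opt_b)

# The swap: encode a frame of person A, then decode it with person B's decoder,
# keeping A's pose and expression but rendering B's face.
frame_of_a = torch.randn(1, FACE_PIXELS)
swapped = decoder_b(shared_encoder(frame_of_a))
```

Because the encoder is shared, it learns the pose and expression information common to both identities, while each decoder learns to render one specific face, which is why decoding one person’s pose with the other person’s decoder produces the swap.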

Wider implications

However, the perpetrators of deepfakes are incredibly difficult to track down. Across the free and largely unregulated expanse of the internet, users share lessons, tips, videos and different versions of the software, and in doing so they make the tools better and the network of content harder to navigate.

Devil in the details

Deepfake technology is clearly becoming incredibly sophisticated, as shown by the impressive realism the Samsung researchers in Moscow achieve from just one input image, and by the home-made tools that blend actors’ faces using YouTube and film footage. The full extent of the risk this presents is not yet clear, but politicians, celebrities and private citizens have already been targeted with false material used to coerce, intimidate or harass them.

Governments, regulators and individuals need to become more aware of the tell-tale signs of deepfake material, so that even the most sophisticated material can be called out as quickly as possible.

Regulators must navigate a difficult legal landscape around free-speech and ownership laws to properly regulate the use of this technology before it can do significant damage. The wider usage of facial recognition technology must also be considered in light of these capabilities, so that the risks of collecting and distributing facial data are properly accounted for. It is impressive to see the pace at which deepfake technology has progressed, but as with any technology involving AI and personal data, a balance must be struck between open innovation and proper regulation.

Charles Towers-Clark

CEO of Pod Group (@PodGroup_IoT). Author of “The WEIRD CEO”. Advocate of Employee Self Responsibility, #FutureOfWork and #AI. Contributor: http://Forbes.com