Jean B

Deep-Fake Videos as a Political Weapon

Remember when the image-editing software Photoshop was new and people joked around by cutting a celebrity’s image from a source picture and superimposing it over another person’s image – usually in an amusing or compromising pose? This practice has become so common that the product name has become a verb, as in, “That image of the Queen Mother in a wrestling ring must be Photoshopped.”


Graphics-handling tools have come a long way since software leader Adobe Systems introduced its powerful product in 1988. One lesson learned was that seeing is not necessarily believing. Yet image editing, when done well, can be hard for the unsuspecting viewer to detect.

Most people assume that information posted online is true and correct – but often it is neither.

To further muddy the waters of Reality versus Fiction, now there is video software that can superimpose one image onto another – and it is rapidly becoming more sophisticated.

Termed deepfake, from a combination of “deep learning” and “fake,” it is “an artificial intelligence-based human image synthesis technique” that can “combine and superimpose existing images and videos onto source images or videos.”

Deep-learning algorithms are the software tools that allow savvy tech users to edit and manipulate digital video content. Snapchat has a popular Face Swap feature that lets users switch faces with one another in live videos. Hollywood filmmakers can use deep-learning algorithms to include deceased actors in their movies.
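To make the idea concrete, a common face-swap architecture trains one shared encoder plus a separate decoder per person; swapping means encoding person A’s face and decoding it with person B’s decoder. Below is a minimal, pure-Python toy sketch of that trick. The “faces” are made-up 4-number feature vectors and the networks are single linear layers – hypothetical stand-ins chosen for illustration, nothing like a production deepfake pipeline:

```python
import random

random.seed(0)

# Toy stand-ins for face data: 4-dimensional feature vectors for two
# people (hypothetical numbers; a real system would use face image crops).
FACES_A = [[0.9, 0.1, 0.8, 0.2], [0.8, 0.2, 0.9, 0.1]]
FACES_B = [[0.1, 0.9, 0.2, 0.8], [0.2, 0.8, 0.1, 0.9]]
DIM, LATENT = 4, 2

def rand_matrix(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in m]

def mse(x, y):
    return sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)

# One shared encoder, one decoder per identity -- the core of the trick.
encoder = rand_matrix(LATENT, DIM)
decoder_a = rand_matrix(DIM, LATENT)
decoder_b = rand_matrix(DIM, LATENT)

def train_step(face, decoder, lr=0.05):
    """One gradient-descent step on reconstruction error for one face."""
    z = matvec(encoder, face)        # encode into the shared latent space
    out = matvec(decoder, z)         # decode back with this identity's decoder
    err = [o - f for o, f in zip(out, face)]
    # Gradients for both linear layers, computed before any weights change.
    dec_grad = [[2 * err[i] * z[j] / DIM for j in range(LATENT)]
                for i in range(DIM)]
    enc_grad = [[sum(2 * err[i] * decoder[i][j] for i in range(DIM)) / DIM * face[k]
                 for k in range(DIM)] for j in range(LATENT)]
    for i in range(DIM):
        for j in range(LATENT):
            decoder[i][j] -= lr * dec_grad[i][j]
    for j in range(LATENT):
        for k in range(DIM):
            encoder[j][k] -= lr * enc_grad[j][k]
    return mse(out, face)

initial_loss_a = mse(matvec(decoder_a, matvec(encoder, FACES_A[0])), FACES_A[0])
initial_loss_b = mse(matvec(decoder_b, matvec(encoder, FACES_B[0])), FACES_B[0])

# Train both decoders against the same shared encoder.
for _ in range(3000):
    for f in FACES_A:
        loss_a = train_step(f, decoder_a)
    for f in FACES_B:
        loss_b = train_step(f, decoder_b)

# The swap: encode person A's face, then decode with person B's decoder.
swapped = matvec(decoder_b, matvec(encoder, FACES_A[0]))
print("reconstruction losses:", round(loss_a, 4), round(loss_b, 4))
```

Because the encoder is shared, it learns features common to both faces (pose, expression), while each decoder learns to render one specific identity – which is why decoding A’s latent code with B’s decoder produces “B making A’s expression.”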

In the true spirit of scientific discovery, this exciting new technology is also commonly applied to generate…pornography? Oh yes.

The Digital Ethics Lab explains how easy it is to make deep-fake pornography:

“Due to recent technological developments, even those with modest coding experience are now able to manipulate pornographic videos, so they appear to feature non-pornographic celebrities, or even a user’s friends or acquaintances.”

Armed with either a large collection of photos or only a few minutes of video of the target person’s face, a prankster can produce a very realistic fake video featuring that person in short order.

Consider this Facebook video, which casts the animated heads of five world leaders as defenders on a soccer (British football) team facing a kicker with the talking head of U.S. President Donald Trump. Although the quality of this video is so bad that we can all tell it is a fake intended to be humorous (or embarrassing), other deep-fake videos are of much higher quality.

Check out Deep Video Portraits as an example of where deep-fake video technology is heading. The high quality of these experimental moving image samples makes it hard to identify them as fictional – but they are.

Michael Zollhöfer, Visiting Assistant Professor at Stanford University’s Department of Computer Science, Computer Graphics Laboratory, writes in the Deep Video Portraits project abstract:

“We present a novel approach that enables photo-realistic re-animation of portrait videos using only an input video. In contrast to existing approaches that are restricted to manipulations of facial expressions only, we are the first to transfer the full 3D head position, head rotation, face expression, eye gaze, and eye blinking from a source actor to a portrait video of a target actor.”

The Verge reveals that social media platforms like Reddit, Discord, Gfycat, Pornhub, and Twitter “have already made their anti-face-swap porn policy clear.”

Earlier this year, Reddit out-and-out banned its face-swap “subreddit” porn community – which was approaching 100,000 users at the time! Deeming it “involuntary pornography,” Reddit now prohibits “depictions that have been faked” and “explicit content or soliciting ‘lookalike’ pornography.”

The law regarding deep-fake pornography is gray. At stake are underlying issues of both legality and morality. Wired points out that:

“In case after case, the First Amendment has protected spoofs and caricatures and parodies and satire.”

Celebrity victims of deep-fake pornography can “sue for the misappropriation of their images,” but for Just Plain Folks “your best hope is anti-defamation law.”

The deep-fake video phenomenon is offending so many people (especially women) that several countries, including the United Kingdom and the United States, are considering new laws to protect innocent citizens from being targeted this way without their prior knowledge or consent.


Durham University law professor Claire McGlynn sees a big problem ahead that needs proactive legislation to protect individuals’ rights and reputations:

“With the digital age and more and more platforms coming out, people always find a way to weaponize them.”

McGlynn is of the opinion that “It’s not actually that difficult to draft a law which covers all forms of image-based sexual abuse.”

Observers are now speculating that deep-fake technologies, which are already being created to push a political point of view, might be used to sway the next U.S. midterm elections in the fall of this year (2018).

Artificial intelligence insiders are, in fact, placing bets on “whether or not someone will create a so-called Deep-fake video about a political candidate that receives more than 2 million views before getting debunked by the end of 2018.”

The tongue-in-cheek gamble doesn’t involve much risk for the players:

“Manhattan cocktails as a reward for the ‘yes’ camp and tropical tiki drinks for the ‘no’ camp.”

Although this particular bet could be considered just another form of amusement, it seems to be only a matter of “when” – and not “if” – the world of social media will grow murkier and more dangerous for Truth Seekers everywhere, as deep-fake videos become completely realistic.


2 Comments
  1. Post Author

    This is an important report — and most timely. Thank you!

  2. Post Author

    This article is extremely important; however, the number of people who will take the time to read it and understand that what one sees is not necessarily reality is likely very small. We should all take care to research ideas or positions held by “politicos,” for example, and not rely solely on what we see in photos and videos.
