Deepfake videos can make anyone say or do anything

Imagine being able to make it look like any person in the world is doing or saying anything you want. With deepfakes, this is not only possible, but free and easy.

For those who don’t know, deepfakes are videos that use deep learning (the “deep” in “deepfake”) to superimpose someone’s face onto another person’s body or to impersonate another person entirely. A basic example of the technology can be seen below:

As you can see, the technology is incredibly realistic. With a skilled vocal impersonator, it can be very difficult to tell a deepfake from an actual video.

Deepfakes aren’t the result of some sort of top-secret CIA technology; anyone can download an app and make them without much hassle. Deepfakes are created by a pair of machine learning algorithms: one generates manipulated footage while the other tries to flag it as fake, and the generator keeps improving until the “realism” judge can reliably be fooled. The technology works best when the algorithms are fed tons of footage, which explains why presidents, actors, and other public figures are the most frequent subjects of deepfakes.
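That generate-then-judge loop is the idea behind generative adversarial networks, the technique commonly used for deepfakes. As a loose illustration only (not real deepfake code), here is a toy adversarial loop in plain NumPy: a two-parameter “generator” learns to mimic a target distribution of numbers (standing in for genuine footage) by trying to fool a simple “discriminator.” Every name, number, and distribution here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data stands in for genuine footage: samples from a Gaussian.
REAL_MEAN, REAL_STD = 4.0, 1.25

# Generator: g(z) = a*z + b, starting nowhere near the real distribution.
a, b = 1.0, 0.0
# Discriminator (the "realism" judge): D(x) = sigmoid(w*x + c).
w, c = 0.1, 0.0

lr, batch = 0.05, 64
for step in range(3000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    x_real = rng.normal(REAL_MEAN, REAL_STD, batch)
    x_fake = a * rng.normal(size=batch) + b
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w += lr * np.mean((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * np.mean((1 - d_real) - d_fake)

    # Generator step: nudge a and b so the discriminator scores fakes as real.
    z = rng.normal(size=batch)
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

# After training, the generator's samples should have drifted toward the
# real distribution, even though it never saw a real sample directly.
fakes = a * rng.normal(size=10_000) + b
print(f"fake-sample mean: {fakes.mean():.2f} (real mean: {REAL_MEAN})")
```

Real deepfake systems do the same dance with millions of parameters over video frames instead of two parameters over numbers, which is why they need so much footage of their target.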

There are several ramifications of this technology and nearly all of them are scary. Before we dive into all that, here’s the only harmless example of deepfaking that we’ve seen: Nicolas Cage being inserted into movies he never starred in.

And here’s Steve Buscemi’s face mapped over Jennifer Lawrence for some terrifying reason.

Despite how great those videos are, the vast majority of deepfakes are created for far more nefarious purposes. Currently, the most common use of deepfakes by far is to paste the faces of female celebrities onto performers in pornographic videos.

Numerous victimized women have spoken out against this gross trend, including Scarlett Johansson, who said, “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired. The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause … the internet is a vast wormhole of darkness that eats itself.”

It’s not just celebrity women either. An increasing number of people have fallen victim to deepfake revenge porn from vengeful exes. Because the technology is so new, there’s no concrete legislation to protect people from being deepfaked into porn. Much discussion has been had as to whether the creators of these videos could be charged with identity theft, harassment, or cyberstalking. At the very least, nearly every major porn site bans the uploading of deepfake videos. Mainstream websites like Twitter, Reddit, and Gfycat have also banned the posting of deepfake videos.


In addition to sexual harassment, deepfakes could also have a devastating effect on politics. Deepfakes could be used to make powerful people appear to say alarming or sensational things. It also works in reverse: politicians gain plausible deniability if they get caught saying or doing something embarrassing (for example, if deepfakes had been around in 2016, President Trump could have claimed the infamous “grab them by the ****” tape was a deepfake). The last U.S. presidential election already had enough fake news floating around, so imagine the chaos and misinformation that would be caused by widespread deepfake use.

While national defense organizations like the CIA are at work developing technology that can distinguish deepfakes, the moral of the story is to use research and critical thinking whenever you see a video that seems suspicious. Deepfakes are the newest addition to the post-truth/fake news era, and it is more important than ever that people carefully consider the sources of what they watch.
