Deep Fakes on the Deep Web

By Madison Kacir

If a picture is worth a thousand words, then a video is worth far more: moving footage is far more convincing than a still image. In today’s society, video content bombards us every day, and its messages shape public opinion. What happens when people can’t trust video footage of their leaders speaking to them? What happens when we can no longer trust one of the most widely used forms of communication in today’s society? That is a real possibility with the refinement of new software that lets people alter video footage, transpose new faces onto other people, and change what they appear to have said.

Fake Videos Are Surfacing at an Alarming Rate

“Deepfakes” are the newest tech threat we face. This artificial intelligence tool makes it remarkably easy to alter someone’s face and speech in a video, and to the naked eye there are very few clues that the footage is fake. Worse, anyone with a computer can make these videos; most amateurs use a program called FakeApp. Believe it or not, you are already familiar with the basics of face-swapping software: Snapchat’s funny face filters use a rudimentary version of the same idea. These advances in face-morphing technology are cutting edge. Just a few years ago, creating a believable computer-altered video took considerable time and money. Now it’s available to anyone with a computer.

What are Deep Fake Videos?

Deepfake videos use a machine learning tool to transpose faces, so that someone can appear to “say” something they never did. Anyone with access to a computer can make deepfakes, and the basic concept is simple. Suppose you want to create a video of Person A, but with Person B’s face. First, you must gather thousands of pictures of both people. Next, you need an encoder and decoders to deconstruct and then reconstruct each image: the encoder extracts the facial features so they can be transposed from Person B onto Person A.

The encoder compresses facial features into a shared representation, while a decoder reconstructs a face image from it. The encoder is shared between both people, but each person gets their own decoder; because they are trained against the same encoder, the two decoders line up in the same feature space. The video is then processed frame by frame: Person A’s face is extracted from each frame, fed through the shared encoder, and reconstructed with Person B’s decoder. In this way, the output shows Person A’s pose and expression rendered with Person B’s facial features.
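The shared-encoder, per-person-decoder idea can be sketched in miniature. The toy NumPy example below is an illustrative assumption, not FakeApp’s actual code: real deepfake tools train deep convolutional networks, while here untrained random matrices simply stand in for the encoder and the two decoders to show how a swap routes a frame through the other person’s decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "faces": flattened 8x8 grayscale patches; sizes are illustrative.
FACE_DIM, LATENT_DIM = 64, 16

# One encoder shared by both people maps any face into a common
# latent feature space...
W_enc = rng.normal(size=(LATENT_DIM, FACE_DIM)) * 0.1

# ...while each person gets their own decoder back to pixel space.
W_dec_a = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.1
W_dec_b = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.1

def encode(face):
    """Compress a face into the shared latent representation."""
    return W_enc @ face

def decode(latent, W_dec):
    """Reconstruct a face from the latent code with a given decoder."""
    return W_dec @ latent

# The swap: take a frame of Person A, encode it with the shared
# encoder, then decode with Person B's decoder, yielding A's pose
# and expression rendered with B's facial features.
frame_of_a = rng.normal(size=FACE_DIM)
swapped = decode(encode(frame_of_a), W_dec_b)
print(swapped.shape)  # prints (64,)
```

In a real pipeline this encode/decode step runs on every frame of the video, and the decoders are trained on the thousands of collected pictures so that each reconstructs its own person convincingly.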

Just How Much of a Threat Does This Software Pose?

While people could use this software to play a prank, it has more serious uses as well. Much like fake news, deepfakes can deceive viewers and sway public opinion. Some deepfake videos on the internet are harmless jokes, but more often than not these videos are malicious. The software first gained popularity on the online forum Reddit, where it was originally used for pranks and illicit videos. Recently, however, individuals have put it to more harmful ends. Fake videos now pose enough of a threat to catch the attention of the media, local police, and even the FBI.

In conclusion, face-morphing software poses a serious threat to society. Professionals have developed ways to spot fakes, but once a video is on the internet, it is hard to take back. People should exercise extreme caution while viewing videos online, because what they are watching could in fact be fabricated. It is always best to check facts against several sources and make sure those sources are credible. Police forces across the nation are working to combat this new and pressing issue. Fake news articles have progressed into fake videos; if you believe something could be fabricated, cross-check the facts.
