Deepfake video technology is the next big thing in fake news

Fake news is one of the scourges of our increasingly connected world. With billions of people having instant access to social media and the web, misinformation can spread like wildfire. To separate the bogus from the real, we typically rely on concrete proof, a video, for example, that something was actually said or done.

But thanks to the unstoppable march of technology, even videos are now at risk of being faked convincingly. With emerging and terrifyingly advanced face tracking and video manipulation methods, a new era of disinformation is looming.

From a politician saying words that were never spoken to a celebrity doing things that were never done, the threat of these ultra-realistic fake videos, now collectively known as Deepfakes, can no longer be ignored.

If we’re not careful, the next fabricated video scandal that could threaten national security or sway public opinion is just around the corner.

What are Deepfake videos?

Deepfake technology is an emerging technique that uses facial mapping, artificial intelligence and deep learning to create ultra-realistic fake videos of people saying and doing things they never actually said or did.

And the scary part? The technology is improving at such a rapid pace that it’s getting increasingly difficult to tell what’s fake.

Now, with deep learning, all it takes is a collection of images and video clips of a particular person. Deepfake software scans this material, then learns to mimic the target’s voice, facial expressions and even individual mannerisms. In time, without specialized forensic tools, these Deepfake videos will become indistinguishable from the real deal.
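
To make that “scanning” step a little less abstract, here is a minimal Python sketch of the kind of data collection a Deepfake tool performs first: detecting and saving face crops from a source video so a model can later learn from them. It uses the open-source OpenCV library; the file name target_interview.mp4 is a hypothetical example, and real Deepfake tools rely on far more sophisticated detectors and training pipelines.

```python
# Minimal sketch: collect face crops from a video as training material.
# Assumes OpenCV (cv2) is installed; the input file name is hypothetical.
import cv2

# OpenCV ships with a pretrained Haar cascade face detector
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

video = cv2.VideoCapture("target_interview.mp4")
saved = 0

while True:
    ok, frame = video.read()
    if not ok:
        break  # end of video
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces in the frame and save each crop to disk
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.3,
                                                  minNeighbors=5):
        cv2.imwrite(f"face_{saved:05d}.jpg", frame[y:y+h, x:x+w])
        saved += 1

video.release()
print(f"Collected {saved} face images")
```

Thousands of crops like these, gathered from many angles and expressions, are what give the software enough material to mimic a person convincingly.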

We can’t trust our eyes anymore

The mass accessibility of Deepfake software has worrying implications of its own. Now, even your average Joe can create a realistic fake video of anyone saying whatever he wants them to say. With this technology in everyone’s hands, it will only get harder to separate the truth from the lies.

And it’s not just misinformation that we need to worry about. Realistic Deepfake videos can also be used in blackmail attempts, phishing schemes and extortion scams.

According to cybersecurity lawyer Steven Teppler, “Deepfake videos provide even the most unsophisticated criminals with the tools to create (and with minimal effort) realistic, hard to detect (at least without deep forensic analysis) video recordings that can impersonate and fool anyone, including law enforcement.”

“They could be used in extortion, implicate innocent people in crimes, and in civil court proceedings, these fraudulent videos can be used to carry out all kinds of fraudulent claims or defenses,” Teppler added.

Note: Steven Teppler will debut his very own Cyberlaw Now podcast soon on the KomandoCast Network. Stay tuned!

So far, cruder versions of Deepfake technology are already widely used in fake celebrity porn and comedy gags, but given how rapidly the technique is improving, it’s only a matter of time before we start seeing videos with far more serious consequences.

In a Deepfake video future, what can we do?

As we enter this new era of video misinformation, perhaps our biggest weapon against Deepfakes is awareness.

Once we recognize that powerful video manipulation tools are now widely accessible and easy for anyone to use, we can be more critical and mindful of the video content we encounter every day.

Fortunately, the U.S. government is already knee-deep in developing technologies that can detect Deepfake videos. The U.S. Defense Advanced Research Projects Agency (DARPA), for example, is two years into a four-year program to find methods of combating fake videos and images.

Hopefully, these fake-video-busting techniques will develop quickly enough to keep up with the rapid advances in Deepfake technology itself.
