If you thought fake news was bad, you won’t believe what’s coming. Widely seen as the next generation of hoaxes, deepfakes are videos that use artificial intelligence to edit faces, motion, and speech with scary accuracy. In a deepfake video, you can make any politician say anything you want, or swap the face of someone you know onto a salacious picture for instant blackmail. The worst part: it’s getting harder to tell them apart from the real thing.
Before you panic, it might be an AI voice on the phone and not a relative in danger
Who can you trust in our AI-driven world of deepfakes and automatically generated messages? “Deepfake” is a blend of “deep learning” and “fake,” and some of these fabrications are so convincing that even experts can be fooled.
One scam on our radar takes the word “deceptive” to a new level. Instead of a human con artist placing the call, criminals use AI to mimic the voices of your loved ones and demand financial help.
Keep reading to discover how thieves incorporate artificial intelligence (AI) into their scams.
Phone scammers using AI technology
One senior couple in Saskatchewan received an alarming call from what sounded like their grandson, Brandon. As reported by the Washington Post, the voice on the phone was desperately asking for bail money: Brandon was supposedly in jail and needed help quickly.
The two scrambled to pull the funds from their savings and nearly sent the money. Thankfully, a manager at their local bank caught wind of the story and warned them that scammers were making the rounds, tricking innocent people into sending cash.
These weren’t the clever but generic impersonations of the past. While older phone-based imposter scams relied on professional voice actors to convince victims that their friends or family were in trouble, this new wave of fake calls is created with artificial intelligence. How?
If you’re familiar with the deepfake technology used to mimic celebrities, this tale will feel hauntingly familiar. With just a few seconds of a target’s recorded voice, criminals can leverage machine learning to craft a conversation about anything. The common denominator: the fake voice always asks for money.
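To give you a sense of how low the barrier has become, here’s a minimal sketch using Coqui TTS, an open-source text-to-speech library whose XTTS v2 model can clone a voice from a short reference clip. The file names and the spoken line below are placeholder assumptions for illustration, not anything tied to a real scam.

# A sketch of how little audio a voice clone needs, using the open-source
# Coqui TTS library (pip install TTS). File names are placeholders.
from TTS.api import TTS

# Load XTTS v2, a multilingual model that supports voice cloning
# from a short reference recording.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of someone's voice is enough for a passable clone.
tts.tts_to_file(
    text="Hi, it's me. I'm in trouble and need you to send money right away.",
    speaker_wav="reference_clip.wav",  # short sample of the target's voice
    language="en",
    file_path="cloned_voice.wav",
)

The point isn’t this specific library; it’s that the hard part is no longer the technology. All a scammer needs is a short clip of your voice, and social media offers plenty.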
Fakes like these are challenging enough for professional analysts to detect when there’s a visual element to scrutinize. With audio alone, the task becomes nearly impossible.
Voice cloning has been used in scams like this since at least 2019. Since then, threat actors have stolen millions of dollars from businesses and ordinary people like you.
Once the scammer has enough information about your social circle, they can pick from a catalog of AI voices and tune one to match the recorded audio of the person they’re impersonating. You might be surprised by how close they can get. And once the money is sent, there is often no way to recoup the loss.
How to tell when an AI-generated call is fake

A few habits go a long way. Hang up and call the person back on a number you already have saved. Ask a question only the real person could answer, or agree on a family code word ahead of time. Treat urgency, secrecy and requests for wire transfers, gift cards or cryptocurrency as red flags. And remember: Caller ID can be spoofed, so a familiar number on the screen proves nothing.