A slew of Trump fakes are coming – How to spot them
Folks, before you even think about writing me a note saying, “You wrote this ‘cause you hate Trump” or “You wrote this ‘cause you love Trump,” don’t waste your time. I’m not talking politics today, and I’m not taking sides. I’m protecting you from falling for deepfakes and the absolute barrage of junk you can expect on social media in the coming days, weeks and months.
As you well know, former President Donald Trump was just found guilty on all counts in a hush-money trial. Just like any time there’s a massive news story, hackers and scammers are going to exploit it as much as possible. I’ve got the scoop on how to browse safely, regardless of your politics.
Fakes thrive on feeling
There’s a reason so many deepfakes go viral. They’re created to stoke your emotions (mad, sad, scared, outraged — you name it) and get you to hit “share.”
Almost all the AI-generated junk online is peddled for clicks on social media, not published by major news outlets. These publications still get tripped up, of course, but it’s rare. You need to be extra careful with anything posted by an account you’ve never heard of.
What kind of stuff are we talking about?
Following the Trump news, I’m expecting it all: fake fundraisers, deepfake videos, apps that promise the inside scoop, crowdfunding campaigns claiming all the proceeds are going to this or that political party, malware disguised as games or PDFs of court proceedings, and more … a lot more.
Really, anything with the name “Trump” on it is a ripe target for malware and scams. Stick with news outlets and political sources you already use and trust.
How to spot deepfakes
Fakes of public figures like Trump are particularly tricky to spot because there’s so much public footage of politicians speaking in front of similar backgrounds to copy. But you can still use these guidelines to judge whether an image is AI-generated:
- Backgrounds: A vaguely blurred background, smooth surfaces or lines that don’t match up are immediate red flags that an image is AI-generated. Watch clothing patterns, too.
- Context: Use your head. If the scenery doesn’t align with the current climate, the season or what’s physically possible, that’s a sign it’s fake.
- Proportions: Check for objects that look mushed together or seem too large or small. The same goes for features, especially ears, fingers and feet.
- Angle: Deepfakes are most convincing when the subject is facing the camera directly. Once a person turns to the side and starts to move, glitches may appear.
- Text: AI can’t spell. Look for fake words on signs and labels.
- Chins: The lower half of the face is the No. 1 giveaway on AI-generated candidate videos. It’s subtle, but check to see if their chin or neck moves unnaturally or in an exaggerated way.
- Fingers and hands: Look for weird positions, too many fingers, extra-long digits or hands out of place.
🔎 Pro tip: Before you hit “share” on any image or video, try a reverse image search. Open Google and click Images at the top. You can drag and drop or upload a photo from your desktop.
Also, if you don’t see the pic in question elsewhere, that’s a bad sign. Legitimate photos are going to end up on a lot of reputable sites quickly.
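If you’d rather script that check than click through Google’s site, here’s a minimal Python sketch that opens a reverse image search in your browser for a photo that’s already hosted at a public URL. The “searchbyimage” URL pattern it relies on is an informal convention, not an official Google API, so treat this as a starting point rather than a guarantee.

```python
# Minimal sketch: open a Google reverse image search for an image URL.
# Assumption: the image is already hosted somewhere public, and Google's
# informal "searchbyimage" URL pattern still redirects to results.
import urllib.parse
import webbrowser


def reverse_image_search(image_url: str) -> None:
    """Open a Google reverse image search for the given image URL in the default browser."""
    query = urllib.parse.urlencode({"image_url": image_url})
    webbrowser.open(f"https://www.google.com/searchbyimage?{query}")


if __name__ == "__main__":
    # Hypothetical example URL -- swap in the image you actually want to verify.
    reverse_image_search("https://example.com/suspicious-photo.jpg")
```

For photos saved on your desktop rather than hosted online, the drag-and-drop upload on Google Images is still the simplest route.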
I’m sharing this because I want you to stay safe out there. Now, do the people in your life a favor and share this story to help them out, too. Use the buttons below to make it easy.
Tags: apps, camera, Deepfake, deepfake videos, Google, hackers, malware, photos, scam, social media, video, videos