Protecting older adults from AI-powered deepfake attacks

TIAA’s Global Cybersecurity and Fraud Management team uses artificial intelligence (AI) and other sophisticated technologies to detect cyberattacks more quickly and accurately. We do this to protect our clients’ data and accounts and to strengthen how we operate. With this in mind, today we’re releasing a forward-looking article on the rapid growth of AI-powered audio and video deepfake cyberattacks – many of which target older adults. We examine the various types of voice and video deepfakes, explain how they’re used in romance, tech support, investment, and robocall scams, and suggest practical actions older adults can take to detect and avoid these crimes.

Imagine this hypothetical scenario: You’re older and retired, enjoying a nice, relaxing day when your smartphone rings.

“Grandpa, it’s John, your grandson. I’ve been in a bad car accident. It was my fault. I’m being taken to jail. I need you to wire me $10,000 as bail money. It’s really urgent. And please don’t tell anyone. I’m so scared. I’m so sorry, Grandpa.”

The voice on the phone sounds exactly like your grandson. You’re worried and want to help. You scramble to find a way to quickly send him money.

You have no reason to believe the call is fake. But it is. You’re the target of a voice cloning deepfake romance scam, where the “romance” in this case is your close relationship with your grandson. These scammers present false stories with malicious intent; often, they want to steal your money. These crimes are happening on a broad scale and in a variety of ways, especially against older adults.

Tips: If your grandson calls saying he’s in distress and needs money, hang up and call him back at the number you already have for him, or get in touch with another family member or close friend for support and to share what has happened. Families should also agree on a code word they can use to confirm each other’s identity when calling. If you have doubts, ask for the code word; if the caller doesn’t know it, that’s your red flag that the call is a deepfake.

Tech support deepfake

Here’s another situation to watch out for. You get an unexpected phone call at home from a person claiming to be with Microsoft or another high-tech company. The caller informs you that your PC needs an immediate software upgrade or it will become infected. The voice sounds official, and the caller asks for your password so they can log in to your PC. If you share that password, you’ll become the victim of a tech support scam using voice deepfake technology.

These attacks are widespread. In the United States, tech support fraud was the number one crime type impacting complainants over 60, with nearly 18,000 complaints and almost $600 million in reported losses, according to the FBI.

Tip: If a supposed IT person calls you out of the blue and you don’t know who they are, don’t give them any information. Tell a family member about the call and decide together what to do; such calls are almost never legitimate.

Investment voice deepfake

Now envision a situation in which you suddenly get a phone call from someone saying they’re an investment professional. Sounding official and legitimate, they urge you to send them money so they can invest it in cryptocurrency or another “great” opportunity. You send the money and never see it again. This is a voice deepfake investment scam.

Tip: Be cautious of urgent requests for money, especially ones made in a strange or unconventional manner. Listen for vocabulary or a tone that sounds odd. Be wary of any call asking you to send money, for any reason, to anyone. If a person you don’t know calls you about an investment opportunity, don’t engage.

Be vigilant to avoid “robocall” scams

You may also be the target of a “robocall” voice cloning deepfake. In this type of scam, a bad actor calls and asks, “Can you hear me?” Once you answer “yes,” they have a recording of your voice. They’ll hang up and can potentially use that recording as authorization to make a payment on a credit card they stole from you.

Tip: If someone you don’t know calls and immediately asks, “Can you hear me?”, don’t respond; answering “yes” gives them a recording of your voice.

All of these scenarios underscore how pervasive and insidious voice deepfakes are becoming.

Listeners have fewer contextual clues when hearing a recording than when watching a video, where odd facial expressions, video glitches, or incorrect anatomical or background details could tip a viewer off, according to Bradley, a national law firm.

Video deepfakes

Although we’ve been focusing on voice deepfakes because they’re so widely used against older adults, you should also be vigilant about spotting and avoiding video deepfakes.

In one such scenario, you may be sent a video from a person claiming to be an Internal Revenue Service (IRS) agent. They may look official, say they need you to send the IRS an immediate payment for taxes you owe, and threaten to arrest you if you don’t.

Tips: An IRS official won’t send you a video asking you to do anything; the agency doesn’t do business this way. Don’t engage with the person. You might ask a family member to help you report the incident to the Federal Bureau of Investigation.

Alert: Scammers may impersonate these organizations with deepfakes

When you get an unexpected call claiming to be from any of the organizations listed below, you might be the target of a deepfake. Fraudsters often use these brand names to come across as more convincing.

  • U.S. Postal Service
  • Wells Fargo
  • WhatsApp
  • UPS
  • Amazon

Targeting of older adults

Scammers target older adults with deepfakes because they tend to be more trusting and receptive to personalized voice and video calls.

“Older people may be particularly vulnerable due to the added layer of personalization being used by perpetrators through the AI-generated impersonation of loved ones,” according to Dinesh Napal of the American Bar Association. “This deepens existing vulnerabilities older people may have to financial scams. They may be even more likely to act on the scammer’s request, or at least take fewer steps to discern whether the request is genuine, because they may be motivated by fear of harm or distress being inflicted on their loved one.”

Complicating this further, it’s generally more difficult for older adults to detect deepfakes than for younger adults, according to a Center for Strategic & International Studies report: some 58% of 20-year-olds can detect audio deepfakes, compared with 53% of 60-year-olds and 49% of 80-year-olds.

Upside: Detecting deepfakes

Despite these challenges, you can take steps to avoid voice and video deepfake attacks. Educate yourself, be aware that deepfakes may be targeting you, and be careful with anything sent to you unexpectedly that demands an urgent response. When that happens, the prudent move is to slow down.

Detecting voice cloning deepfakes

Here’s a list of voice deepfake red flags to listen for (a minimal screening sketch follows the list):

  • Listen for subtle discrepancies, strange tones, or odd, interrupted voice sounds. Do you hear a robotic-sounding voice? AI voices often make awkward pauses, clip words short, or put emphasis in unnatural places.
  • Analyze voices for inconsistent noises and unnatural speech patterns.
  • Avoid posting long recordings of your voice anywhere publicly available; scammers can use them to clone your voice.
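To make the first two red flags concrete, here is a minimal, illustrative Python sketch of what automated screening for long pauses and machine-like tonal uniformity might look like. The file name and both thresholds are assumptions for illustration; real deepfake detection relies on trained models, and this toy heuristic should not be treated as a reliable detector.

```python
# Toy heuristic screening of a recording for two red flags above:
# unnaturally long pauses and an overly uniform, machine-like tone.
# Thresholds and the file name are illustrative assumptions only.
import librosa
import numpy as np

def screen_recording(path: str, sr: int = 16000) -> list[str]:
    y, sr = librosa.load(path, sr=sr)
    flags = []

    # Find voiced segments; the gaps between them are pauses.
    intervals = librosa.effects.split(y, top_db=30)
    pauses = [
        (start - prev_end) / sr
        for (_, prev_end), (start, _) in zip(intervals, intervals[1:])
    ]
    if pauses and max(pauses) > 1.5:  # arbitrary cutoff, in seconds
        flags.append(f"unusually long pause: {max(pauses):.2f}s")

    # Spectral flatness varies over natural speech; very low variance
    # across the clip can indicate a monotone, synthetic delivery.
    flatness = librosa.feature.spectral_flatness(y=y)[0]
    if np.std(flatness) < 0.01:  # arbitrary cutoff
        flags.append("suspiciously uniform tone across the recording")

    return flags

if __name__ == "__main__":
    for warning in screen_recording("incoming_call.wav"):  # hypothetical file
        print("red flag:", warning)
```

Dedicated detection models inspect far subtler artifacts, but the structure is the same: extract acoustic features from the audio, then flag values that fall outside the range of natural speech.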

Detecting video deepfakes

There are several steps you can take to avoid video deepfakes:

  • When you see a video that seems odd, question whether it’s real and whether what the person is saying makes sense.
  • Don’t click links or attachments within suspicious videos; this helps you avoid having malicious software downloaded onto your PC or smartphone.
  • If you’re not sure whether a video is real, exit the email and search for the website of the person or organization that sent it. Determine whether it’s a legitimate organization, whether they posted the video on their site, and whether the company even exists (see the sketch after this list).
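As a small illustration of that last step, the Python sketch below compares a link’s actual host against official domains you have looked up yourself. The domains and URLs are hypothetical examples, and a real phishing check involves much more than domain matching.

```python
# Illustration of checking where a link actually points before trusting
# it. The "official" domains below are examples you would look up
# yourself; this is a sketch, not a complete phishing check.
from urllib.parse import urlparse

OFFICIAL_DOMAINS = {"irs.gov", "usps.com", "amazon.com"}  # assumed examples

def looks_official(link: str) -> bool:
    host = (urlparse(link).hostname or "").lower()
    # Match the end of the hostname, not a substring anywhere: a scam
    # host like "irs.gov.secure-pay123.com" contains "irs.gov" but is
    # registered under a completely different domain.
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

print(looks_official("https://www.irs.gov/payments"))           # True
print(looks_official("https://irs.gov.secure-pay123.com/tax"))  # False
```

The key detail is matching the end of the hostname rather than searching for the brand name anywhere in the link, since scam hosts often embed a trusted name as a decoy prefix.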

Examine faces, hands, lighting

Detecting deepfakes is about looking for clues that something is not right. Examine the video for these specific deepfake characteristics (a simple automated example of one such check follows the checklist):

Face

  • Oddly shaped face
  • Face and voice misalignment
  • Limited movement
  • Face and neck size discrepancies
  • Face rendered at higher quality than the rest of the video, which looks grainy or blurry

Eyes

  • Eyes misaligned with the direction of gaze
  • Eyes converging and fixing on a single point (overly focused)
  • Eyes uncorrelated, looking painted or glued on
  • Unnatural blinking, or no blinking at all
  • Mismatched shades of eye color
  • Distorted shapes of glasses
  • Unusual, implausible glare in glasses
  • Glasses that cast unrealistic shadows

Hands/Fingers

  • Too many fingers
  • Oddly shaped or disproportionately sized fingers
  • Unusual size proportions between fingers and hands
  • Abnormally small or large hands
  • Hand gestures that don’t match what the person is saying

Lighting/Shadows

  • Lighting on the face that seems illogical (too dark, too light, or misplaced)
  • Background lighting or glare that doesn’t match natural lighting
  • Unusually placed, sized, or configured shadows
  • Background lighting out of sync with the lighting on the face and surroundings

Lips

  • Lip movement out of sync with the audio
  • Speaker’s voice that doesn’t fit the lip movement
  • Lip size and color that don’t match the speaker’s face
  • Lip movements that lack natural human pauses

Skin

  • Unnatural skin tone
  • Age incongruity between skin, hair, and eyes
  • Facial skin tone that doesn’t match the body’s skin tone

Facial Hair

  • Overly smooth hair
  • Wrinkly eyes
  • Facial hair that looks glued on

Motion

  • Discontinuous, herky-jerky, and unnatural movements
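To show how one item on this checklist can be checked automatically, here is a minimal, illustrative Python sketch that counts blinks in a video using OpenCV and MediaPipe’s FaceMesh. The file name, the eye-aspect-ratio heuristic, and the threshold are assumptions for illustration, not a production detector.

```python
# Toy blink counter: natural speakers blink every few seconds, while
# some deepfakes blink rarely or not at all (see "Eyes" above).
import cv2
import mediapipe as mp

# Left-eye landmark indices in MediaPipe's FaceMesh topology.
TOP, BOTTOM, OUTER, INNER = 159, 145, 33, 133
EAR_THRESHOLD = 0.20  # eye treated as closed below this ratio (heuristic)

def count_blinks(video_path: str) -> tuple[int, float]:
    """Return (blink count, approximate video duration in seconds)."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    blinks, eye_closed, frames = 0, False, 0
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frames += 1
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue  # no face detected in this frame
            lm = result.multi_face_landmarks[0].landmark
            # Eye aspect ratio: vertical opening over horizontal width.
            ear = abs(lm[TOP].y - lm[BOTTOM].y) / (
                abs(lm[OUTER].x - lm[INNER].x) + 1e-6
            )
            if ear < EAR_THRESHOLD and not eye_closed:
                blinks, eye_closed = blinks + 1, True
            elif ear >= EAR_THRESHOLD:
                eye_closed = False
    cap.release()
    return blinks, frames / fps

if __name__ == "__main__":
    n, seconds = count_blinks("suspicious_clip.mp4")  # hypothetical file
    print(f"{n} blinks over {seconds:.1f} seconds")
```

A rate well below roughly one blink every ten seconds would align with the “unnatural or no blinking” red flag above, though lighting and video quality can also throw off a simple heuristic like this one.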

Deepfake stats

Statistics illuminate the magnitude of the deepfake problem older adults are facing. Here are some of the most salient.

Among people surveyed about deepfake scams:

  • 31% say they have experienced some kind of AI voice scam
  • 27% feel confident they could tell whether a call from a loved one or friend was real or AI-generated
  • 53% say AI has made it harder to spot online scams
  • 43% say they’ve seen deepfake content
  • 26% have encountered a deepfake scam
  • 9% have been a victim of a deepfake scam

Source: McAfee

  • From 2022 to 2023, there was a 10-fold increase in the number of deepfakes detected globally across all industries

Source: Sumsub

Disclaimer: This article is provided for education and initial awareness. Because this is a rapidly changing field, we cannot assure you that it is complete or that it addresses your specific circumstances. We urge you to remain informed and vigilant.
