Protecting older adults from AI-powered deepfake attacks
Upendra Mardikar
EVP, Chief Information Security Officer at TIAA. Author, Advisor, Mentor
TIAA’s Global Cybersecurity and Fraud Management team uses artificial intelligence (AI) and other sophisticated technologies to detect cyberattacks more quickly and accurately. We do this to protect our clients’ data and accounts, strengthen how we operate, and lead our industry. With this in mind, today we’re releasing a forward-looking article on the rapid growth of AI-powered audio and video deepfake cyberattacks, many of which target older adults. We examine various types of voice and video deepfakes and explain how they’re used in romance, tech support, investment, and robocall scams. The article also suggests practical actions older adults can take to detect and avoid these crimes.
Imagine this hypothetical scenario: you’re older and retired, enjoying a nice relaxing day, when your smartphone rings.

“Grandpa, it’s John, your grandson. I’ve been in a bad car accident. It was my fault. I’m being taken to jail. I need you to wire me $10,000 as bail money. It’s really urgent. And please don’t tell anyone. I’m so scared. I’m so sorry, Grandpa.”

The voice on the phone sounds exactly like your grandson. You’re worried and want to help, so you scramble for a way to send him money quickly.

You have no reason to believe the call is fake. But it is. You’re the target of a voice cloning deepfake romance scam, and the “romance” in this case is your close relationship with your grandson. These scammers present false stories with malicious intent, usually to steal your money. These crimes are happening on a broad scale, and in a variety of ways, especially against older adults.

Tips: If your grandson calls saying he’s in distress and needs money, hang up and call him back at the number you have for him, or get in touch with another family member or close friend for support and to share what has happened. Family members should also agree on a code word they can ask for to confirm each other’s identity on a call. If you have doubts, ask for the code word; if the caller doesn’t know it, that’s your red flag it’s a deepfake.
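For technically minded family members, the code-word check is essentially a shared-secret challenge. The sketch below is purely illustrative; the code word, function name, and comparison logic are all hypothetical, and in real life the “check” is simply a question asked over the phone:

```python
import hmac

# Hypothetical example: the family's agreed-upon code word acts as a
# shared secret. A cloned voice alone cannot pass this check, because
# the scammer never learned the secret.
FAMILY_CODE_WORD = "blue-heron-42"  # assumed value for illustration

def caller_is_verified(spoken_code_word: str) -> bool:
    """Return True only if the caller supplies the agreed code word.

    hmac.compare_digest compares the two strings without short-circuiting;
    here it simply stands in for 'check the secret exactly'.
    """
    return hmac.compare_digest(spoken_code_word.strip().lower(),
                               FAMILY_CODE_WORD)

print(caller_is_verified("Blue-Heron-42"))   # the real grandson
print(caller_is_verified("grandpa please"))  # a cloned voice guessing
```

The design point is the same one the tip makes: verification must rest on something the scammer does not have (the secret), not on something AI can now fake (the voice).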
Tech support deepfake

Here’s another situation to watch out for. You get an unexpected phone call at home from someone claiming to be with Microsoft or another high-tech company. The caller tells you your PC needs an immediate software upgrade or it will be infected with malware. The voice sounds official and asks for your password to log in to your PC. If you share that password, you become the victim of a tech support scam using voice deepfake technology.

These attacks are widespread. In the United States, tech support fraud was the number one crime type affecting complainants over 60, with nearly 18,000 complaints and almost $600 million in reported losses, according to the FBI.

Tip: If an IT person calls you out of the blue and you don’t know who they are, tell a family member about the call before doing anything. It’s probably not legitimate, and you should not give the caller any information.
Investment voice deepfake

Now envision a situation in which you suddenly get a phone call from someone saying they’re an investment professional. Sounding official and legitimate, they urge you to send them money so they can invest it in cryptocurrency or another “great” opportunity. You send the money and never see it again. This is a voice deepfake investment scam.

Tip: Be cautious of urgent requests for money, especially those made in a strange or unconventional manner. Listen for vocabulary or tone that sounds odd. Be wary of any call asking for money, for any reason, from anyone. If a person you don’t know calls you about an investment opportunity, don’t engage.
Be vigilant to avoid “robocall” scams

You may also be the target of a “robocall” voice cloning deepfake. In this type of scam, a bad actor calls and asks, “Can you hear me?” Once you answer “yes,” they have a recording of your voice. They hang up and can potentially use that recording as authorization to make a payment on a credit card they stole from you.

Tip: If someone you don’t know calls and immediately asks, “Can you hear me?”, don’t respond, and don’t give them the chance to record your voice.

All of these scenarios underscore how pervasive and insidious voice deepfakes are becoming. Listeners have fewer contextual clues when hearing a recording than when watching a video, where odd facial expressions, video glitches, or incorrect anatomical or background details could tip a viewer off, according to Bradley, a national law firm.
Video deepfakes

Although we’ve focused on voice deepfakes because they’re so widely used against older adults, you should also be vigilant about spotting and avoiding video deepfakes.

In one such scenario, you may receive a video from a person claiming to be an Internal Revenue Service (IRS) agent. They may look official, say they need you to send the IRS an immediate payment for taxes you owe, and threaten to arrest you if you don’t.

Tips: An IRS official won’t send you a video asking you to do anything; the agency doesn’t do business this way. Don’t engage with the person. You might ask a family member to help you report the incident to the Federal Bureau of Investigation.
Alert: Fraudsters impersonate trusted organizations

When you get an unexpected call claiming to be from a government agency, a bank, or a well-known technology company, you might be the target of a deepfake. Fraudsters often use trusted brand names to come across as more convincing.
Targeting of older adults

Scammers target older adults with deepfakes because they tend to be more trusting and more receptive to personalized voice and video calls.

“Older people may be particularly vulnerable due to the added layer of personalization being used by perpetrators through the AI-generated impersonation of loved ones,” according to Dinesh Napal of the American Bar Association. “This deepens existing vulnerabilities older people may have to financial scams. They may be even more likely to act on the scammer’s request, or at least take fewer steps to discern whether the request is genuine, because they may be motivated by fear of harm or distress being inflicted on their loved one.”

Complicating this further, older adults generally find it harder to detect deepfakes than younger adults, according to a Center for Strategic & International Studies report: some 58% of 20-year-olds can detect audio deepfakes, compared with 53% of 60-year-olds and 49% of 80-year-olds.
Upside: Detecting deepfakes

Despite these challenges, you can take steps to avoid voice and video deepfake attacks. Educate yourself, be aware that deepfakes may be targeting you, and be careful with anything sent to you unexpectedly that demands an urgent response. When such things occur, it’s prudent to slow down.

Detecting voice cloning deepfakes

Here’s a list of voice deepfake red flags to look out for:

- Urgent, emotional requests for money, especially by wire transfer, gift card, or cryptocurrency
- Requests for secrecy (“please don’t tell anyone”)
- Vocabulary or tone that sounds odd for the person supposedly calling
- Unexpected calls asking for passwords, account details, or other personal information
- A caller who can’t answer something only the real person would know, such as a family code word
Detecting video deepfakes

There are several steps you can take to avoid video deepfakes: slow down, question any unexpected video that demands action, verify the sender through another channel, and examine the video closely for visual clues.
Examine faces, hands, lighting

Detecting deepfakes is about looking for clues that something is not right. Examine the video for these specific deepfake characteristics:

Face: Blurring, warping, or mismatched edges where the face meets the hair, neck, or background.

Eyes: Unnatural blinking, too little or too much, or a fixed, glassy stare.

Hands/Fingers: Extra, missing, or merged fingers, or hands that deform as they move.

Lighting/Shadows: Light and shadows that fall on the face inconsistently with the rest of the scene.

Lips: Mouth movement that doesn’t sync with the audio.

Skin: Texture that looks too smooth, waxy, or inconsistent in tone.

Facial Hair: Beards, mustaches, or eyebrows with blurry or flickering edges.

Motion: Jerky or unnatural head and body movement between frames.
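For readers comfortable with a short script, the visual checks above can be treated as a simple tally: the more red flags you spot, the more skeptical you should be. This is an illustrative sketch only, not a real deepfake detector; the cue wording, thresholds, and function name are all hypothetical:

```python
# Illustrative sketch: a manual checklist tally, not an automated detector.
# The cue names mirror the categories described above.
CUES = [
    "face edges look blurred or warped",
    "eyes blink unnaturally or stare",
    "hands show extra or merged fingers",
    "lighting and shadows don't match",
    "lip movement is out of sync with audio",
    "skin looks unnaturally smooth",
    "facial hair edges flicker",
    "motion is jerky between frames",
]

def suspicion_rating(observed: set[str]) -> str:
    """Count how many red-flag cues were observed and rate the video."""
    hits = sum(1 for cue in CUES if cue in observed)
    if hits == 0:
        return "no red flags noted"
    if hits <= 2:
        return "be cautious - verify through another channel"
    return "likely deepfake - do not act on the request"

print(suspicion_rating({
    "hands show extra or merged fingers",
    "lip movement is out of sync with audio",
    "motion is jerky between frames",
}))
# prints: likely deepfake - do not act on the request
```

The thresholds are arbitrary; the point is the habit the article recommends: slow down, count what looks wrong, and verify through another channel before acting.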
Deepfake stats

Statistics illuminate the magnitude of the deepfake problem older adults are facing. Verification provider Sumsub has published some of the most salient figures on people who encountered or were victimized by a deepfake scam.
Disclaimer: This is provided to you for education and initial awareness; because this is a rapidly changing field, we cannot assure you that it is complete or that it addresses your specific circumstances. We urge you to remain informed and vigilant.