Three Examples of How Fraudsters Used AI Successfully for Payment Fraud – Part 2: Deepfake Video
Institute of Financial Operations & Leadership (IFOL)
We exist to support your career advancement in Finance Operations (P2P/O2C, Accounts Payable/Receivable and Leadership)
A Real Case of Payment Fraud by Deepfake Video
By Debra R. Richardson, MBA, CFE, CAPP
According to IBM, generative artificial intelligence (AI) tools are deep-learning models that can generate high-quality text, images, and other content based on the data they were trained on. AI tools appear to have nearly unlimited potential to make most of what you do easier and more efficient, both at home and at work. You and many of your colleagues may be waiting for your companies to approve access to AI tools. Fraudsters are not waiting. AI tools are helping fraudsters make their requests appear legitimate, and it is proving successful in perpetrating payment fraud.
In this three-part fraud series, we are looking at three real cases of payment fraud using AI tools and identifying how you can mitigate that fraud.
Victim Loss $25.6M: Are Those Really Your Colleagues On That Video Call?
A CNN article entitled "Finance worker pays out $25 million after video call with deepfake 'chief financial officer'" describes the incident. Why was the payout made? The article stated that the finance worker received an email he suspected was phishing because it asked for a "secret transaction." However, the payment was made after a video conference attended by multiple internal employees he recognized. The problem is that the fraudsters used AI tools to generate video of those employees. All that is needed to create a deepfake is a sample of the victim's video and voice. Since many finance leaders appear on podcasts, webinars, videos and other public content, video and audio samples of them are readily available to fraudsters.
If your financial operations leaders and accounts payable and vendor team members work remotely, or in different offices, you may be relying on video conference calls to conduct business. According to Notta.ai, between 2020 and 2022 virtual meetings increased from 48% to 77%, with 43% of workers preferring to be on camera.
While some videos generated by AI contain a watermark, others do not. Best practice is to understand how to mitigate this emerging fraud trend.
How Can You Mitigate AI Deepfake Videos to Avoid Payment Fraud?
Staying Safe
Fraudsters are improving their tools to make their fraudulent requests appear legitimate in order to perpetrate payment fraud. And it is working. The best way to mitigate AI deepfake videos is to avoid relying only on video conference calls to approve payments. Implement authentication techniques, internal controls, best practices, and vendor validations across the accounts payable, vendor setup and maintenance, and payment processes.
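To make the idea of an out-of-band control concrete, here is a minimal sketch of what such a check could look like in code. It is purely illustrative and not from the article or any real payment system: the threshold, channel names, and data structure are assumptions. The point it demonstrates is that a high-value payment request that arrived over a single channel, such as a video call, is held until independent verifications are recorded.

```python
# Hypothetical sketch of a "hold until verified out-of-band" payment control.
# The threshold, channel names, and classes below are illustrative assumptions,
# not a real accounts payable system or API.

from dataclasses import dataclass, field

@dataclass
class PaymentRequest:
    vendor_id: str
    amount: float
    requested_via: str                     # e.g. "video_call", "email"
    verifications: set = field(default_factory=set)

HIGH_VALUE_THRESHOLD = 10_000              # assumed review threshold
REQUIRED_CHANNELS = {"callback_to_number_on_file", "secondary_approver"}

def release_payment(req: PaymentRequest) -> bool:
    """Release funds only when independent, out-of-band checks are recorded."""
    # Low-value requests that did not arrive via video call follow the normal workflow.
    if req.amount < HIGH_VALUE_THRESHOLD and req.requested_via != "video_call":
        return True
    missing = REQUIRED_CHANNELS - req.verifications
    if missing:
        print(f"HOLD: payment to {req.vendor_id} blocked; missing {sorted(missing)}")
        return False
    return True

# Example: a $25M request that came in only via a video call is held,
# then released once both independent verifications are logged.
req = PaymentRequest(vendor_id="V-1001", amount=25_000_000, requested_via="video_call")
release_payment(req)                       # -> False (held)
req.verifications.update(REQUIRED_CHANNELS)
release_payment(req)                       # -> True (released)
```

The design choice the sketch highlights is that no single communication channel, however convincing, should be sufficient on its own to authorize a high-value payment.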
Check back for Part 3 of this series: FraudGPT