Sound of deception: Why audio deepfakes are more dangerous than video
Imagine you are in a video call where you are just introducing yourself. The person on the other side has already sought permission to record the interaction, and, naive to the range of possibilities, you agree. It is not uncommon for companies to record their Zoom meetings for future reference; in fact, it is the norm. The shock comes next: simply by recording your voice samples, the person on the other end has enough material to clone your voice.
Electoral manipulation and public deception
Deepfakes are an extraordinary technology: they can replicate the original so impeccably that the result appears highly credible, which also makes them prone to misuse. Among deepfakes, audio and video are the formats most sought after for manipulation. Last year, hours before polls closed in the Slovakian elections, a fake audio clip of one of the candidates claiming to have rigged the election went viral. Enraged voters lined up at polling booths in huge numbers in an attempt to reverse the supposed rigging, and they ended up delivering a heavy defeat to that candidate. The deepfake audio stirred the opposing camp to act fast and in unison, changing the candidate's electoral fortunes. More recently, London mayor Sadiq Khan was targeted with fake audio of him making inflammatory remarks calling for pro-Palestinian marches.
Pervasive threat of audio deepfakes
While audio and video deepfakes are equally harmful in ruining hard-earned reputations, audio is far cheaper to produce, easier to edit, and harder to trace.
In the recent Indian elections, deepfakes were used extensively to influence poll outcomes.
The rising threat of audio deepfakes
Audio is just as potent a tool for misuse. In January, a voice message mimicking US President Joe Biden went viral. The audio urged New Hampshire voters not to vote in the Democratic primary and instead to “save your vote for the November election”. The message was alarming because it demonstrated how easily such audio can be edited, how cheaply it can be produced, and how difficult the creator is to trace. All it takes is a malicious individual, free software, and a large voice database to wreak havoc on the election process. In the deepfake ecosystem, audio is beating video in terms of impact. It is only a matter of time before AI can produce video that is just as cheap and easy to edit; until then, audio will remain the more destructive threat.
About pi-labs
At pi-labs.ai, we are on a mission to keep the internet clean using our advanced deepfake detection tool, Authentify.
Try Authentify: https://pi-labs.ai/try-authentify/
Schedule a demo call with our team: https://calendly.com/pi-labsdemo/30min?month=2024-06
Ankush Tiwari
Founder & CEO, pi-labs