Journalism in the days of deepfakes: How Artificial Intelligence can be used to control a false narrative
Fola Daniel Adelesi
International Keynote Speaker | Training Facilitator | PR & Comms | Corporate Event Host
Let me start by hitting the nail on the head. The truth is that Artificial Intelligence will, on one hand, make your work easier and on the other hand, make it much more difficult. I don’t think journalists and media communications professionals understand what’s about to hit them in the coming days.
As you leverage AI to get things done faster, you need to be aware of those who will use AI for all the wrong reasons, including trying to control a false narrative in the media. You can use AI to write, create ads and produce videos. The bigger problem is that AI can also be used to doctor videos.
When I say doctor videos, I am not even talking about creating videos from scratch with original content. I am talking about situations where AI studies the way someone talks, the person’s mannerisms and other gestures, and then uses that to create a new video of the person delivering a completely different script.
In other words, if I wanted to create a video and make people believe you were the one reading this article in that video, I would get a couple of your videos and allow AI to study them. I would then feed the AI a script containing things you never said, and the resulting video would show your face and play your voice while you appear to read my script.
What’s the implication? In a world where we mostly rely on video evidence to prove certain things, we may be approaching a time when merely having a video will not mean anything anymore. If you’re a journalist or you’re in the media space generally, it will no longer be enough to have a video to back up whatever it is you’re using the video for. You now have to be sure of the source of the video and prove that the video was actually created by the personality, media organization or institution you’re crediting it to.
If you are not able to do this, then you will have ignorantly joined the bandwagon of those spreading fake news. Yes! You can now be a professional who is ignorantly spreading fake news. How would you feel if you found out that the video you have been pushing around was never recorded by the person you have been crediting, or that the person you are trying so hard to quote never said the things you thought they said?
A few years ago, someone released a video of President Barack Obama with about four or five faces. All the faces were saying exactly the same thing and the mouths were moving at exactly the same pace. The person who posted the video then asked a simple question: which one do you think is the original face of President Obama? It was difficult to identify. Most of the people watching wouldn’t know. Even some geeks – IT professionals – may not quickly spot the difference or what’s wrong with the video. That’s how far we’ve come with the use of AI to create videos. We can create what does not exist, or take someone’s old video and replace what they said with a new script. So it’s my face and voice you’re hearing, but it’s AI that is actually doing the talking.
Just a few days ago, I saw another video of President Biden speaking against abortion. If you understand anything about the Democrats and the Republicans as parties in America, you will know that, generally speaking, Democrats are pro-choice while Republicans are pro-life. That does not mean there are no exceptions; it is just a general tendency. So when you see a video of President Biden speaking against abortion, you definitely know that something is wrong somewhere. On a closer look, it was obvious the video had been doctored. The mouth moved almost, but not quite, in step with the script the AI had been fed. You may not notice this if you don’t look closely or if you’re in a hurry to prove a point.
What does this mean for Journalism and Communications?
1. It means that more than ever before, you cannot believe everything you see online even if it is a video. Video evidence has to be scrutinized.
2. Now that you are aware, or have been reminded, that videos can be doctored, don’t be in a hurry to share videos or refer to them as evidence.
3. Before you share a video, imagine it was your face and voice that had been fed a script you never read.
4. There are mischievous people all over the place who will go this far to prove a point. They often achieve this by spreading their false narrative through unsuspecting people.
5. As a communications professional or journalist, focus more on authenticity than on publicity or on traffic to your web pages that can be converted to cash.
6. If you think you’re just doing your job, you need to rethink, because you may have joined a few others in destroying someone else’s career without real proof.
What should you do when you find a video that you intend to quote as evidence or use for your work?
1. Question the source of the video particularly if you’re going to use it for credibility.
2. If the video does not link back to someone credible, a website that can be quoted, or an organization that can take the credit, hesitate before using it.
3. Watch the video repeatedly before using it.
4. Watch out for moments when the mouth and the voice don’t match. I don’t deny there could be other genuine reasons for this; it can happen during editing, and it has happened to some of the many videos I have recorded for my YouTube channel, but that is not always the case.
5. Watch out for blurry videos. Blurry does not automatically mean fake but people can hide under the guise of blurry and old videos to push a false narrative.
6. Even for clean, new videos, watch them a few more times and try to find out the source.
7. When you’re not sure about the authenticity of any video, try to look for alternative materials to quote. This is why I said that AI can also make your work more difficult. Before now, all you needed was a clean and audible video as proof. Today, that’s no longer enough. You have to be sure the person you’re seeing in the video actually said what you’re hearing.
AI is here to stay. It’s going to make some things easier and it is going to make some things a bit more difficult. This issue scares me because a fake video, if not investigated, can destroy someone’s career or, worse still, send someone to jail. Don’t be the one who uses AI to destroy others. Let’s use it for the good thing our profession is known for: communication.
Fola Daniel Adelesi
FDA Speaks professionally, Trains Effectively, Delivers great keynotes & Comperes humorously | [email protected] | Subscribe to YouTube.com/FolaDanielSpeaks