Is it real? The challenge for journalists as content manipulation and fake material improve
One of the first senior editors I worked for had an eye for detail. He was a wiry, direct man with a gruff northern accent. He worked alongside me in network radio, and it was his job to check all the scripts before they were broadcast. He wanted them short, sharp and accurate. But he'd usually leave checking them until the last minute, which was just enough time. Except if anything was wrong. When that happened, he'd jump up, fix a piercing look on the producer who'd written the script and bark at them: “Hey - is that true?”
He was doing what all of us in news do: fact-checking, preserving trust and integrity, asking the fundamental question at the heart of every story.
Today, working in news comes with a second question too.
Is it real?
We live in a world where bad actors have access to incredibly good software. Where millions of messages are generated every minute, and information sits on the same scroll bar as misinformation and disinformation. Where the line between a touch-up and manipulation is hard to call.
Even a Princess is prepared to do it. And get it past her comms team. Who passed it to the photo agencies, who shared it with the press and the public for hours before issuing kill notifications.
The Princess apologised of course… she hadn’t meant to mislead. But others do.
And other content can be a complete creation. And it can be believed. Last year, a synthetic image of an explosion outside the Pentagon briefly moved markets.
Right now, these stories are thankfully pretty rare in mainstream newsrooms. But there are plenty of bots and bad actors who are prepared to push the limits.
The fakes are getting better, the technology to make them is improving, and it's getting easier and easier to use.
The amount being produced is probably impossible to measure – one qualified estimate puts it at 34 million items a day.
A recent test by the University of Sydney found that people were able to identify only around a third of the deepfakes put in front of them.
For many, AI is something to play with. It’s fun. It’s creative. It’s inspiring. It’s certainly full of opportunities for all of us.
But as we build our abilities to change and create, we also need ways to identify what has happened. In news, we need to be able to separate fact from fiction.
Six months ago, I joined an accelerator group convened ahead of this year's IBC that was put together to see what could be done to combat the fakes. How can we go about protecting ourselves and our audiences, without breaking our budgets? How can we work with the same pace and precision? If technology can fool us all, can it protect us too?
Of course, verifying content, sources and scraps of information is nothing new in journalism. But in the past decade, digital verification has become an increasingly important part of our jobs. Some news organisations now have whole departments or units specifically focused on it. But with so much potentially fake content being generated every day, we wanted to see whether there was value in newsrooms sharing details of fake content that they had already debunked.
So across a month from mid-July to mid-August, seven news organisations set up a Slack channel where we could share information about suspect content that we'd either debunked or wanted help verifying.
Participation was completely voluntary, so no groups had to share anything that they weren't comfortable with their competitors seeing.
We created a simple form that included a media link for the suspect content and notes on what the newsroom had discovered.
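As a rough illustration, a record in a form like that could be modelled as a simple structured type. This is only a sketch: the field names and status values here are hypothetical, not the actual form the group used.

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class SuspectContentReport:
    """One shared record about a piece of suspect media.

    All field names are illustrative assumptions, not the
    real schema used in the pilot.
    """
    media_url: str     # link to the suspect content
    status: str        # e.g. "debunked" or "needs_verification"
    notes: str         # what the newsroom discovered
    submitted_by: str  # contributing news organisation
    reported_on: date  # date the report was shared

# Example record a newsroom might post to the shared channel
report = SuspectContentReport(
    media_url="https://example.com/suspect-clip",
    status="debunked",
    notes="Audio track shows signs of synthetic generation.",
    submitted_by="Example News",
    reported_on=date(2024, 7, 20),
)

print(asdict(report)["status"])  # -> debunked
```

Keeping the record this small matters for a voluntary scheme: a media link plus free-text notes is quick to file, so busy verification teams can contribute without extra process.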
Among the clips to come in was a video of a man claiming to be the gunman who tried to kill Donald Trump… suggesting that he was still alive and hadn’t carried out the shooting.
US politics dominated headlines during the trial period – and prompted disinformation. One candidate got shot, a second resigned and a third was put forward. It was quite a month.
Meanwhile, in the UK – as a direct result of disinformation – an attack on a group of children in Southport morphed into violence and rioting in different parts of the country.
You might have thought that we would have been awash with fakery on our shared Slack account.
And yet… we weren’t. And what’s more, most of us weren’t surprised.
We started discussing why that might be… especially as we continued to see a big interest in our work, with very little drop-off among the people at our meetings. Although it wasn’t always senior leaders on the call, newsrooms continued to send people, because this mattered to them.
And here’s what we concluded:
Fake content and disinformation spreads on social media where it can gain huge traction without being debunked or taken down.
News organisations ignore most fake content. There is too much to debunk it all. And bluntly speaking, a lot of it on social media isn’t very good yet. Just look at the comments below a fake post and you’ll quickly see that others aren’t being duped either. That won’t last.
Journalists operate with caution. Fake and genuine content alike is checked, which takes time and allows disinformation to spread through unverified channels. Even when they do debunk the content, their primary publishing point probably isn’t the platform where it started.
That means mis- and dis-information can spread without authoritative challenge, causing confusion and affecting narratives.
That is a challenge for all of us.
There is a structural problem here. We are setting content free without any way of proving its provenance. We are building up our abilities to manipulate and distort without giving ourselves a way out.
We are asking for forgiveness not permission.
And we are giving different audiences very different outcomes. If you don’t use social media for news, you get a very different view.
As broadcasters, we could look at this ecosystem and take the view that we need to protect ourselves… that protecting our output will maintain our audiences’ trust and our brand identities.
But as journalists – particularly those of us with public service beliefs – we shouldn’t ignore the importance of solving this for everyone.
Those conversations led us to a call to arms.
It is for all of us, whether in broadcasting or big tech or social media to collaborate on this. We all need to work together closely to make sure everyone has access to information they can trust. We all need to develop a deeper understanding of the products and systems available to fight disinformation and fake content.
Trust wins hearts and minds. It is built through openness and transparency. As a group, we should aspire to cut through the disinformation together, without curtailing creativity and invention.
All of us – all our audiences – all our societies - will benefit if that’s what happens.