Journalism increasingly uses AI to gather news, raising the question: does mingling algorithms with a human touch lead to trustworthy news? Here’s my exploration.
Trustworthy news is the foundation of journalism. During the class I am teaching at the University of Groningen, we’ve discussed how the public’s trust in the media keeps declining in our digital age. While global trust in journalists hovers around 59%, in the US trust has fallen from 68% (1972) to 41% today.
This means journalists have their work cut out for them. Without trust, you don’t have readers.
This is where AI, with its phenomenal computing power, enters. Reuters started using AI-powered news production in 2017, and several big-name publishers, such as Forbes, The Guardian, AP, and the Washington Post, quickly followed suit. In the newsroom, algorithms are used to scan data for insights, generate templates, and create content for stories dominated by structured data, like financial news or sports. Journalists eventually flesh these rough drafts out (pun intended), but it takes much less time than before.
It goes without saying there are pros and cons, so let’s start with the positives.
As we all know, AI is like taking millions of minds and harnessing their input together. It doesn’t need a triple espresso while scanning mountains of repetitive information, and it can analyse volumes of info in seconds, forever. Because bots can generate more stories than humans (and they do so accurately), this gives reporters more time to focus on creativity and critical thinking—the stuff of award-winning journalism.
As for its cons, AI-generated content is often seen as objective—after all, machines don’t have opinions, right? But imperfect, fallible humans create algorithms, and knowingly or not, those algorithms can reflect our biases. Research has shown that AI systems from IBM and Microsoft exhibited significant gender and racial bias, and Microsoft’s chatbot Tay learned to parrot racist conversation in less than a day.
AI is only as trustworthy as its creators and the input it’s been given. If we actively keep our eyes on the ball, we should be able to keep it trustworthy.
For now, AI has been built to recognise patterns and produce (predictable) content, but it can’t replace creativity—having a hunch and following it through, or interviewing people to get a uniquely human take on a situation. AI is not creative—not yet. As Meredith Broussard, a data journalism professor at the Arthur L. Carter Journalism Institute, puts it, “[Creating] news is not the same as producing a car on an assembly line.”
I think we can still keep our jobs for now, though time will tell. It’s a very human tendency to fear the new, and journalism has continually faced technological advances that it initially feared: photography, radio, TV and the internet. But newsrooms eventually incorporated them into news reporting, and this has ultimately expanded our ability to tell stories.