AI Research Shows Automated Journalism More Trusted When Paired With Human Touch
A story written by a journalist with assistance from artificial intelligence (the man/machine model) is perceived as the least biased and most credible kind of news story, and organizations that disclose the use of automation in a byline at the top of the story give readers much-needed transparency.
Those were the findings of two studies conducted by Frank Waddell, assistant professor in the University of Florida College of Journalism and Communications Department of Journalism and a Consortium on Trust in Media and Technology Trust Scholar.
As AI-driven, automated journalism grows in popularity, more and more news organizations are using it for a variety of stories, prompting more research on how audiences feel about it.
“Some scholars have found people are totally unaware so they don’t even notice it,” he said. “Other research has found people are a little suspicious of it, kind of like people often are suspicious of automation and how it’s used in a new area, just like robots driving cars. They don’t necessarily like the idea of robots writing news stories either. Others found people are open to it.”
Waddell designed his research to find out why people’s responses varied so much and how media companies could improve acceptance of, and positive responses to, automated journalism. He found that automated journalism shouldn’t be framed as an either/or question of human versus machine.
“It’s kind of a false dichotomy,” Waddell said. “Algorithms have to come from somewhere. They are written by a person and are still being used by journalists. Some people say it’s better to describe algorithmic news as a man/machine marriage, where the two are working together rather than being separated.”
In his study, Waddell asked participants to read one of three news articles, varying only the byline: one credited to a traditional journalist, one to automation, or one to the two working together. He found that calling readers’ attention to collaboration between humans and algorithms provides the best of both worlds. “There is a decreased bias that we associate with algorithms while still getting that human touch that we expect from human reporters,” Waddell said.
By comparison, credibility is reduced when automation is used in isolation.
To be clear, algorithms are not writing news stories on their own — they are more like a Roomba, not as sophisticated as well-known robots like the Terminator and WALL-E. Instead, journalists are using them mostly to automate data collection. For instance, sports reporters could use algorithms to collect and sort player and team statistics to leave them more time to write their stories. Automation also is producing some weather and financial reports, Waddell said.
“It’s taking a lot of drudgery away from journalists,” he said. “Before, they would sit down with a spreadsheet and put all the details in a story by themselves. This is very conducive for automation. There is a template they can work from.”
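To make the spreadsheet-to-template workflow concrete, here is a minimal Python sketch of the kind of automation described above. This is not Waddell’s system or any newsroom’s actual pipeline; the player names, statistics, and sentence template are all invented for illustration. It shows an algorithm sorting statistics and filling a fixed template, producing a draft a reporter would then revise and expand.

```python
# A minimal sketch of template-based sports automation, using
# hypothetical data; real newsroom tools pull statistics from
# league feeds or spreadsheets before a journalist edits the draft.

# Hypothetical box-score rows a reporter might export from a spreadsheet.
box_score = [
    {"player": "A. Rivera", "team": "Gators", "points": 27, "rebounds": 9},
    {"player": "J. Chen", "team": "Gators", "points": 18, "rebounds": 4},
    {"player": "M. Okafor", "team": "Wildcats", "points": 22, "rebounds": 11},
]

# Sort the rows to find the top scorer -- the kind of drudgery
# Waddell describes algorithms taking off journalists' hands.
top_scorer = max(box_score, key=lambda row: row["points"])

# Fill a fixed sentence template; a journalist would build a full
# story around this machine-generated draft rather than publish it as-is.
TEMPLATE = "{player} led the {team} with {points} points and {rebounds} rebounds."
print(TEMPLATE.format(**top_scorer))
# -> A. Rivera led the Gators with 27 points and 9 rebounds.
```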
Another study he conducted shows readers appreciate finding out at the beginning of a story — not at the end — that they are reading something partially or fully produced using automation. Placing it at the end “violated expectations,” Waddell said. “They are not expecting to be reading something produced by an algorithm, and when you only find out at the end, it’s a little more disruptive than when you processed it from the beginning with that in mind.”
News organizations will benefit from the research by being transparent, telling readers how stories were produced using traditional and automated means, he said. “It has positive consequences on how news is evaluated, particularly concerning the perceived credibility of an article.”