Newsrooms Should Engage Responsibly with Artificial Intelligence, Say Journalists
Internal AI policies and more informed coverage could help news outlets adapt.
Jake Neenan
WASHINGTON, August 28, 2023 – Newsrooms should take an active role in crafting artificial intelligence practices and policies, experts said on August 17 at a webinar hosted by the Knight Center for Journalism in the Americas.
Waiting too long to institute policies on the use of AI in newsgathering, and on the use of newsroom data and content for AI research, could allow tech companies to dictate those terms themselves, said Aimee Rinehart, a senior program manager for local news and AI at the Associated Press.
“Big tech came in and told us how the internet was going to work, and we have abided by the rules they’ve set up,” she said. “If we don’t get in there and experiment, they’re going to write the rules.”
Seven tech companies met with the White House in July to work out terms of a voluntary commitment to public safety measures in their AI research and products.
Increased AI literacy will improve future coverage of the technology, according to Rinehart. Coverage has so far been largely sensational, she said, because of the news industry's discomfort with the potential automation of some of its work.
Sil Hamilton, an artificial intelligence researcher at McGill University, said that scenario remains far from what the technology is actually capable of.
The current trajectory of large language models – the systems behind chatbots like ChatGPT – “is to simply be coworking with us,” he said. “It won’t entirely automate jobs away.”
Rinehart emphasized the importance of staying informed about the technology and how it might affect the news industry from both inside and outside the newsroom.
“This is pushing us in a direction that some of us don’t like,” she said. “But if we don’t experiment together we’re going to end up on the other side of something that is unrecognizable.”