The Associated Press today released guidance on how it uses generative artificial intelligence tools like ChatGPT and will update its AP Stylebook to reflect a new era for newsrooms.
“Accuracy, fairness and speed are the guiding values for AP’s news report, and we believe the mindful use of artificial intelligence can serve these values and over time improve how we work,” said Amanda Barrett, AP vice president for standards and inclusion, in a news release.
The internal guidelines, which stress the importance of human editing, warn about the myriad pitfalls of generative AI: its tendency to “hallucinate” and produce misinformation, the ease with which bad actors can produce disinformation, and privacy issues concerning what users put into ChatGPT.
“AP has a licensing agreement with OpenAI, the maker of ChatGPT, and while AP staff may experiment with ChatGPT with caution, they do not use it to create publishable content,” according to the standards. “Any output from a generative AI tool should be treated as unvetted source material. AP staff must apply their editorial judgment and AP’s sourcing standards when considering any information for publication.”
For example, an AP reporter may use ChatGPT to generate an article summary, but it must be edited like any other content before publication, Barrett said in a phone interview.
“I want to emphasize that this is a tool we can use, but does not replace the journalistic smarts, experience, expertise and ability to do our jobs in a way that connects with audiences,” she said.
The AP guidance lays out a framework for responsible AI use, including the following suggestions:
- In accordance with our standards, we do not alter any elements of our photos, video or audio. Therefore, we do not allow the use of generative AI to add or subtract any elements.
- We will refrain from transmitting any AI-generated images that are suspected or proven to be false depictions of reality. However, if an AI-generated illustration or work of art is the subject of a news story, it may be used as long as it (is) clearly labeled as such in the caption.
- (J)ournalists should exercise the same caution and skepticism they would normally, including trying to identify the source of the original content, doing a reverse image search to help verify an image’s origin, and checking for reports with similar content from trusted media.
“You still have to be a journalist,” Barrett said.
The AP has been using AI since at least 2014 and has spearheaded efforts to help other newsrooms experiment. But the development of these standards — which started before the AP’s deal with OpenAI — comes as reporters, editors and publishers grapple with how and why to use AI tools.
“It is a really interesting time in journalism,” Barrett said. “To me, it almost reminds me of the rise of the internet.”
ARLNow — a local news site that focuses on Arlington, Virginia — and a group of related sites operated by Local News Now have been using AI to generate social posts and summarize press releases, and are planning an AI-assisted morning newsletter. Scott Brodbeck, CEO of Local News Now, said he’s trying not to get bogged down in thinking too much about new standards or guidance and instead is focused on application.
“I’m less concerned with the ethics behind it than I am with the reader experience — you have to start with reader experience and then go from there,” Brodbeck said. “I think once we get over the newness of this, some of the hand-wringing ethical questions will probably fall away.”
He said, however, that Local News Now’s work with AI would not run afoul of the AP’s new standards.
Other organizations are also working on guidance for newsrooms looking to use generative AI. The Partnership on AI, a nonprofit aimed at addressing the impact of AI across industries, is seeking feedback on a draft of its AI Procurement and Use Guidebook for Newsrooms. At 27 pages, the document delves much deeper into the use of AI in the newsroom.
Developed with input from experts at Gannett, McClatchy and the AP, the partnership’s guidebook includes questions to ask developers before investing in their AI tools, suggests metrics to track the efficacy of newsroom AI use, and puts forward a framework that breaks down AI tools into categories.
“This may seem like a lot of work upfront for what otherwise might be a simple process,” according to the guidebook. “Answering these questions at the outset, however, will save your newsroom a lot of time and energy compared to retroactively figuring out responsible use of a tool after purchasing it, training it, and using it.”
Barrett said the AP Stylebook’s updated guidance for covering the burgeoning AI industry is as important as the new internal standards.
“For example, journalists should beware of far-fetched claims from AI developers describing their tools, avoid quoting company representatives about the power of their technology without providing a check on their assertions, and avoid focusing entirely on far-off futures over current-day concerns about the tools,” the new guidance reads.
“One of the challenges that is going to be present for us and everybody else is the landscape is going to be changing very quickly,” Barrett said.
The AP doesn’t yet have metrics in place to measure how well its generative AI-assisted reporting performs. But a group of staffers will meet every month to assess the latest changes in AI technology and how it might shape standards.
“I think it’s ultimately a positive thing that people are thinking about policies and ethics, but I think what should come first is to look at how this benefits readers and our newsroom,” Brodbeck said. “Is AI actually making things better? If it is, then you do it, if it isn’t, then don’t.”
Clarification, Aug. 16, 2023: The new AP Stylebook guidelines will be available Aug. 17.