March 13, 2024

In 2001, journalism was facing a “crisis of conscience, confidence and purpose.” That’s how Tom Rosenstiel and Bill Kovach described the media environment in “The Elements of Journalism,” a book that came out of a series of meetings around the country hosted by the Project for Excellence in Journalism, a project I had the opportunity to help launch with Tom and Bill.

Two decades later, we are facing another, and I would argue far more threatening, crisis in journalism. This one requires that we, as a global society, rearticulate these principles for a digital, AI-incorporated news environment. Technology is now fully embedded in how journalism is reported, delivered and consumed; most news is mixed in with an endless array of other content, whether social updates, celebrity hot takes or upcoming video releases; and generative artificial intelligence is now propelling technology even deeper into the media industry (even as AI more broadly has long been a tool within newsrooms).

Yes, figuring out a business model for the future is essential, including opportunities for digital revenue. But for any of that to be successful, journalists must ensure that developments in AI and AI policy strengthen journalism, both in newsgathering and in serving that news to the public in a way that distinguishes it from other content.

Policy discussions related to news are multiplying at the very moment it is getting harder to draw solid borders around what “news” is, and as we witness some of the lowest levels of trust, brand awareness and general appreciation for news providers. Add to that the fact that press freedoms face growing threats around the world, even in democracies.

Just as editor and publisher Horace Greeley took the initiative to separate news from opinion, news leaders need to take charge today in thinking through how to apply the principles of verification, authenticity and transparency to distinguish uses of AI that aid news reporting and help inform the public from those that work against it.

We as a global society have not updated these principles for the digital era. We’ve done some work in digital news literacy, made attempts at content moderation, and built new kinds of connections with audiences, but news deserts continue to grow and misinformation continues to spread. We really need to consider the role of journalism today as we discuss laws that will govern its future.

How does verification work today? What kinds and levels of authenticity are most valuable for the public? And what new transparencies do journalists need from technology companies to do their reporting responsibly and in a way that serves the public?

Journalists (and I would argue local journalists in particular) should be leaders in determining these answers — across the full process of technology inputs, queries and outputs. This means:

  1. Developing ways to take advantage of these tools in reporting on and producing coverage of important issues and events, and to explain clearly how those tools were used in the published work
  2. Working with technologists to ensure that AI models, as well as other kinds of algorithmic work, have structures in place to enable these principles
  3. Teaching the public how to apply these principles themselves to differentiate the content they come across, which also builds trust and value
  4. Leading discussions on the ways policy structures can help secure these principles, or risk threatening them

Much attention is being paid to one area of transparency: provenance (the trail of a piece of content over time) and origin, with labels such as watermarks aimed at helping to identify ill-intended changes to content. These kinds of tools are important but not enough. So much of what determines value, and will do so even more as these technologies evolve, is how AI was used in a piece of work. Was it used to serve the public? To get a story out more quickly, or to a more diverse audience? Studies find that current public-facing labeling approaches fall short: general labels about false information often lead people to discount factual news, and an “implied truth” effect has emerged for false information that goes untagged. These findings suggest that techniques for disclosing provenance and origin, concepts the public is still not fully aware of, need to be comprehensive to be effective.

This kind of further explanation should be embedded in other uses of AI as well, especially those that serve the public. Schibsted, a Norwegian technology and media company, for example, is exploring ways to build a Norwegian large language model that can be used by other publishers in the country. And Aimee Rinehart, as part of her Tow-Knight Fellowship on AI Studies, is building an interface for the news industry to access the latest large language models.

Tech companies need to play an important role here, too. They must build trust with journalists by respecting the distinct work journalists do; create transparency in their models and other tools, so that reporters know they can trust the output and can pass that transparency along to news consumers; and ensure the tools are widely accessible.

Next comes the important question of which government regulations would best enable the benefits while guarding against harms. Getting this right requires journalists, technologists, researchers and policy experts to work together to develop solutions that safeguard an independent press, protect civil rights, involve regular review of data and trends, and continuously advance alongside the technology, all while staying true to journalistic principles. This work of defining journalism today, across the full breadth of issues facing our global, digital news environment, is the mission of CNTI.

Even as we work to put up necessary guardrails against potential harms, we need to have forward-focused discussions that build a future where journalism — including journalism that takes advantage of technology — is distinguishable, safeguarded and valued by the public.

Amy Mitchell is the founding executive director of the Center for News, Technology & Innovation, an independent global policy research center that seeks to encourage…