With journalists at The Washington Post, BuzzFeed and The Associated Press using bots to automate reporting, there remains some confusion about who gets credit for a story produced by a bot.
In a recent paper, “I, Robot. You, Journalist. Who is the Author?” researchers Tal Montal and Zvi Reich analyzed 174 automated stories from 12 websites and interviewed seven journalists working for those organizations.
After reviewing the content of the pieces, the bylines, the code patterns, newsrooms’ full-disclosure policies and the legal framework around using “robot journalists,” the study concluded that there are discrepancies in how editors attribute automated stories.
Most interviewees suggested that the story should be credited to the programmers of the bots in order to maintain transparency. But the analysis of the articles suggested otherwise: some pieces had no byline, some were credited to a human author and others were attributed to the newsroom in general. Some organizations, such as The Associated Press and Forbes, were able to achieve “full transparency,” according to the researchers.
So, what’s the right way to credit a bot? Below is a question-and-answer session with Kelly McBride, vice president and media ethicist at The Poynter Institute, on the best ways to be transparent while publishing automated journalism.
As mentioned in the study, there’s some confusion about who gets credit for a piece produced by a bot. What is your take on this?
It doesn’t make sense to give the programmer a byline. We put bylines on stories as an accountability measure, so if a reader has a question, she knows where to start. A reader with questions isn’t going to want to talk to the coder; she’s going to want to talk to someone with editorial authority. Also, the computer code will be tweaked and altered to accommodate changing circumstances, so the individual coder won’t stay the same. Instead, the article should simply have a byline or tagline that indicates how it was created and whom to contact with questions.
How does a publication maintain “full transparency” while publishing a story written by a bot?
Just tell the audience when bots are involved and where they can turn for more information.
The authors of the research called for disclosing the algorithm’s origin and the source of the data used in the story, and for including a notice that the story was algorithmically generated. What would your preferred byline for a bot-produced story look like?
Maybe something like: “Written by a computer program, questions: sectioneditor@newsroom.com” or “This story was automatically generated by an algorithm. Please direct questions to Jane Editor.”
How do bot stories enhance reporting, in general?
They’re not meant to compensate for the nuances of a human reporter. Instead, bots mostly create stories that would otherwise be ignored, like sports stories for small colleges or Little League games. Bots aren’t usually used for the most significant stories. They work best on mundane or routine stories that don’t require much creativity.