Peter Harrison

AI in Journalism

I've been running an experiment. I wanted to see if AI could generate opinion articles that, while written by AI, capture my personality and perspectives. My AI Daily News site was initially just a way for me to aggregate news stories about AI into something I could digest in the morning before starting work.

Later I thought I would provide it with a range of my prior writing and have it prepare an 'Opinion' piece with my name on the byline. Would it produce something plausibly written by me, presenting my views, but on the news of the day?

Sadly, I think there has been a fundamental change from the early days of OpenAI models, when the results were creative, unpredictable and entertaining. Now they have been trained in such a way as to produce the same bland writing style regardless of the instructions you provide.

I got into the habit of waking up each morning, reading the 'opinion' and coaching Claude to rework it. Why? Because it would write opinions that conflicted with my own documented views in fundamental ways. It would include the terms I have used without having internalized the concepts. So each day I would need to correct it.

Multiple news organisations have banned the use of AI in journalism, and now I have experience of why. It isn't just the opinion pieces, however; the stories it writes also have opinions injected beyond the facts. At least with the stories I have always linked to the original source material at the bottom, which for most stories means at least two sources.

I am not doing journalism by any measure. Journalism means doing the research, doing the interviews, cross-referencing, and creating a cohesive angle for the article. Journalism isn't unbiased, in that it is influenced by the point of view of the writer, but journalistic integrity still means something.

Does this mean, therefore, that AI can't play a part in journalism?

My experience with AI in software has parallels. AI will happily generate code which passes 'tests' by altering the tests, in effect changing the conditions of success to please the user. AI does work in software development, but only when you have a framework which prevents this kind of gaming.

People have lost trust in journalism, partly as a result of AI slop: the kind of text which doesn't differentiate between fact and fantasy. There is also the angst of journalists who, fearing for their jobs, resist AI and minimize its utility. There is a temptation to cite ethics as a reason not to use AI when the real motivation is fear of being replaced.

The answer, I think, will be to apply the same disciplines to AI that apply to human journalists: checking facts, resisting the temptation to opine, and at the same time creating compelling, entertaining and informative articles.

In my software development, AI has become a partner, but not a replacement. It still needs me to apply that discipline to get good results. Just like software, journalism could benefit from AI, but only with stringent disciplines around how it functions.

AI journalism needs to be more than just a way of ripping off the work of actual journalists; it needs to engage with the real world and be held to the same standards of accuracy. How AI will impact jobs is a larger issue, but it should not be confused with the utility of AI.

Top comments (2)

Ben Sinclair

There is no "AI journalism". There's summarising, paraphrasing, and plagiarising of other people's journalism. That's it. There's no new information being presented, there's no consciousness to have an opinion. It's a nothing sandwich. It doesn't add anything to the world. All it does is take up resources.

It's not even about ethics, it's just about pointlessness.

Peter Harrison

Lots to unpack here. Journalism is primarily using information from others, usually in the form of quotes. It is not about creating new information but finding an angle and presenting a coherent story from a foundation of facts provided by others. It is true that current AI approaches are repackaging the work of others, but that is more about how AI has been utilized. It is able to pull together multiple sources into a coherent story with unique points of view.

One of the issues with journalism has been a reduction in quality, where you could get something printed just with a press release. Journalists were under time pressure long before AI, being forced to aim for clicks, and frankly they stopped bothering about whether what they reported was true. In some respects I think we have seen a push back against partisan post-truth reporting, but AI has made it more difficult by enabling bots to interact online and generating 'content' which is carelessly produced slop.

The point I'm trying to make is that we could use AI not to generate this slop, but to actually build systems which care about accuracy and factuality, and care about the quality of their outputs. I grant that the vast majority of AI slop content right now is pointless; my point is only that this is an abuse of the technology.