I love seeing how technology advances and how we, as a human race, incorporate the advancements into our work.
I don’t like seeing how newspapers have been incorporating artificial intelligence into their newsrooms, though.
I’ve seen more and more publications using AI in place of talented writers, usually for articles with the main purpose of drawing in clicks and views. The A.V. Club has been publishing articles using “The A.V. Club Bot,” which uses information from IMDb combined with AI to create listicles — not a surprising move from the publication either, which has been going downhill for years. More local to Ohio, the Columbus Dispatch used a program called LedeAI to write high school sports recaps in August.
Seeing these AI-generated articles depresses me. I’d much rather see a seasoned journalist turn these stories into something much more interesting. Still, it’s worth a conversation figuring out if AI has a place in the newsroom.
Does AI even think it’s qualified?
If AI is starting to take the same jobs as journalists, it’s only fair that it gets questioned in the same way a candidate for a job would be.
I asked ChatGPT-3.5 what its qualifications are to be a journalist. Here’s what it had to say.
My first question was “What role does AI have in journalism and newsrooms?” ChatGPT gave me a long list of duties, and I agreed with some of them, like “predictive analytics,” where AI can be used to predict trending topics. The AI was wrong, however, in saying it could serve as a fact-checker; these programs are not reliable for fact-checking.
When I asked for its journalism background, ChatGPT responded, “I don't have a personal background or experiences, as I am a computer program created by OpenAI. However, I have been trained on a diverse dataset that includes information on various topics, including journalism.”
At first, this lack of personal experience seems fitting for a journalist, allowing the writer to stay neutral. Yet, in my experience, journalism demands a personal background because the field largely covers the human experience. Plus, AI draws from what humans have already written, so it incorporates the experiences of many, sometimes along with their biases.
I asked the chatbot for its weaknesses in reporting, and it gave me a list of 10, which included a lack of current information, no personal experience or emotional understanding, a lack of creativity, an inability to apply ethical judgment and an inability to verify information.
Where the AI’s answers did stand out was when I asked it for the most important tenets of journalism. It listed answers like accuracy, impartiality, independence, fairness and accountability, all of which I applauded.
What are some of the problems with using AI to do journalism?
It’s not difficult to see some of the problems with using AI to write articles. The most immediate problem is that AI simply can’t be present in places the same way a journalist can. Part of the commitment to accuracy is firsthand knowledge.
During the summer, I heard leaders from Gannett — which owns The Columbus Dispatch — discuss other problems with AI-generated stories. Because AI learns from already created work, it can also sometimes incorporate that work into its writing without saying so. This one’s a major concern as it can lead to lawsuits over plagiarism.
AI writing just isn’t very interesting either. Like ChatGPT said, it has no creativity and creates bland articles that show no effort in reporting.
“Content farms are nothing new; media outlets were publishing trash long before the arrival of ChatGPT,” Samantha Floreani wrote in The Guardian. “What has changed is the speed, scale and spread of this chaff.”
Even though AI can add more articles to a publication in hopes of increasing engagement, is it really worth it if it will make the site look bad?
We must protect the journalists who spend their lives doing great reporting. Don’t let their bylines disappear and their reporting be replaced by tools like Google’s AI article generator. Don’t let these programs slowly take our jobs; we want to write, and we can do a better job.
What is AI’s place in the newsroom?
Whether you like it or not, AI probably isn’t going anywhere. It’s been creeping its way into our lives for a while with technology like Siri and Alexa.
Our capitalist society isn’t going to let people just ignore AI, but if journalists can learn to use it properly and effectively, it can make their lives easier without threatening their jobs. When talking about AI this semester, one of my professors used the common saying, “Keep your friends close and your enemies closer,” which certainly applies here.
Because journalists must cover such a wide range of topics, they need to handle many tasks they can’t possibly be trained on in advance.
AI can be the perfect helper and trainer, handling the tasks a journalist might not know how to do. I’ve used it, for example, to figure out how to format spreadsheets when I need to compile lots of data. It’s a great time-saver, freeing up resources for actual reporting.
AI shouldn’t go much further in the newsroom than being an assistant, and that’s why, so long as I’m here, you’ll never see a genuine news article written by AI at The Miami Student.
We value our staff and their writing and reporting; publishing AI-generated articles would undermine their commitment to journalism and destroy the efforts built over nearly two centuries of experience here at The Student.
Luke Macy is the managing editor for The Miami Student. He is studying journalism, film studies and American studies at Miami University. His work can be found in multiple Ohio publications, and he’s received various awards, including best investigative reporting in Ohio.