AI is Definitely Coming to a Newsroom Near You—But in What Form?

First published on the Substack blog “The Last Editor” on February 18, 2026

My semi-retirement duties as a professor emeritus at the Missouri School of Journalism include flying from my California home once or twice a semester back to Columbia to play host to recruiters from the big local television station ownership groups as they come to campus to interview our students. That means I must coordinate appointments for fifty or more students, matching their multiple available times against the two or three recruiters’ openings for interviews. I generate some spreadsheets to lay out all the options, and then I painstakingly try to match everyone so that every student gets a chance to interview and every recruiter gets to see everyone they want to see. I have a system I have honed to perfection over the past few years, but it still takes many hours to set up each visit.
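(For readers who want a sense of what that matching actually involves, here is a rough sketch in Python of the bookkeeping. The students, recruiters and time slots below are invented purely for illustration, and the simple first-fit approach is only a stand-in for the far messier spreadsheet juggling the real task requires.)

```python
# A hypothetical, simplified version of the scheduling chore described above.
# Names and time slots are made up; the real data lives in spreadsheets and
# carries many more constraints than this toy example does.

from itertools import product

# Slots each student says they can make (zero-padded so they sort in order).
students = {
    "Student A": {"09:00", "09:30", "10:00"},
    "Student B": {"09:30", "10:30"},
    "Student C": {"09:00", "10:30", "11:00"},
}

# Slots each recruiter has open for interviews.
recruiters = {
    "Recruiter 1": {"09:00", "09:30", "10:30", "11:00"},
    "Recruiter 2": {"09:00", "10:00", "10:30"},
}


def build_schedule(students, recruiters):
    """Greedy first-fit pass: book each student/recruiter pairing into the
    earliest slot still free for both, and report any pairing that could not
    be placed so nobody silently falls off the schedule."""
    student_busy = {s: set() for s in students}
    recruiter_busy = {r: set() for r in recruiters}
    booked, unplaced = [], []
    for student, recruiter in product(students, recruiters):
        free = (
            (students[student] & recruiters[recruiter])
            - student_busy[student]
            - recruiter_busy[recruiter]
        )
        if free:
            slot = min(free)
            student_busy[student].add(slot)
            recruiter_busy[recruiter].add(slot)
            booked.append((slot, student, recruiter))
        else:
            unplaced.append((student, recruiter))
    return sorted(booked), unplaced


if __name__ == "__main__":
    booked, unplaced = build_schedule(students, recruiters)
    for slot, student, recruiter in booked:
        print(f"{slot}  {student} interviews with {recruiter}")
    for student, recruiter in unplaced:
        print(f"NO SLOT FOUND: {student} with {recruiter}")
```

Even a toy version like this makes the shape of the problem clear: every pairing has to land in a slot that is still open for both sides, and anything that cannot be placed needs to be flagged rather than quietly dropped.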

A couple of years ago, an administrative assistant suggested I try using AI to build the schedule. I’ll admit—then and now—that I’m not much of an AI user. Don’t get me wrong—I have no aversion to new technology. I just really don’t have a reason to have AI do what I already know I can do myself. Still, this scheduling task is so burdensome that I figured, why not give it a try. After several aborted attempts at writing a prompt telling the AI how to deal with all the data, I thought I finally had it just right. I plugged in all the parameters and, in just a few seconds, got a neat and beautiful schedule all ready to go.

The journalist in me wanted to double-check everything, so I started comparing the raw data I fed in against what the AI had given me. It didn’t take long to see that large numbers of students were left off the schedule, while others were given repeated appearances with the same recruiters. Part of me understood this was due to some inadequacy in my prompt. But I could also tell that the programmers of the AI built it in such a way that it didn’t really matter how many errors it made, just as long as it turned out something that looked like what I wanted. I quickly went back to my tried-and-true method and did the schedule by hand.

Flash forward to this year, as I prepared for my first campus visit of 2026 and figured maybe it was time to try again.

After all, AI improves all the time, and maybe now it would work better without all the errors. I came up with what I felt was a good prompt, loaded in the data and awaited the results. Something felt different about the way the AI was responding as it worked, telling me what it was doing at each step and promising results in a certain number of seconds. When the results finally came, they were worse than what I had gotten two years ago. Not only were people left out of the mix, but the AI just sort of gave up before finishing the entire schedule. My perception was that it put even less effort into my task than it had before, and cared less about doing what I wanted. I know I’m personifying the AI by saying it “cared” about something, but these results had all the trappings of an assignment turned in by a student who just doesn’t care about the work or what grade he gets.

Simmering in this disappointment, I then stumbled upon what the editor of Cleveland.com (the website of the Cleveland Plain Dealer) thinks is a good use of AI. The headline caught my eye: “Journalism schools are teaching fear of the future,” it read. I beg your pardon, I thought, deciding to read further.

But rather than a piece generally lamenting schools being old-fashioned and not teaching up-to-date skills and practices, it got very specific. The editor, Chris Quinn, is a guy about my age who went to Temple University, a fine journalism school in its own right (if that was his program there). In his editorial, Quinn recounts his frustration with a candidate for a reporting job who dropped out of contention after finding out how the newsroom uses AI. Quinn belittles the person and the journalism professors who had taught this candidate that, to quote him, “AI is bad.” First of all, no professor stands in front of a class and says, “AI is bad,” without also going into great detail about any number of reasons why AI should not be an unfettered tool we use to create our stories. So I’m already rolling my eyes a bit at this oversimplification of what is a very important topic schools are addressing in their classes every semester.

I read on, fearing the worst.

Quinn goes on to explain exactly how his newsroom is using AI. He starts by saying his team is using AI to identify stories in some outlying counties in the newsroom’s coverage area. I’m fine with that. It’s hard to put people on the ground in distant cities, and besides, lots of newsrooms are successfully using AI to sift through the mountains of data we encounter every day, highlighting items that might be important and worth human attention. OK then, so far, so good. I’m even pleased to read that Quinn is considering using this tool to expand into more distant counties.

I continue to read, thinking perhaps Quinn is onto something good here—though I’m still not sure why he’s picking on journalism schools in his headline. Why would the candidate drop out of contention for a job if AI is just helping find stories? Then I read the next paragraph about how his reporters work:

Because we want reporters gathering information, these jobs are 100 percent reporting. We have an AI rewrite specialist who turns their material into drafts.

Wait, what? AI (and don’t call it a “specialist” to make it sound like a person) is writing the reporters’ stories for them? Quinn qualifies it by saying humans supervise the final drafts, fact-checking and editing them, but the damage is done. Now I can see why the candidate dropped out.

Reporting is not just the act of gathering facts. If it were, we’d just publish the pages right out of reporters’ notebooks and call it a day. Writing is an integral part of the reporting process. Not only is writing necessary to put all the facts we gather into a form audiences can easily digest, but the concept of what form the story will take starts even before we leave the newsroom to report. We build a structure for our writing with the audience in mind, planning lead sentences that will engage the audience to stay for more, including characters the audience will care about, and weaving it all together in a way that protects and preserves the English language for people who hardly ever consume carefully written or spoken prose anywhere else.

I understand the financial burdens facing journalism these days, but reading this was nothing less than chilling.

Quinn doubles down on his efficiency argument for using AI to write his newsroom’s stories, saying “By removing writing from reporters’ workloads, we’ve effectively freed up an extra workday for them each week.” An extra workday to do what, exactly? Quinn does have the right answer here, saying his journalists are using that time to get out on the street, having coffee with sources and doing interviews. At least the AI isn’t doing the interviews—yet. I can’t argue against the notion that reporters having more time on the street is a good thing. But I’m still uneasy about those same reporters not being able to put their own words down on paper to speak directly to the audience about what they’ve seen and heard.

This experiment in Cleveland seems to want things both ways. Quinn argues that the candidate who dropped out of contention for the job was foolish to do so because the job market in journalism is so bad. He cites statistics about layoffs in the newspaper industry and how little chance someone coming right out of school (as this candidate is) would have of getting a job these days. Yet he seems not to recognize that many of the job losses he cites are the result of media owners looking to have human workers do less and automation—including AI—do more.

The real shot at journalism education was still to come.

Quinn says the candidate who rejected his AI experiment isn’t to blame; that person’s journalism professors are the real culprits. Quinn calls journalism programs decades behind, leaving graduating students with unrealistic expectations that they will all be “long-form magazine storytellers, chasing a romanticized version of journalism that largely never existed.” I know we do just the opposite at Mizzou, letting our students know how tough it is to find a job and giving them realistic expectations about the very basic journalism jobs they will be able to get coming out of school. Our students know those dream jobs—magazine storyteller or whatever—will come much later in their careers, if they come at all. Every professor I know at any other journalism school teaches the same thing.

I’m not planning to use AI anytime soon to build my schedule for students to meet recruiters at Mizzou. But I don’t reject using AI for the sort of mindless work I was trying to get it to do for me. It can help newsrooms figure out what their reporters will cover, then step back and let the human reporters do the writing. I still fear that AI’s sloppiness and lack of concern for accuracy or detail are dealbreakers for any task deemed truly important. It will get better, of course, and may actually reach some level of acceptable accuracy at some point. Until then, we can trust it to lift some of the more tedious work from our shoulders, but never to take over what a reporter with a good head on those shoulders can accomplish in an honest day’s work.
