Last winter, the chattering classes were all chattering about this: the online language model that can churn out coherent essays in a matter of seconds. In November, ChatGPT was made available for free on the OpenAI website, so anyone could try it out for themselves. I did so, and wrote about the experience for WORLD magazine (PDF here). The results were impressive, but since then, some flaws of this brainy tool have come to light—more about that below.

Like it or not, artificially intelligent writing will be part of our future, and will undoubtedly replace entire swaths of the writing profession, such as journalism. Why pay for a reporter when you can collect information from a freelancer, feed the facts into a machine, and watch it churn out a readable story for free? Especially when few reporters these days seem to wear out any shoe leather tracking down a story. Opinion writers echo each other frequently and tiresomely, so why not just program a progressive language bot and let it stir up words to fit the issue?

Especially since ChatGPT is already programmed to be progressive. A reporter for WORLD asked it to write an essay in praise of President Biden, and it obliged at once. An essay in praise of President Trump was beyond its pay grade, however. That, I would say from a neutral stance, is hogwash. And a flaw. The model has all the information on the web at its disposal and could easily find more right-wing opinions to draw from. Could ChatGPT be just another tool of the left to shut down alternative views?

But its handling of obvious facts isn’t completely reliable either. After my column appeared in WORLD, a reader wrote in with a correction. The first assignment I gave the machine was, “Explain the controversy between Athanasius and Pelagius.” ChatGPT dived into early church history with aplomb and spun out a grammatical and historically accurate reply—except for one thing. The debate was between Augustine and Pelagius. Athanasius is best known for an earlier controversy with Arius over the divinity of Christ. I was chagrined by the reader’s comment. I knew it was Augustine and not Athanasius. I was just being sloppy. But so was the machine. I looked at the original interaction and yes, ChatGPT had assumed my mistake and run with it. It gets “smarter” over time as factual errors are corrected, but I’m puzzled as to how it could explain the controversy accurately without gently reminding me that I was mixing up my early church fathers.

The language model is entirely imitative; it learned to “talk” the way any child learns, by listening and responding. In time, little as I’d like to admit it, AI will probably be able to imitate human genius. Its current attempts at poetry and drama are lame, but who’s to say it can’t learn from the best? Will enough Emily Dickinson input produce Dickinsonian output? Or a blockbuster musical comparable to The Phantom of the Opera that will save big bucks on royalties? (The bottom line is the bottom line.)

Some doomsayers see all the professions taken over by machines, including the trades and crafts. Could happen, but only if we let it. ChatGPT can prove itself useful for routine business-related projects, but my main problem with ChatGPT is that it short-circuits thinking skills. Though many high schools and colleges have already outlawed it, clever nerds will continue to find their way around such restrictions. And why not? Why not outsource hours of brain work (if one follows the essay-writing protocols in Wordsmith Craftsman, for example) to a machine that can churn out a B+ essay in minutes? That’s efficiency!

But it isn’t thinking, and there are no shortcuts to thinking. A machine “thinks” in linear fashion; a human can learn to think in ever-widening circles that lead to surprising juxtapositions and unforeseen conclusions. Besides straightforward academic questions, I asked ChatGPT to compose a college entrance essay, a marriage-proposal letter, an approach to my parents about my transitioning from female to male, and advice on how to talk to a spouse with dementia. All the answers were acceptable and reasonable and boilerplate—also a bit spooky. But not creative.

Literate societies create; nonliterate ones sustain; trivial ones fritter away hard-won gains until they’re sitting on the packed-earth floor. Let’s not outsource thinking. Let’s just not.

Janie Cheaney began homeschooling in 1985 and graduated her second and last child in 1996. She is the author of the Wordsmith creative writing series and has published six novels for children. Since 2008 she has written a regular column in WORLD magazine and comments regularly on WORLD radio. She and her husband live in Missouri.

For additional writing tips, exercises, and writing activities, subscribe to the bi-monthly Teachers’ Lounge newsletter at wordsmithseries.com.

Reprinted with permission from Janie B. Cheaney.