
AI and Journalism: Keeping God First

By Akosua Frempong

Last year, I attended a conference at the Museum of the Bible in Washington, D.C. The conference was about artificial intelligence and Bible translations.

Several speakers addressed the need for accuracy through human supervision while highlighting the efficiency of AI tools for the sake of the Gospel: reaching the unreached by translating the Bible into low-resource languages (those for which translators and developers have little or no data). The aim? To reach these communities by 2033.

So, for these experts, the keys to effective AI use were efficiency (speed) and accuracy. While these individuals weren’t journalists, what they said translates well into our journalism work.

AI and AI tools for journalists

According to McKinsey & Company, AI is “a machine’s ability to perform the cognitive functions we associate with human minds, such as perceiving, reasoning, learning, interacting with the environment, problem-solving, and even exercising creativity.” This includes generative AI, an AI model that generates content in response to a prompt or question.

In journalism, several AI tools can enhance speed and accuracy: Grammarly (for copy editing), Otter (for transcription), and PeopleAI (a chatbot that lets journalists research famous individuals worldwide). For those interested, there are also text-to-speech tools (like Murf) and drawing, art, or image-generating tools (like Google AutoDraw).

How news outlets and their journalists use AI

News media organizations (both mainstream and Christian) use these and other AI tools for researching, writing alternative text for accessibility for the visually impaired, finding coverage gaps, and translating articles into Spanish and Portuguese to serve their Latin American audiences. Reuters news agency, for example, uses AI voice translation tools for this latter purpose.

Other news organizations use AI tools to write headlines (and to improve search engine optimization so that their stories and companies appear higher in search results) and story summaries, giving readers a snapshot of a news article at its beginning. AI summaries are especially helpful for busy readers who cannot go through an entire article but still want its key points. Additionally, news organizations are using AI tools to curate and recommend articles of interest to their readers.

However, since AI use can present numerous ethical challenges (including bias, inaccuracies, public mistrust, and intellectual property infringement), several news organizations have policies in place (and organizations like the Poynter Institute have created templates) to ensure the responsible use of AI by staff and freelancers. For instance, Christian Daily International maintains a list of permissible and impermissible uses of AI. The organization, for example, states that its journalists cannot use AI to write entire articles (news or opinion). On the other hand, those for whom English is a second language may use a tool like ChatGPT to improve their English expression.

Ethical concerns and solutions for text

Clearly, using AI tools to write stories is an ethical trap. As Christians, we know that we’re made in the image of God (Genesis 1:26-28). For that reason, we have the creative and thinking abilities, as well as the training, to write compelling stories. As Yvonne Carlson, who serves on the NRB Board of Directors and the NRB Digital Media Committee, mentioned during an NRB discussion on the topic, the Holy Spirit can move through us when we do the writing ourselves.

Another concern with using AI tools is inaccuracy, often called “hallucinations,” or errors. We’ve all had times when Grammarly, for instance, suggested a correction that wasn’t accurate. Thankfully, we had the grammatical aptitude to recognize that the suggestion was wrong. We’ve also used tools like Otter, which transcribed sources saying something they didn’t say. What about using ChatGPT to research an individual? You may find that the citations it provides for you to dig deeper aren’t related to the individual you’re researching. These are all concerns with which journalists are grappling.

One way to mitigate inaccuracies is verification, or fact-checking. Using multiple sources to confirm that the information is correct is essential. Reaching the source directly, either personally or through the individual’s website, is also ideal.

Ethical concerns and solutions for images

The same applies to images. If you come across an image and you’re not sure about its origin, confirm it. Do a reverse image search using, for instance, Google’s image search tool. If the image exists, you’ll see it on the web, attributed to a source, which can help you attribute it accordingly. If it isn’t available online, that’s a sign it may have been AI-generated. In such cases, it’s better not to use the image, since you wouldn’t be able to attribute it to a precise source.

Moreover, you should always find out (either from the news outlet’s website or directly from the editor with whom you’re working) what AI policies are in place. If there are none, you could tell the editor how you intend to use AI, based on what you know to be best practices and ethical uses by other media organizations, and ask if he or she is okay with that AI usage.

Looking ahead

Although God has blessed us with AI tools to make our work easier, we must be conscious of how we use them. We must ensure we’re glorifying God through how we use AI tools. This means that we must make sure we’re doing what God has gifted us to do (writing, researching for depth, verifying, double-checking for grammatical accuracy, interviewing and photographing), while using AI tools to assist us in doing those (or related) tasks efficiently to meet tight deadlines.

As we look forward to the future with AI tools, let’s remember this: While using technology is essential, we should always keep God’s Word—yes—first.


Akosua Frempong, Ph.D., is an EPA freelance journalist and founder of Listening Ear Communications. She’s also an adjunct journalism professor at Regent University, Virginia. Additionally, she’s a trainer for Magazine Training International, including training Christian industry professionals on AI and journalism. She’s a previous Jerry Jenkins Scholarship recipient. Learn more at listeningearcommunications.com.

Posted December 4, 2025
