Artificial intelligence (AI) is generating both change and fear in local newsrooms, and that is one reason the Associated Press is studying and supporting AI development in journalism.
Ernest Kung, the AI product manager for The Associated Press, helps local newsrooms integrate artificial intelligence into their workflows. He said he has noticed a dramatic increase in interest in AI among local newsrooms.
“Some of them have been very enthusiastic about learning about AI; some have been not so enthusiastic; some have been very wary,” Kung said, noting that the wariness generally comes from a fear that AI may take their jobs.
Kung said AP is aware of that fear.
“There is no substitute, and this is the position that AP has, there is no substitute for on-the-ground reporting. That is, that’s what we tell everybody, and we stand by it. There are situations where we see machines, algorithms being able to help us out, help us do our jobs more efficiently.”
Kung suggests those interested in using AI to make journalists’ jobs easier should identify a challenge and then figure out what skills, technologies, and other resources are needed to address it.
“The skills will depend highly on what type of tool they are going to adopt. But then, that itself is a skill, to be able to know what you need. Then, the next step is identifying a potential technological solution that can address that gap.”
Kung breaks it down into a three-step process:
- Identify the problem, challenge, or inefficiency.
- Research the potential technological solutions.
- Determine the necessary skills and other resources needed for implementation.
In some cases, implementation will require skills that a newsroom just doesn’t have in-house. AP is currently working with five newsrooms on generative AI projects: the Brainerd Dispatch, El Vocero de Puerto Rico, KSAT-TV, WFMZ-TV, and WUOM-FM.
“Four of them require us to be able to have a developer who understands, among other things, coding in Python. They need to know how to pull code from GitHub and run it and set up the infrastructure to run it on their own for their own purposes. So, the technical requirements really depend on the tools,” Kung said.
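For readers unfamiliar with that workflow, the sketch below is a hypothetical illustration in Python, not code from any of these projects: it shows the kind of steps Kung describes, cloning an open-source tool from GitHub, installing its dependencies, and running it locally. The repository URL, script name, and input file are placeholders.

```python
# Hypothetical sketch of the kind of setup a newsroom developer might script:
# clone an open-source tool from GitHub, install its dependencies, and run it.
# The repository URL, script name, and input file are placeholders, not an AP project.
import subprocess
import sys
from pathlib import Path

REPO_URL = "https://github.com/example-org/example-newsroom-tool.git"  # placeholder
WORKDIR = Path("newsroom-tool")

def run(cmd: list[str]) -> None:
    """Run a command and stop if it fails."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

if not WORKDIR.exists():
    run(["git", "clone", REPO_URL, str(WORKDIR)])            # pull code from GitHub
run([sys.executable, "-m", "pip", "install", "-r",
     str(WORKDIR / "requirements.txt")])                      # install its dependencies
run([sys.executable, str(WORKDIR / "run_tool.py"),
     "--input", "city_council_audio.mp3"])                    # run it for the newsroom's own purpose
```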
Regardless of how complex the AI solution is, Kung says implementing projects takes real effort.
“We actually went through and systematically evaluated all the tools based on what they actually purported to do and then tested them out to figure out if they could accomplish what we needed to do.”
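The evaluation Kung describes can be as simple as a structured checklist: record what each tool claims to do, then note whether a hands-on test bore that out. The sketch below is a minimal, hypothetical illustration of that approach, not AP’s actual process; the tool names and criteria are invented.

```python
# Hypothetical checklist-style evaluation: compare what a tool purports to do
# against what it actually did in testing. All names and notes are invented.
from dataclasses import dataclass

@dataclass
class ToolEvaluation:
    name: str
    claimed_capability: str
    passed_test: bool
    notes: str = ""

evaluations = [
    ToolEvaluation("Transcriber X", "accurate meeting transcripts", True,
                   "handled council audio well"),
    ToolEvaluation("Summarizer Y", "one-paragraph story summaries", False,
                   "dropped key attribution"),
]

# Keep only the tools whose claims held up in testing.
shortlist = [e.name for e in evaluations if e.passed_test]
print("Tools that did what they claimed:", shortlist)
```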
Once the tool is selected and implemented, there is one more key ethical consideration, according to Kung: making sure a human reviews anything the technology generates before it reaches the audience, rather than letting it publish automatically. “If a tool is unable to follow the rules that we have established for ethics, for example, if it accidentally reveals the sources of our interviews, when those interviews were meant to be confidential, that would be pretty bad.”
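The principle Kung describes, that nothing AI-generated reaches the audience without a human sign-off, maps onto a simple review-queue pattern. The sketch below is a minimal illustration of that idea, not AP’s system; all names are hypothetical.

```python
# Minimal sketch of a human-review gate: AI-drafted text sits in a queue and is
# only published after a named editor approves it. Hypothetical, not AP's system.
from dataclasses import dataclass, field

@dataclass
class Draft:
    text: str
    approved: bool = False
    notes: list[str] = field(default_factory=list)

class ReviewQueue:
    def __init__(self) -> None:
        self.pending: list[Draft] = []
        self.published: list[Draft] = []

    def submit(self, text: str) -> Draft:
        """AI-generated drafts land here; they never publish automatically."""
        draft = Draft(text)
        self.pending.append(draft)
        return draft

    def approve(self, draft: Draft, editor: str) -> None:
        """Only a named human editor can move a draft out of the queue."""
        draft.approved = True
        draft.notes.append(f"approved by {editor}")
        self.pending.remove(draft)
        self.published.append(draft)

queue = ReviewQueue()
d = queue.submit("AI-drafted summary of the school board meeting...")
# The draft stays in queue.pending until an editor signs off:
queue.approve(d, editor="night desk")
```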
Kung said the nuances of journalism practice are complex enough that using a tool to handle every aspect of the job would likely mean something important gets missed, with potentially serious repercussions.
“When you are developing AI, probably the biggest one of the many ethics concerns, the big one is [develop] a tool that does not attempt to replace humans.”