
Here’s how federal law enforcement officials are actually using AI

Speakers at a GDIT conference laid out how the DOJ, FBI, and other federal agencies are already using AI.

Federal law enforcement officials are already using AI tools, and they’re eager to pilot more, despite pesky challenges like admissibility, privacy, and sustainability.

That’s according to federal officials who spoke at General Dynamics Information Technology’s (GDIT) AI conference in DC on June 4, where they explained how agencies like the DOJ, the FBI, and the State Department approach implementing AI technologies.

While the federal government has introduced broad guidelines on AI usage, they largely boil down to following whatever policy was already in place in a given department. Melinda Rogers, chief information officer of the DOJ, told attendees that buzz around AI doesn’t translate into easy gains.

“I would caution chasing the shiny new object,” Rogers said. “I’ve always said the technology part is the easiest part.” Far more difficult, she told attendees, is creating processes that get people to use that technology effectively.

Justin Williams, the deputy assistant director of the FBI’s information management division, said the agency has to be careful when procuring products with AI or machine learning components, as black-box processes won’t stand up in court.

“Imagine if one of our agents goes into a court, and he’s asked, ‘How did you come across this information?’ and we throw our hands up and go, ‘Well, I don’t know. We just typed it on the computer and spit it out,’” Williams told attendees.

Other considerations include privacy. The FBI’s Criminal Justice Information Services Division holds over 30 petabytes of data at any given time, Williams said, but access is restricted even for FBI employees, let alone for training AI models.

“There’s nothing like the desire to throw all that data into this massive data lake and see what comes out of it,” Rogers said. “That would be super cool and fun, but it would not be lawful.”

The FBI has already found some uses for AI, however. Cynthia Kaiser, the deputy assistant director of the FBI’s Cyber Division, told attendees the FBI tip line uses AI to review calls for anything a human might have missed.

“We’re using natural language processing models to also go over the synopses of the text of what that phone call or online tip entailed,” Kaiser said.

While a human is always involved, “It helps us fill in the cracks,” Kaiser added. She said future AI use cases at the FBI might include summarizing interagency intelligence or subpoena returns.
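
Kaiser didn’t describe the bureau’s models, but the general pattern of running tip synopses through an NLP model and flagging anything that merits human follow-up can be sketched in a few lines. The Python snippet below is purely illustrative: it leans on an off-the-shelf zero-shot classifier from the Hugging Face transformers library, and the triage labels and score threshold are hypothetical stand-ins, not anything the FBI has disclosed.

```python
# Illustrative sketch only: triage tip-line synopses with a generic zero-shot
# classifier. The FBI's actual models, labels, and thresholds are not public;
# every name below is a hypothetical stand-in.
from transformers import pipeline

# Hypothetical triage categories a reviewer might care about.
CANDIDATE_LABELS = ["cyber intrusion", "fraud", "violent threat", "no actionable threat"]

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

def flag_for_review(synopsis: str, threshold: float = 0.6) -> dict:
    """Score one synopsis and mark it for human review if a threat label clears the threshold."""
    result = classifier(synopsis, candidate_labels=CANDIDATE_LABELS)
    top_label, top_score = result["labels"][0], result["scores"][0]
    return {
        "top_label": top_label,
        "score": round(top_score, 3),
        "needs_review": top_label != "no actionable threat" and top_score >= threshold,
    }

if __name__ == "__main__":
    tip = "Caller reports repeated login attempts against a water utility's control system."
    print(flag_for_review(tip))
```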

Gharun Lacy, deputy assistant secretary for cyber and technology security at the Bureau of Diplomatic Security, said the State Department’s data, AI, and analytics teams have piloted using AI to summarize incoming intel.

It was successful in “reducing the noise in terms of threat intelligence, sifting through it, making it more actionable directly for us,” Lacy said.
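
Lacy didn’t name the tooling behind the pilot, so the sketch below is only an assumption about what “reducing the noise” can look like in practice: a handful of made-up intel snippets condensed into a short digest with a generic summarization model from the transformers library, not a description of the State Department’s system.

```python
# Illustrative only: condense a pile of raw threat-intel snippets into a short
# digest with a generic summarization model. The reports below are invented,
# and the State Department pilot's actual tooling is not public.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

reports = [
    "Phishing campaign spoofing consular email addresses reported in three regions.",
    "Partner agency observes the same campaign; the payload is a credential harvester.",
    "Separate advisory covers a patched VPN flaw with no signs of active exploitation.",
]

# One pass over the concatenated reports yields a shorter, more actionable digest.
digest = summarizer(" ".join(reports), max_length=60, min_length=15, do_sample=False)
print(digest[0]["summary_text"])
```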

Sarah Nur, associate CIO and CISO at the Treasury Department, told attendees generative AI has already enabled relatively unsophisticated threat actors to rapidly query new attack methods.

That will likely force the Treasury and financial institutions to adopt AI cybersecurity tools, Nur said: “We really have to look at leveraging AI to quickly detect these anomalies and any kind of fraud or unusual suspicious activity.”
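
Nur didn’t cite specific products, and there are many ways to build that kind of detection; one common unsupervised building block is an isolation forest, which scores how unusual a record looks relative to a baseline. The sketch below, which runs scikit-learn’s IsolationForest over invented transaction features, is illustrative only.

```python
# Illustrative only: unsupervised anomaly scoring over transaction features with
# scikit-learn's IsolationForest. The features and data are invented; real
# fraud-detection pipelines are far more involved than this.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic baseline of "normal" activity: [amount_usd, hour_of_day, transfers_last_24h]
normal = np.column_stack([
    rng.normal(120, 40, 1000),   # typical dollar amounts
    rng.integers(8, 20, 1000),   # business hours
    rng.poisson(2, 1000),        # a few transfers per day
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# Score new transactions: predict() returns -1 for records the model flags as anomalous.
new_batch = np.array([
    [135.0, 14, 2],     # looks ordinary
    [9800.0, 3, 40],    # large amount at 3 a.m. with a burst of transfers
])
print(model.predict(new_batch))        # e.g. [ 1 -1]
print(model.score_samples(new_batch))  # lower scores are more anomalous
```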

Kathleen Noyes, FBI section chief in charge of Internet Governance and Standards, Lawful Access, and Patent Programs, said the agency’s evaluation processes for new technologies include a Shark Tank-style pitch meeting format.

David Miller, the FBI’s interim chief technology officer, told GDIT attendees that after years of developing technology model evaluations for AI, the FBI and its ethics council are now focused on evaluating AI at scale.

AI is “not a standalone capability,” Miller said. “It is a component of your existing solutions.”
