Law Enforcement Must Adapt as Criminals Weaponize AI
‘Use what criminals use for bad, for good.’
Patricia Blume
WASHINGTON, July 17, 2025 – Criminals are already using AI. Law enforcement may need to do the same to fight back.
Rep. Andy Biggs, R-Ariz., chairman of the House Judiciary Subcommittee on Crime and Federal Government Surveillance, opened Wednesday's hearing on artificial intelligence and its use in crime by calling it "the first of its kind," noting that Congress had never before held a hearing on the topic.
Witnesses warned that AI-powered crime is already widespread. Generative AI is being used by terrorists to produce propaganda and by fraudsters to generate fake content.
Zara Perumal, co-founder of Overwatch Data, described how criminals today are asking chatbots how to commit crimes, generating synthetic fakes, and using AI to tailor attacks to specific victims through personalized videos and images.
Ari Redbord, Global Head of Policy at TRM Labs, agreed, pointing out that criminals are often among the earliest adopters of new technology. He cited a 456 percent rise in AI-driven scams and called it a national security threat.
Both Perumal and Redbord emphasized that law enforcement must adopt the same technology to counter these attacks, arguing that every federal agent should have access to AI tools.
“The solution is not to ban AI but to use it and use it wisely,” Redbord said. “The future of crime will be defined by AI, but also the future of law enforcement.”
But not everyone on the panel agreed on how far that access should go.
Ranking Member Rep. Lucy McBath, D-Ga., urged caution. She pointed to new facial recognition rules adopted in Detroit, which prevented arrests based solely on facial recognition.
"Many cities and states have put sensible guardrails in place to limit potentially harmful uses of AI," McBath said. "That's why it was alarming when some of my Republican colleagues recently attempted to pass a moratorium on state and local AI regulations in the big, beautiful bill, a move that generated bipartisan opposition, so much that 40 state attorneys general and 17 Republican governors, including the governor of my state of Georgia, wrote letters to stand in opposition."
Cody Venzke, ACLU Senior Policy Counsel, supported McBath's position. He raised concerns about facial recognition technology, stating that it has misidentified Black people, leading to unjustified arrests.
"We engage in civil rights litigation to defend individuals who have been misidentified by facial recognition," Venzke said. "We will always have a similar background process, ensuring that police departments have appropriate processes in place, so that there isn't reliance solely on identification."
Andrew Bowne, former counsel for the Department of the Air Force Artificial Intelligence Accelerator at the Massachusetts Institute of Technology, agreed that the appropriate use of AI in law enforcement varies with the jurisdiction deploying it.
“Using AI in law enforcement depends on the jurisdiction that’s using it,” Bowne said.
To conclude the hearing, Biggs asked the panel how far off the country is from autonomous criminal behavior.
All the witnesses agreed: that day has already arrived.
"Using AI to do crime, we are certainly there," Bowne said.
Redbord added a slight caveat, saying, "Although it is happening today, it is not yet dominant. But we are getting close."