AI police tool is designed to avoid accountability, watchdog says
On Thursday, a digital rights group, the Electronic Frontier Foundation, published an expansive investigation into AI-generated police reports that the group alleged are, by design, nearly impossible to audit and could make it easier for cops to lie under oath.
Axon's Draft One debuted last summer at a police department in Colorado, immediately raising questions about how AI-written police reports could negatively impact the criminal justice system. The tool relies on a ChatGPT variant to generate police reports based on body camera audio, which cops are then supposed to edit to correct any mistakes, assess the AI outputs for biases, or add key context.
But the EFF found that the tech "seems designed to stymie any attempts at auditing, transparency, and accountability." Not every department requires cops to disclose when AI is used, and Draft One does not save drafts or retain a record showing which parts of reports are AI-generated. Departments also don't retain different versions of drafts, making it difficult to compare one version of an AI report to another and help the public determine whether the technology is "junk," the EFF said. That raises the question, the EFF suggested, "Why wouldn't an agency want to maintain a record that can establish the technology’s accuracy?"
It's currently hard to know if cops are editing the reports or "reflexively rubber-stamping the drafts to move on as quickly as possible," the EFF said. That's particularly troubling, the EFF noted, since Axon disclosed to at least one police department that "there has already been an occasion when engineers discovered a bug that allowed officers on at least three occasions to circumvent the 'guardrails' that supposedly deter officers from submitting AI-generated reports without reading them first."
The AI tool could also be "overstepping in its interpretation of the audio," the EFF warned, misinterpreting slang or adding context that never happened.
A "major concern," the EFF said, is that the AI reports can give cops a "smokescreen," perhaps even allowing them to dodge consequences for lying on the stand by blaming the AI tool for any "biased language, inaccuracies, misinterpretations, or lies" in their reports.
"There’s no record showing whether the culprit was the officer or the AI," the EFF said. "This makes it extremely difficult if not impossible to assess how the system affects justice outcomes over time."
According to the EFF, Draft One "seems deliberately designed to avoid audits that could provide any accountability to the public." In one video from a roundtable discussion the EFF reviewed, an Axon senior principal product manager for generative AI touted Draft One's disappearing drafts as a feature, explaining, "we don’t store the original draft and that’s by design and that’s really because the last thing we want to do is create more disclosure headaches for our customers and our attorney’s offices."
The EFF interpreted this to mean that "the last thing" that Axon wants "is for cops to have to provide that data to anyone (say, a judge, defense attorney or civil liberties non-profit)."
"To serve and protect the public interest, the AI output must be continually and aggressively evaluated whenever and wherever it's used," the EFF said. "But Axon has intentionally made this difficult."
The EFF is calling for a nationwide effort to monitor AI-generated police reports, which are expected to be increasingly deployed in many cities over the next few years, and has published a guide to help journalists and others submit records requests to monitor police use of the tool in their areas. But "unfortunately, obtaining these records isn't easy," the EFF's investigation confirmed. "In many cases, it's straight-up impossible."
Axon did not respond to the EFF's requests for information or Ars' request to comment.
“Police should not be using AI”
Expecting that Axon's tool would spread fast—marketed as a supposedly time-saving add-on service to police departments that already rely on Axon for tasers and body cameras—the EFF's senior policy analyst Matthew Guariglia told Ars that the EFF quickly formed a plan to track adoption of the new technology.
Over the spring, the EFF sent public records requests to dozens of police departments believed to be using Draft One. To craft the requests, the group also reviewed Axon user manuals and other materials.
In a press release, the EFF confirmed that the investigation "found the product offers meager oversight features," including a practically useless "audit log" function that seems to contradict police norms surrounding data retention.
Perhaps most glaringly, Axon's tool doesn't allow departments to "export a list of all police officers who have used Draft One," the EFF noted, or even "export a list of all reports created by Draft One, unless the department has customized its process." Instead, Axon only allows exports of basic logs showing actions taken on a particular report or an individual user's basic activity in the system, like logins and uploads. That makes it "near impossible to do even the most basic statistical analysis: how many officers are using the technology and how often," the EFF said.
Any effort to crunch the numbers would be time-intensive, the EFF found. In some departments, it's possible to look up individual cops' records to determine when they used Draft One, but that "could mean combing through dozens, hundreds, or in some cases, thousands of individual user logs." And it would take a similarly "massive amount of time" to sort through reports one by one, considering "the sheer number of reports generated" by any given agency, the EFF noted.
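To illustrate why per-user logs matter for the "basic statistical analysis" the EFF describes, a combined, machine-readable export would make the tally trivial. The sketch below is hypothetical: the CSV columns, the `draft_one_generate` action label, and the log format are all invented for illustration, since Axon's actual exports are fragmented per user or per report and, per the EFF, cannot be aggregated this way.

```python
import csv
import io
from collections import Counter

# Hypothetical combined audit log. The column names and the
# "draft_one_generate" action label are invented for illustration;
# real Axon exports are per-user or per-report and differ entirely.
SAMPLE_LOG = """user_id,action,timestamp
officer_042,draft_one_generate,2025-03-01T14:22:05
officer_042,login,2025-03-01T14:20:11
officer_007,draft_one_generate,2025-03-02T09:15:30
officer_042,draft_one_generate,2025-03-03T16:05:44
"""

def tally_draft_one_usage(csv_text):
    """Count AI-generated drafts per officer from a combined CSV log."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(
        row["user_id"]
        for row in reader
        if row["action"] == "draft_one_generate"
    )

counts = tally_draft_one_usage(SAMPLE_LOG)
print(dict(counts))  # how many officers use the tool, and how often
```

A few lines of code answer the question "how many officers are using the technology and how often" — which is why the absence of any such combined export is the crux of the EFF's auditability complaint.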
In some jurisdictions, cops are legally required to disclose when AI is used to generate reports, and some departments require it as a matter of policy, the EFF found. That made the documents more easily searchable and, in turn, made some police departments more likely to respond to public records requests without charging excessive fees or requiring substantial delays. But at least one department in Indiana told the EFF that "we do not have the ability to create a list of reports created through Draft One. They are not searchable."
While not every cop can search their Draft One reports, Axon can, the EFF reported, suggesting that the company can track how much police use the tool better than police themselves can.
The EFF hopes its reporting will curtail the growing reliance on shady AI-generated police reports, which Guariglia told Ars risk becoming even more common in US policing without intervention.
In California, where some cops have long been using Draft One, a bill has been introduced that would require disclosures clarifying which parts of police reports are AI-generated. That law, if passed, would also "require the first draft created to be retained for as long as the final report is retained," which Guariglia told Ars would make Draft One automatically unlawful as currently designed. Utah is weighing a similar but less robust initiative, the EFF noted.
Guariglia told Ars that the EFF has talked to public defenders who worry how the proliferation of AI-generated police reports is "going to affect cross-examination" by potentially giving cops an easy scapegoat when accused of lying on the stand.
To avoid the issue entirely, at least one district attorney's office in King County, Washington, has banned AI police reports, citing "legitimate concerns about some of the products on the market now." Guariglia told Ars that one of the district attorney's top concerns was that using the AI tool could "jeopardize cases." The EFF is now urging "other prosecutors to follow suit and demand that police in their jurisdiction not unleash this new, unaccountable, and intentionally opaque AI product."
"Police should not be using AI to write police reports," Guariglia said. "There are just too many questions left unanswered about how AI would translate the audio of situations, whether police will actually edit those drafts, and whether the public will ever be able to tell what was written by a person and what was written by a computer. This is before we even get to the question of how these reports might lead to problems in an already unfair and untransparent criminal justice system."