The King County Prosecuting Attorney’s Office in Seattle has announced that it will not accept police reports generated using artificial intelligence (AI), citing concerns about potential errors. A memo from Daniel J. Clark, Chief Deputy of the Mainstream Criminal Division, highlighted issues such as references to officers who were not present at the scene, and the possible consequences for cases, communities, and officers. The decision reflects a national discussion about the use of AI technologies in law enforcement and specifically mentioned tools such as OpenAI’s ChatGPT and Axon’s Draft One.

The memo was distributed to members of the King County Police Chiefs’ & Sheriff’s Association, emphasizing that all police reports must be produced entirely by the authoring officer without the assistance of AI. Clark acknowledged the time-consuming nature of writing police narratives and the staffing challenges many departments face, but expressed concerns about the accuracy and privacy implications of using AI-generated reports. The use of AI in law enforcement must comply with Criminal Justice Information Services (CJIS) regulations, and products like ChatGPT and Axon Draft One may not meet these standards.

Specific concerns were raised about Axon Draft One, which uses AI to review body-worn camera audio and compile a draft narrative for officers to edit and approve. Issues with the technology, such as errors and “hallucinations,” as well as the absence of a preserved record of the original draft and the changes an officer made to it, raised doubts about the accuracy of the final reports. The memo warned against officers certifying false reports, which could have serious consequences for cases, communities, and officers, including Brady/PID ramifications.

The King County Prosecuting Attorney’s Office has engaged with Axon and participates in a national committee addressing AI concerns in law enforcement. While AI technology has the potential to assist law enforcement in the future, the current usage of AI in generating police reports is not deemed acceptable due to the risks involved. The decision not to accept AI-generated police reports will remain in place until concerns are addressed and the technology is further developed. The office is open to discussions and further dialogue on this issue with law enforcement partners.

Axon, the company behind Draft One, responded to the concerns raised in the memo by highlighting the safeguards built into its AI model to ensure accuracy and accountability in the information generated. The company emphasized that police narrative reports remain the responsibility of officers and must be edited, reviewed, and approved by a human officer. Axon stated that Draft One was created with feedback from its Ethics and Equity Advisory Council and that studies demonstrated high-quality report narratives. It expressed a commitment to collaborating with key stakeholders to gather feedback and perspectives on the use of AI technologies in law enforcement and the justice system.
