New State Law Requires Additional Safeguards When Police Use Generative AI

A newly signed California law requires law-enforcement agencies to be transparent when they use generative AI to produce or help prepare police reports. The law, known as Senate Bill 524 (SB 524), requires a clear disclosure on each page of any report prepared “fully or in part” by AI, identifying which program was used and stating: “This report was written either fully or in part using artificial intelligence.”

In addition to the disclosure requirement, SB 524 obliges agencies to retain both the original AI draft and the final officer-reviewed version of the report for the same retention period. It also requires an audit trail linking the person who generated the draft, any edits made, and the audio or video footage fed into the AI system (for example, body-worn or dash-mounted camera footage).

Advocates argue the law enhances accountability and transparency: because police reports are foundational to criminal investigations and judicial decisions, knowing when AI was involved is critical for due process. Opponents, including police unions and chiefs’ associations, warn that the measures may introduce administrative burdens, undermine officer credibility by implying that AI-assisted reports are less reliable, and complicate report workflows.

About the author

TOOLHUNT

Effortlessly find the right tools for the job.
