Utah Police Say AI Wrote a Report Claiming an Officer Turned Into a Frog

HEBER CITY, UT – A Utah police department has admitted that the experimental AI-driven software meant to ease the tedium of police report writing isn’t without its problems, after a report written by the software reportedly claimed an officer transformed into a frog.

The speed at which AI has progressed in recent years is undoubtedly remarkable. From image and video generation to large language model (LLM) systems, the possibilities of these tools when integrated into various workplaces have yet to be fully realized, but trial runs across differing industries have shown that there’s still work to be done when working alongside AI.

From court filings drafted by AI that fabricated case citations, to stars like Will Smith being exposed for using AI to enhance crowd sizes at his musical performances, where generated attendees looked deformed, AI remains an imperfect tool. In the latest instance of AI gone slightly awry, the Heber City Police Department (HCPD) in Utah learned what happens when certain background audio is picked up by its AI report-writing software.

According to HCPD officials, the department began experimenting with AI-powered software called Draft One, which aims to handle the often arduous task of writing police reports based on bodycam footage fed to the program. However, while an observant officer reviewing bodycam footage can distinguish audio from sources like a movie playing in the background from direct statements made by witnesses on scene, Draft One apparently has some difficulty doing so.

After news spread that an AI-generated police report from HCPD required manual correction after initially claiming that an officer transformed into a frog during a call, officials from the department explained the source of the snafu.

According to HCPD Sergeant Rick Keel, “The body cam software and the AI report writing software picked up on the movie that was playing in the background, which happened to be ‘The Princess and the Frog.’ That’s when we learned the importance of correcting these AI-generated reports.”

Despite the error discovered in the AI-generated police report, Sergeant Keel still sees potential in AI-driven solutions for police work, noting that the report-writing software has personally saved him an average of “six to eight hours” of work per week when operating as intended.

A second AI-driven solution being tested by the department is called Code Four, which functions in a similar fashion to Draft One, handling report writing when fed bodycam footage. While the trial run for that tool wraps up in February, Sergeant Keel has confirmed the department will continue to use AI tools, although it hasn’t settled on which of the two it will go with long term.

Nonetheless, while these AI solutions look promising to police agencies across the country, the adoption of such tools is not without its critics. This past July, the Electronic Frontier Foundation published a piece on concerns revolving specifically around the Draft One AI software.
 