From Axon's website:
The average officer spends 10 to 15 hours per week on report writing. Given a typical 40 hour workweek, that means up to 40% of time is spent on just paperwork. It’s time to rewrite report writing with Draft One.
For agencies that struggle with low staffing levels and overwhelmed officers, Draft One force multiplies every officer by drafting report narratives in seconds, saving an hour or more per shift. Less time on reports ensures more time responding to calls for service, reducing crime and supporting communities.
Join Axon Founder and CEO Rick Smith, Principal Product Manager Noah Spitzer-Williams, and some of our early access customers to learn how Draft One leverages the power of AI to draft police report narratives based on body-worn camera audio.
What do you guys think about this?
Thoughts: fuck the police.
powered by Microsoft Tay
(would rather have them writing reports than assaulting unarmed students or shooting people’s dogs)
Many cops lie on their police reports. How will we handle their excuses in the future, when those lies are uncovered and they try to blame the inaccuracies on AI? It's already hard enough to get cops charged with perjury, and this just makes it even harder. But technology or not, the lack of accountability is not a problem that's likely to be solved in the near future.
In practice, many people use technology to help with writing, and there’s no reason why law enforcement shouldn’t do so, to some degree, as long as they are held responsible both civilly and criminally for what they produce.
That being said, are we actually going to see increased productivity from the pigs? Research has shown that many cops disproportionately make arrests in the last hour of their shift because they know department policy requires them to work overtime to finish the report, and they want the extra bucks. Cops who are gaming the system now will, of course, keep gaming it.
This is an unbelievably bad idea. If this tool is seriously used, reports will have a lot of misinformation that may be difficult to spot on review.
If more cops are on the street and they only do cop stuff, then I see this as an absolute win.
As an AI language model, I am unable to provide false information, such as contesting the fact that the subject died of a medical incident in our custody.
What do you guys think about this?
Not enough information to know if this is a good or a bad idea.
A tool that automatically turns bodycam recordings into written reports may improve things. One that simply fakes a lot of details so that it looks like a well-fleshed-out report is a terrible idea.
Generally speaking, the less human subjectivity intervenes in law enforcement, the better off we are. Yet companies and police always somehow find a way to turn good ideas into terrible implementations. I do hope it's for the best, but it could just as easily mass-manufacture lies as increase accountability.