
Artificial intelligence can help to gather data and increase efficiency, but it comes with unique liability risks. Reaping the benefits while minimizing exposure will be an ongoing insurance challenge, according to experts at Munich Re Specialty.
Insuring law enforcement agencies has always required a large appetite for risk. Given the dangerous nature of the job, changing laws and regulations, and the role of reputation in helping officers do their jobs safely and effectively, carriers have a big task in evaluating and covering the exposures a law enforcement agency faces.
Artificial intelligence and data analytics are now reshaping what those exposures look like.
On one hand, any new technology used by police forces and underwriters to gather, organize, and analyze data helps underwriters assess risk more accurately and provides crucial information when a claim arises. On the other hand, every new technology carries a downside: the potential to fail, or to be hacked by bad actors.
Parsing the pros and cons of new analytical tools will be an ongoing process as technology continues to evolve, but forward-looking insurers are ready to harness the opportunities.
The challenges of underwriting law enforcement
A shifting legal and regulatory environment significantly impacts the risk profile of law enforcement agencies.
“Some states have limited or eliminated qualified immunity for police officers. Some require state authorities to implement a decertification process to revoke an officer's certification or license should an officer commit misconduct or fail to meet requirements. Transparency and freedom of information laws differ from state to state,” said Thanh Hoang, SVP, Public Entity Risk Solutions underwriter, Munich Re Specialty – North America. “Changes in the legal and regulatory landscape mean liability risk exposures are always a moving target.”
The cost of law enforcement claims also continues to rise, driven in part by legal system abuse such as third-party litigation financing and the effects of nuclear verdicts.
“We are starting to see these claims settling a lot higher, especially post-COVID. That certainly impacts law enforcement coverages,” Hoang said.
Tracking these changes and staying on top of compliance with state laws and regulations requires police forces to collect and consolidate a lot of information. This is where technology can make a difference.
The impact of AI on underwriting
“As underwriting law enforcement becomes more complex, underwriters are asking for a lot more information, and the data comes from multiple different sources,” Hoang said.
AI-enabled platforms implemented by police forces and underwriters help to collect that data, and sort and organize it according to set criteria. This allows underwriters to more quickly identify the information they need to assess risk.
“Those criteria could include factors such as use of tasers or chemical sprays, the use of deadly force, the frequency of civilian complaints, etc. AI can also quickly identify characteristics of the workforce, such as prior decertification in other jurisdictions. It has the power to examine the community relationship by scanning social media posts as they relate to police incidents and regulatory changes. That’s difficult to monitor, and AI does a good job of gathering the information that underwriters need,” Hoang said.
By taking over the time-consuming and tedious work of gathering and organizing data from submissions and public records in accordance with state laws and regulations, AI enables underwriters to focus more of their time and energy on evaluating complex risks and working directly with clients to build those relationships.
AI in risk mitigation
The adoption of AI technology among law enforcement agencies also aids in risk mitigation. In particular, body cameras have helped to collect crucial data that can protect officers in the event of a lawsuit. They also support more accurate event reporting, and the more complete documentation they produce helps to lessen the burden of compliance.
“AI deployed in body cam technologies might help build defense against potential lawsuits, especially because you can view an event from multiple angles from each camera present on the scene,” said Daniel Foster, SVP, casualty loss control expert, Munich Re Specialty – North America.
“AI technology can also analyze that footage to create coaching opportunities, so a chief or a trainer can go back to officers and help with corrective action before an incident or a loss happens.”
Unique risks of AI
As the adoption of AI technology by law enforcement organizations increases, the effects of any failures have the potential to exacerbate existing problems and to create new ones.
“We are still determining what kind of losses can be caused by AI. If AI malfunctions, who does it impact and how?” Foster said. “A top concern is the potential for AI-related lawsuits due to things like misidentification by facial recognition technologies and other privacy concerns.”
In some cases, there is also the potential for bodily injury. AI-assisted robotic dogs, for example, could cause harm in the event of a malfunction. Even simpler applications of AI, such as compiling reports using ChatGPT, carry the possibility of inaccuracies that raise the risk of noncompliance with state and local laws, as well as reputational damage.
In cases like these, it is not yet clear who would be held responsible: the user of the technology or its creator.
Questions around liability resulting from the use of AI will likely be determined by courts as loss events inevitably occur. In the meantime, the risk can be minimized through the development of clear guidelines governing how AI is used, applied, and controlled.

Underwriters will certainly want to see what controls are in place around the use of AI by law enforcement, and how any information generated by AI is being validated or verified.
A forward-looking approach to insurance
AI will only continue to evolve. Harnessing its benefits while minimizing its risks will be an ongoing challenge for users and insurers alike. While the industry is only beginning to grapple with AI alongside brokers and clients, Munich Re Specialty has taken a forward-looking approach, exploring potential use cases for AI within law enforcement and tracking relevant regulatory changes.
“We have taken seriously the direction that we're seeing law enforcement go with the incorporation of artificial intelligence, technology, and a data-driven approach. There have also been a lot of regulatory changes related to AI and law enforcement, particularly at the state level. We have to evolve and adapt to these changes. So we've been building out modeling to help us underwrite these risks going forward,” Hoang said.

We take a holistic approach. We're looking at a lot of factors and proactively working to create solutions for underwriting a challenging law enforcement cover. That might involve crafting new structures for existing coverages or looking to provide more capacity.