
The Rough Notes Company Inc.



December 27, 2018

ISO Emerging Issues Perspective

By Rick Stoll and Diane Injic


Agents can be gatekeepers

From tiny acorns big oaks can grow. Errors tend to work in the same way. Inaccuracies in rating due to incomplete or faulty information drawn from policyholders can lead to missing premiums and unexpected claims. Two real-world examples shed light on how leaks can begin undetected—and then drain the bottom line.

An agent lands a commercial account and places the general liability coverage with an insurer at an annual premium of $134. It’s a modest premium befitting a modest risk based on the information supplied by the policyholder to the insurer. But with the appropriate underwriting and rating information, the premium should be closer to $7,000.

Another agent classifies a fleet of 96 commercial vehicles as local to Raleigh, North Carolina, assuming the vehicles primarily operate within a 50-mile radius. But the policyholder’s drivers spend most of their time much farther afield, ranging as far as Charlotte, 150 miles away. The potential missing premium could be as much as $85,000.

Those scenarios may sound far-fetched, but both actually happened. And while the gap in numbers may seem extraordinary, similar missteps occur regularly, if not always on the same scale. Premium leakage is a multibillion-dollar problem that costs many commercial insurers dearly in uncollected revenue, with a side effect of misclassified risk that may take a further unexpected toll in claims.


Verisk’s recent market analyses of commercial property and commercial auto uncovered four-year premium leakage of $4.5 billion and $6.4 billion, respectively, on just those two lines of business. The study examined only three rating criteria across the two lines, although many more are believed to generate leakage.

The account in the first example, based on limits of $1 million per occurrence and $2 million aggregate, was assigned a NAICS (North American Industry Classification System) code denoting lessors of nonresidential buildings, with premium based on location and products and completed operations coverage. The actual business was new single-family housing construction, for which premium is typically more appropriately based on payroll. The rate based on payroll was more than 50 times what the customer was being charged.
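The scale of the understatement in that first example is simple arithmetic. A minimal sketch, using only the $134 charged and the roughly $7,000 appropriate premium cited above (all other numbers here are just the ratio they imply):

```python
# Illustrative check of the premium gap in the first example.
# The $134 and ~$7,000 figures come from the article; the ratio
# below is arithmetic on those two numbers, nothing more.
charged_premium = 134.0
appropriate_premium = 7000.0

ratio = appropriate_premium / charged_premium
print(f"Premium understated by a factor of about {ratio:.0f}x")  # about 52x
```

That ratio, a little over 52, is the "more than 50 times" gap the payroll-based rating revealed.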

The discrepancy in the commercial auto account was discovered through a review of vehicle sightings—some 69% were outside the classified radius. Beyond this specific instance, a recent Verisk study found that 23% of vehicles rated as local are actually operating beyond their 50-mile radius.
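The kind of sightings review described above can be approximated with a great-circle distance check against the garaging location. A minimal sketch, assuming latitude/longitude sightings data; the coordinates below are rough approximations of Raleigh, Durham, and Charlotte chosen for illustration, and the 50-mile threshold mirrors the rated radius in the example:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def miles_between(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in statute miles.
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def share_outside_radius(garage, sightings, radius_miles=50.0):
    # Fraction of vehicle sightings beyond the rated radius
    # from the garaging location.
    outside = sum(1 for s in sightings if miles_between(*garage, *s) > radius_miles)
    return outside / len(sightings)

# Approximate coordinates: Raleigh garaging; sightings near Durham,
# Charlotte, and Raleigh itself (illustrative data, not the real fleet).
garage = (35.78, -78.64)
sightings = [(35.99, -78.90), (35.23, -80.84), (35.80, -78.60)]
print(f"{share_outside_radius(garage, sightings):.0%} of sightings outside 50 miles")
```

With this toy data, the Charlotte sighting (roughly 130 miles out) is flagged while the two local ones are not; the production analysis behind the 69% figure would run the same test over millions of telematics sightings.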

Whether they like it or not, agents are very often on the front lines in the fight against premium leakage. An agency that does right by collecting the appropriate information for the insurers it represents can strengthen relationships with a growing bond of trust.

Biggest holes

Preventing premium leakage in commercial insurance starts with knowing where the vulnerabilities are—which lines are most prone to errors and which rating criteria have the biggest impact on premium and the greatest likelihood of being captured incorrectly.

  • In property, construction class and ISO’s Public Protection Classification (PPC®) are the criteria most likely to be incorrect, and many insurers are prone to missing the mark on insurance to value.
  • In casualty, NAICS codes can often be wrong, as seen in the example above, and figures for number of employees and revenue can be off.
  • Auto is also prone to NAICS coding and territory errors, and the actual operating radius may be greater than what’s identified for rating purposes.
  • Many medical professional liability premiums may be based on the wrong specialty or territory, and the number of patient interactions is often miscalculated.
  • In cyber, NAICS codes again tend to be problematic, as does the presence of payment card information, and it can be hard to get an accurate count of records.

How the leaks start

Premium leakage has many possible causes. The following is a partial listing.

Uninformed business owners and agents. Business owners and agents want fast responses from their insurers, very often with minimal input. In a 2017 survey, 33% of agents said speed was the leading factor in selecting a preferred insurer. In such an environment, it can be hard to spend time finding answers to insurers’ questions about items such as PPC, construction classification, or a 6-digit NAICS code.

Data entry errors. A typical customer workflow has multiple manual steps that create ample opportunity to introduce mistakes. The agent or customer service representative very often manually captures information from the applicant and keys the data into the agency management system. The agency often then emails a PDF to the insurer, where an underwriting assistant likely rekeys the data into the policy administration system. User-friendly agent portals, maximized use of prefill data, limits on free text in forms, and in-app validation of rating factors can go a long way in minimizing errors. Future solutions may incorporate machine learning for optical character recognition to further reduce mistakes.
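The in-app validation mentioned above can be as simple as rejecting malformed rating factors at the point of entry. A minimal sketch, where the field names, the allowed radius classes, and the validation rules are all hypothetical, not any particular carrier's schema:

```python
import re

# Hypothetical validation of two common rating factors.
# Field names and rules are illustrative only.
NAICS_PATTERN = re.compile(r"\d{6}")
VALID_RADIUS_CLASSES = {"local", "intermediate", "long_distance"}

def validate_rating_factors(fields):
    """Return a list of error messages; an empty list means the input passes."""
    errors = []
    naics = fields.get("naics_code", "")
    if not NAICS_PATTERN.fullmatch(naics):
        errors.append(f"NAICS code must be exactly 6 digits, got {naics!r}")
    radius = fields.get("radius_class", "")
    if radius not in VALID_RADIUS_CLASSES:
        errors.append(f"Unknown radius class {radius!r}")
    return errors

print(validate_rating_factors({"naics_code": "236115", "radius_class": "local"}))
print(validate_rating_factors({"naics_code": "23611", "radius_class": "Local"}))
```

The first call passes; the second is rejected on both fields, a five-digit code and a mis-cased radius class, before the application ever reaches an underwriting assistant's rekeying step.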

Rate evasion and fraud. This can be a learned behavior based on weaknesses in insurers’ systems. Also, savvy insureds often know the most expensive parts of their policies and where they can save the most money by manipulating reported rating variables. Agents should expect vigilant insurers to monitor them regularly, to watch incoming business—especially when rates are changing—and to act when incoming mixes of business quotes and binds deviate from expectations.

Risk segmentation. This is where details such as appropriate NAICS codes become critical. Beyond that, much of the burden falls on the insurer to use granular data for the best understanding of the exposure, apply the data effectively for risk selection and pricing, and use the emerging power of predictive analytics to its greatest advantage. Ideally, this happens without placing demands on agents for more input that bogs down the process.

Unverified data. This issue turns on the availability of information such as radius, territory, NAICS, construction class, ITV data, PPC, employee count, and revenue. Time is another critical factor: Most insureds want to get through the application process quickly and get back to making money. This means that a fast, easy, trusted process wins the day for agents. The wrong data strategy can breed frustration and undermine trust for agents and their clients if uprating occurs after bind.

A mix of human insight and digital intelligence can help insurers monitor incoming business for signs of potential rate evasion or other forms of leakage. Predictive analytics at the insurer level can help flag indicators of bad information, whether it’s introduced by unintended error or fraudulent intent. The agent or broker can be an important gatekeeper at the outset of the client relationship in gathering and verifying critical data. The right information helps the insurer apply the right level of segmentation, which means a better chance of bringing in quality business that the insurer will likely renew.

The authors

Rick Stoll is vice president of commercial underwriting products for ISO, a Verisk (Nasdaq:VRSK) business. Diane Injic is director of commercial auto underwriting for ISO.
