Tackling loss history challenges in SMB workers' comp

It’s time to put a stop to underwriters ‘flying blind’


By Bethan Moorcraft

Small businesses are the lifeblood of the US economy. According to the US Small Business Administration, there are approximately 30.2 million small businesses operating across the nation, employing 47.5% of US employees. It’s a busy and bustling market for workers’ compensation insurance carriers to tap into, but there are a few roadblocks that keep getting in the way.

One pain point is cost. Many small business prospects are new or immature ventures with no prior loss history. Without that claims data, underwriters have to put more time and effort into risk selection and discretionary premium allocation – extra work that carriers often deem too costly to justify on low-margin small business accounts.

To get around that, many insurers have invested in automation and technology to help them evaluate small commercial policies. Solutions like ‘straight-through processing’ and ‘real-time quoting’ are now household names in the insurance industry, especially among companies chasing the efficiencies the small commercial marketplace demands. They’re using data and analytics to quickly and accurately price small commercial policies, then distributing them through the on-demand experience that both insurance agents and policyholders now expect.

But these data and analytics tools only work if carriers have access to the right data, which is not always possible. To address that gap, Insurity company Valen Analytics has unveiled a new ‘Unavailable Loss History Model’ for workers’ compensation – a predictive model that can accurately price and assess risk when no loss history is available. The model uses a combination of third-party and synthetic variables derived from Valen’s extensive workers’ compensation data consortium – a granular data set of approximately 650,000 policies representing $7.6 billion in premium – to indicate risk quality based on how similar policies have performed in the past.
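To make the idea concrete, here is a minimal sketch of pricing without loss history: signals available at quote time (industry class, payroll, firmographics) stand in for the missing claims record. Every class rate, weight, and variable name below is hypothetical and for illustration only; it is not Valen’s model.

```python
# Illustrative toy only: pricing a policy with no prior loss history.
# Class rates and debit weights are invented for this sketch.

CLASS_BASE_RATE = {          # hypothetical rate per $100 of payroll, by class code
    "8810": 0.20,            # clerical office
    "5645": 9.50,            # residential carpentry
}

def risk_multiplier(years_in_business: float, industry_hazard: float) -> float:
    """Blend third-party signals into a credit/debit factor.

    Newer firms and higher-hazard industries receive a debit; both
    weights (0.05 and 0.10) are made-up illustration values.
    """
    newness_debit = max(0.0, 3.0 - years_in_business) * 0.05
    hazard_debit = industry_hazard * 0.10
    return 1.0 + newness_debit + hazard_debit

def indicated_premium(class_code: str, payroll: float,
                      years_in_business: float, industry_hazard: float) -> float:
    """Manual premium (rate x payroll / 100) adjusted by the risk multiplier."""
    manual = CLASS_BASE_RATE[class_code] * payroll / 100.0
    return manual * risk_multiplier(years_in_business, industry_hazard)

# A two-year-old carpentry shop, $500k payroll, moderate hazard score:
premium = indicated_premium("5645", 500_000, 2.0, 0.5)
```

In a real model the multiplier would come from a fitted statistical model over many more variables, but the shape is the same: quote-time signals substitute for the absent loss history.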

“Being able to deliver valuable insights despite the absence of a key piece of information like loss history is a true testament to the power of partnering around data,” said Kirstin Marr, president of Valen Analytics. “As insurers pursue the small commercial market, the ability to accurately assess risk while collecting less information on the application is crucial to gaining market share.

“We think our new Unavailable Loss History Model will help our carrier partners get into the small commercial market. The reality is, there’s no easy way to get into this market and to spend an enormous amount of time manually evaluating a policy like they would for medium- or large-sized accounts – they simply can’t make money as a business doing that. So, they have to invest in more advanced, automated data solutions that enable them to assess the risk, validate the business, price it accurately, and feel confident that they’re managing the risks of their own growing portfolios.”

Typically, carriers looking to price accounts without prior loss history would run the account through a loss history model and then make an educated guess based on its output. They’re not completely flying blind, but they’re not producing an optimized result either, so the reliability of those results varies. What ends up happening is that some policies that are actually good risks don’t get the rate they should, while slightly riskier ones get better terms than they deserve.

“We’ve been able to amass a number of policies, which, at the time they were written didn’t have a loss history, but because we’ve had our data consortium maturing for so long, those policies now have a loss history,” explained Marr. “That allows us to go back and figure out how to predict for policies that are coming in today without that loss history. It’s the size, breadth and depth of our data consortium that allows us to do that.”     
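The mechanism Marr describes can be sketched as a labeling trick: policies that had no loss history when written, but whose losses have since matured in the consortium, become training examples whose features are restricted to what was knowable at quote time. The field names and the simple nearest-neighbour scoring below are my own illustration under that assumption, not Valen’s actual method.

```python
# Toy sketch of the label-backfill idea: matured consortium policies supply
# the outcomes; only quote-time signals are used as features.
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class Policy:
    industry_hazard: float              # third-party signal, known at quote time
    years_in_business: float            # known at quote time
    matured_loss_ratio: Optional[float] # known only after the policy ages

def build_training_set(consortium):
    """Keep only policies whose losses have matured, pairing quote-time
    features with the now-known loss ratio as the label."""
    return [([p.industry_hazard, p.years_in_business], p.matured_loss_ratio)
            for p in consortium if p.matured_loss_ratio is not None]

def predict_loss_ratio(train, features, k=2):
    """Score a new no-history policy by how the k most similar matured
    policies performed (a crude stand-in for a fitted model)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda t: dist(t[0], features))[:k]
    return mean(label for _, label in nearest)

consortium = [
    Policy(0.2, 10.0, 0.4),
    Policy(0.8, 1.0, 0.9),
    Policy(0.3, 8.0, 0.5),
    Policy(0.9, 2.0, None),   # not yet matured: excluded from training
]
train = build_training_set(consortium)
pred = predict_loss_ratio(train, [0.85, 1.5])
```

The point of the sketch is the data flow, not the estimator: the consortium’s age is what turns once-unlabeled policies into labeled training data.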
