By Kirstin Marr, President, Valen Analytics
An increasing number of customers expect tech-forward solutions to age-old problems, and one of those problems is the complexity of insurance. This shift in customer expectations is the driving force behind the rise of insurtechs. While personal lines have already seen advanced solutions play out, with claim submission as easy as sharing a moment on Snapchat, workers’ compensation insurers are still figuring out how to deliver smart solutions at a quicker pace, and as a result they are relying increasingly on data and analytics.
Today, insurers have access to a wide array of data sources that were unimaginable only 20 years ago, from wearables to drones and social media. As these sources mature, they promise even better insight for underwriters and adjusters. Armed with this data, insurers are seeking to improve efficiency and make better, faster decisions. However, insurers continue to struggle to determine which data sources will help them answer real-world questions, a burden even without the added pressure of a market that is already highly competitive on rates.
Newer Data Sources are Invaluable for Workers’ Comp
Workers’ compensation insurers base their calculations on a handful of factors related to each incident. Individual data points such as injury type, age and relevant past health issues of the claimant factor into how claims adjusters proceed.
Today, insurers have access to a rapidly growing number of data sources about factors that might influence the outcome of a claim. New sources provide real-time feeds of payroll and financial transactions for commercial businesses, and it’s now possible to obtain much more granular updates on legal cases and settlements as well. New types of data such as social media, wearables and property sensors are gaining a foothold, and with them data is being elevated to a strategic priority within insurers’ C-suites.
With so many options, insurers are tasked with separating what’s real from what’s mostly hype. As a result, the problem of identifying the right sources must be approached with clear business questions in mind. For example, when a new source is incorporated into existing decision-making flows, will it yield actionable insights? How much will the incorporation of these sources improve performance?
This can be viewed from a process enhancement standpoint or one of model enhancement. The former looks for variables that can inform end users at critical decision points, while the latter is aimed at identifying variables that produce additional lift in an existing model. In either case, the ongoing availability and reliability of the data should be considered to ensure insights can be delivered in a timely and accurate manner. And finally, ask whether the cost of acquiring, analyzing and maintaining data from the new source will be reasonable over the long term.
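To make the "model enhancement" question concrete, here is a minimal sketch of measuring lift from a candidate variable: score a small book of policies with and without the new signal and compare AUC. The policies, scores, outcomes and the naive blend standing in for retraining are all illustrative assumptions, not Valen's actual data or methodology.

```python
# Sketch: quantifying "model lift" from a candidate data source.
# All figures below are invented for illustration.

def auc(scores, labels):
    """Area under the ROC curve via pairwise comparison (Mann-Whitney U)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Tiny synthetic book: 1 = claim exceeded reserve, 0 = did not.
labels = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0]

# Baseline score from existing rating variables (e.g. class code, payroll).
baseline = [0.8, 0.4, 0.6, 0.5, 0.3, 0.7, 0.6, 0.3, 0.2, 0.4]

# Candidate signal from a new source (e.g. a real-time payroll feed).
new_var = [0.9, 0.2, 0.8, 0.3, 0.1, 0.6, 0.3, 0.9, 0.2, 0.3]

# Naive 50/50 blend standing in for retraining the model with the new variable.
blended = [0.5 * b + 0.5 * n for b, n in zip(baseline, new_var)]

lift = auc(blended, labels) - auc(baseline, labels)
print(f"baseline AUC: {auc(baseline, labels):.3f}")
print(f"blended  AUC: {auc(blended, labels):.3f}")
print(f"lift: {lift:+.3f}")
```

A positive lift on held-out data is the kind of evidence that justifies the ongoing cost of acquiring and maintaining the new source; a negligible one suggests the variable is noise for this use case.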
Data Volume’s Role in Boosting Predictive Results
A high volume of data opens new avenues for insurers to predict how similar policies will perform, even when critical data is missing.
An applicant’s loss experience has always been one of the most important factors in assessing risk, but as the industry moves toward the automation of underwriting at scale, companies must find ways to accurately evaluate policies without this information. Failure to do so could mean missing out on the underserved and highly coveted small commercial market. By analyzing hundreds of thousands of policies that share similar characteristics, advanced data analytics can better inform pricing and risk selection decisions in new and untested geographies.
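One way to read the pooling idea above: when an applicant has no usable loss history, an insurer can fall back on the average experience of policies that share its profile. A minimal sketch, with invented class codes, payroll bands and loss rates:

```python
# Sketch: scoring an applicant with no loss history by pooling "similar"
# policies. Matching keys and all figures are illustrative assumptions.

# Historical policies: (class_code, payroll_band, losses_per_$100_payroll).
book = [
    ("8810", "small", 0.4), ("8810", "small", 0.6), ("8810", "large", 0.3),
    ("5645", "small", 2.1), ("5645", "small", 1.9), ("5645", "large", 1.5),
]

def expected_loss_rate(class_code, payroll_band):
    """Average loss rate over policies sharing the applicant's profile."""
    peers = [r for c, b, r in book if c == class_code and b == payroll_band]
    if not peers:  # no exact match: widen the pool to the class code alone
        peers = [r for c, _, r in book if c == class_code]
    return sum(peers) / len(peers)

# A small carpentry shop with no loss runs still gets a data-driven estimate.
print(expected_loss_rate("5645", "small"))
```

The same widening logic is what lets a large pool support pricing in new and untested geographies: when an exact peer group is empty, the estimate falls back to a broader but still relevant segment rather than no answer at all.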
A large pool of information enables insurers to transform datasets into highly predictive compound variables through repeated, iterative testing. According to a recent Valen study, the synthetic variables that come from consortium data can offer up to 13 times the predictive power of a single insurer’s policy data. By partnering around data, insurers build a shared understanding of the market and its customers.
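As a hedged illustration of what a compound (synthetic) variable built on pooled data can look like, consider relating an applicant's observed claim frequency to a benchmark computed across the consortium. Every class code and frequency below is invented:

```python
# Sketch: a compound variable derived from pooled (consortium) data.
# Class codes and frequencies are illustrative only.
from collections import defaultdict

# Pooled policies across several carriers: (class_code, claim_frequency).
consortium = [
    ("8810", 0.02), ("8810", 0.03), ("8810", 0.025),  # clerical
    ("5645", 0.15), ("5645", 0.18), ("5645", 0.12),   # carpentry
    ("7219", 0.10), ("7219", 0.08),                   # trucking
]

# Benchmark: average claim frequency per class code across the whole pool.
by_code = defaultdict(list)
for code, freq in consortium:
    by_code[code].append(freq)
benchmark = {code: sum(f) / len(f) for code, f in by_code.items()}

def relative_frequency(code, observed_freq):
    """Compound variable: applicant frequency vs. the pooled benchmark.
    Above 1.0 means worse than the pool; below 1.0 means better."""
    return observed_freq / benchmark[code]

# A carpentry applicant with 0.10 observed frequency looks favorable
# against the pooled carpentry benchmark.
print(relative_frequency("5645", 0.10))
```

No single carrier's book would make the per-class benchmarks credible on its own, which is the intuition behind the outsized predictive power attributed to consortium-derived variables.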
The abundance of data and advanced analytics provide insurers with the opportunity to paint a dynamic, real-time view of risk. Staying ahead of the curve in workers’ compensation involves more than implementing the newest, shiniest tool. If data does not inform a specific business case or have a protocol for how it will be regularly updated and maintained over time, it’s just noise. A disciplined evaluation of third-party data is key as insurers figure out which emerging sources help answer real-world business questions and drive decision-making.
About Kirstin Marr
Kirstin Marr is the president of Valen Analytics, an Insurity company and provider of proprietary data, analytics and predictive modeling for property and casualty insurers. Prior to her role as president, Kirstin was the chief marketing officer of the company and one of the pioneers of the Insurance Careers Movement coalition, a grassroots initiative of more than 1,000 insurance organizations raising awareness of what insurance has to offer Millennials. Before Valen, Kirstin ran B2B marketing for internet technology pioneer and market leader ServiceMagic.com (now HomeAdvisor). Kirstin has a passion for building companies that invent leading-edge technologies to improve customers’ lives and solve the inefficiencies that exist in traditional marketplaces.