
Data analysis: step 1

Sometimes you hear a comment that crystallises many trends converging at once. We think we heard such a comment when, at the AM Best annual conference in Arizona in March, Hamilton Insurance chairman and CEO Brian Duperreault said data analysis could be either a threat or an opportunity for brokers, in a world where advanced analytics increasingly contribute to the intermediary’s value.

As to whether the broker could actually bring such a level of value, he said: “I think the answer is a qualified yes—if the broker or agent brings a level of expertise and counsel that far surpasses what the carrier offers or the client can determine himself. This means setting the gold standard for manipulating and interpreting data.”

There are several implications to his comments. Duperreault clearly notes that insurers and their clients are rapidly improving their ability to analyse their own data. This means that the insurance buyer is aiming to understand his own data better before going to the market. And anyone reading the industry trade press is aware of the herculean efforts insurers are currently making to better understand the insured’s data once they receive it.

What Duperreault is telling the brokers is that if they want to continue to bring value to a transaction they need to be able to understand, interpret and manipulate that data better than the owner of the data and better than the insurer who is setting rates based on that data.

This is a high bar for an intermediary to meet. It explains the vast sums invested in data analytics by the broker community. By any measure of effort, Duperreault’s message, it would seem, has already been received by the broker industry.

As all of us know, there’s a big difference between ‘getting the message’ and actually doing something about it. In this case well-capitalised insurers with armies of actuaries and analysts have long since jumped into the race with both feet. They have been joined by players further down the risk chain, including larger managing general agents (MGAs) and the risk management units of large commercial insurance purchasers. Everyone is seeking a better understanding of the risks represented by the data.

Getting granular

At CATEX we began to hear this data drumbeat several years ago. Our Pivot Point System software provides transaction and claim detail down to the lowest granular risk level. We had assumed that underwriters and analysts would be thrilled with that level of extreme granularity so that virtually ‘atomic’ accounting and underwriting could be applied to each specific risk processed through the system.

They were, in fact, delighted. But it turned out that clients generally did not have the granular risk data to load into our systems. All too often our systems were presented with only aggregate risk data, with the explanation that the client simply had no more detailed information.
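To make the distinction concrete, here is a minimal sketch in Python, using invented field names rather than CATEX’s actual Pivot Point data model. It illustrates why granularity matters: per-risk records can always be rolled up into the aggregates a client might supply, but once only the aggregates exist, the per-risk (‘atomic’) view cannot be reconstructed.

```python
# Hypothetical illustration only -- not CATEX's actual schema.
# Granular, per-risk records can be aggregated; aggregates cannot
# be broken back down into per-risk detail.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class RiskRecord:                     # one "atomic" risk
    risk_id: str
    location: str
    peril: str
    insured_value: float
    premium: float

granular = [
    RiskRecord("R-001", "FL", "wind",  1_200_000, 18_000),
    RiskRecord("R-002", "FL", "wind",    800_000, 11_500),
    RiskRecord("R-003", "TX", "flood", 2_500_000, 22_000),
]

# Roll up to the aggregate view a client might present instead.
aggregate = defaultdict(lambda: {"insured_value": 0.0, "premium": 0.0})
for r in granular:
    key = (r.location, r.peril)
    aggregate[key]["insured_value"] += r.insured_value
    aggregate[key]["premium"] += r.premium

for key, totals in aggregate.items():
    print(key, totals)
# ('FL', 'wind')  {'insured_value': 2000000.0, 'premium': 29500.0}
# ('TX', 'flood') {'insured_value': 2500000.0, 'premium': 22000.0}
# With only these totals, per-risk underwriting and accounting are
# no longer possible -- the detail has been lost upstream.
```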
