How AI and big data sources can transform underwriting


Leandro DalleMule, general manager, North America, Planck

A panel discussion, part of Intelligent Insurer’s Underwriting Innovation USA virtual conference, will address how to harness new sources of external data and integrate them into underwriting to support better decision-making. Panellist Leandro DalleMule, general manager, North America for Planck, outlines what is in store.

Leandro DalleMule, general manager, North America for Planck, an AI-based data platform for insurance underwriting, is taking part in the online Intelligent Insurer Underwriting Innovation USA conference running from November 10 to 12, and will be a panellist in a discussion titled “Harness New Sources of External Data and Integrate into Underwriting to Support Better Decision-Making”.

Artificial intelligence (AI) and new sources of data have game-changing potential for insurers, driving underwriting profitability, reducing costs and speeding up processes.

Here he explains what delegates can expect from the session.

What are the main topics you aim to address in the discussion?
I will offer insights into how insurers can leverage big data and AI to greatly improve their processes and ultimately reduce expense and loss ratios, and even grow gross written premium at the same time.

I can offer very specific examples of how it is done today, challenges, results and required steps to turn these technologies into successful business engagements.

Why are these topics especially relevant today?
Data has always been important to insurance—I would argue it’s the most important aspect of the industry. Nowadays, data and technology have achieved a level of sophistication that allows carriers to be much more profitable than before.

Many new companies, insurance startups as well as some well-established carriers, are already taking advantage of tech and data to get ahead of their competitors. It is just a matter of time before this becomes the norm. Those who wait will probably not last much longer in the market.

Tell us about Planck and how its offerings fit with these topics.
Planck is a leading AI platform built to enable commercial insurers to instantly and accurately underwrite any business. Planck’s platform aggregates and mines massive datasets, using the latest advances in AI to automatically generate and deliver key insights customised to all relevant commercial underwriting processes.

The result is a frictionless underwriting process with greater insurer visibility into risk factors, leading to improved new business conversion and retention rates and lower loss ratios.

Planck’s platform brings automation and intelligence to the underwriting process—empowering commercial insurers to focus on underwriting that truly requires human expertise.

How well has the industry embraced new sources of external data, and how could it engage better with these?
Very well, although the innovation pace of insurance is not exactly fast. These are proven technologies now and there is no better way to engage than to get started as soon as possible, defining a couple of use cases, such as underwriting support or application pre-filling, and running with them.

The risk is quite low to none, and yet we see many insurers leaving money on the table, lots of it.

What are the opportunities the insurance industry now has when it comes to new sources of external data?
Insurance is all about analysing data to assess risks. But there is a flaw in this methodology, especially in commercial insurance. The data used to underwrite clients is mostly generated and sent to insurance companies by agents, who in turn have their own incentives and priorities. This has created a situation where the data that insurers receive is neither very accurate nor complete.

In order to solve this huge pain point, Planck created an AI platform that is able to generate commercial underwriting insights in real time, from public data sources such as the open web. This is not about simply finding a needle in a haystack.

The data insurers are looking for is not clearly visible on the web (only about 12 percent of it is plainly visible and can be returned by aggregating many different sources). In order to return the relevant, accurate data needed, you have to use AI models to create it: creating gold out of the hay.

For example, think about a bar’s percentage of liquor sales out of its total revenue. In order to create this data point, we use deep learning models that have been trained on audited datasets, converting intermediate insights into the requested insight.

Those intermediate insights (for example, the number of people standing vs. the number sitting, the intensity of the light, the number of beer bottles and wine glasses compared to the number of dishes and plates) are created using machine learning techniques as well.
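The two-stage approach described above can be sketched in a few lines of Python. This is a toy illustration only: Planck's production system uses proprietary deep learning models trained on audited datasets, whereas the feature names, weights, and clamping logic below are entirely hypothetical, chosen simply to show how intermediate signals might be aggregated into a single underwriting insight.

```python
# Hypothetical sketch: combining machine-generated intermediate insights
# into a final underwriting insight (a bar's liquor share of revenue).
# All feature names and weights are illustrative, not Planck's actual model.

from dataclasses import dataclass

@dataclass
class IntermediateInsights:
    standing_ratio: float    # fraction of detected people standing vs. sitting
    light_intensity: float   # 0.0 (dim) to 1.0 (bright)
    drinks_to_dishes: float  # beer bottles + wine glasses per dish/plate seen

def estimate_liquor_share(x: IntermediateInsights) -> float:
    """Aggregate intermediate signals into an estimated liquor-sales share.

    A trained model would learn these weights from audited data; the
    hand-picked weights here only illustrate the aggregation step.
    """
    score = (0.4 * x.standing_ratio
             + 0.3 * (1.0 - x.light_intensity)    # dim venues skew toward bars
             + 0.3 * min(x.drinks_to_dishes, 1.0))
    return max(0.0, min(1.0, score))              # clamp to a valid share

# A dim, standing-room venue with many drinks per dish scores high;
# a bright, seated venue with few drinks scores low.
bar = IntermediateInsights(standing_ratio=0.8, light_intensity=0.2, drinks_to_dishes=0.9)
diner = IntermediateInsights(standing_ratio=0.1, light_intensity=0.9, drinks_to_dishes=0.2)
print(round(estimate_liquor_share(bar), 2))    # → 0.83
print(round(estimate_liquor_share(diner), 2))  # → 0.13
```

The point of the sketch is the pipeline shape, not the arithmetic: computer-vision models first produce the intermediate signals, and a second model, trained against ground-truth audited figures, maps those signals to the data point the underwriter actually needs.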

Do these new data sources bring any challenges?
Most of the challenges we have seen are related to change management, not the technology at all. Successful executions, as with most initiatives, start with senior-level commitment and sponsorship.

What do you hope attendees will get out of your panel discussion?
We hope attendees leave with a very clear understanding of how AI and big data are already transforming the industry.

For those already working with such technologies: great, but we hope they will leave understanding their full potential.

For those who are not yet working with it, I hope they leave with a renewed sense of urgency and understand that these are well-proven solutions their competitors are already using. The time to act is now.

Leandro DalleMule is one of the speakers at Intelligent Insurer’s Underwriting Innovation USA, a virtual event addressing how to accelerate underwriting profitability in a world of changing risks, taking place between November 10 and November 12, 2020.

