The Impact of Big Data on Insurance
Updated: Jun 16, 2020
The use of data in insurance is still relatively limited. There are several high-profile examples, but these are isolated. The immediate future for the industry will be defined by a data race, with insurers trying to work out how best to analyze and use data to avoid being left behind by competitors. There is a strong argument that the biggest challenge facing insurers in the whole insurtech revolution is not coming from start-ups, tech companies, or expensive system upgrades but simply inactivity and allowing competitors to edge ahead. While incumbents have struggled to adapt, it is only a matter of time before some big players gain a competitive advantage by doing so.
Regulation remains a difficult obstacle for Big Data in insurance
From an insurer’s point of view, the biggest issue regarding customer data is exactly where it can be used and what can be done with it. The general rule is that an insurer can do what it pleases as long as it has customer consent, although that is not always best practice. Consent is required when the data is individualized; aggregated or anonymized data falls outside these requirements. The most significant regulatory development is GDPR, which has been implemented across the EU. The regulation was adopted in April 2016 and came into force in May 2018 following a two-year transition period. It requires that customers be able to dispute automated decisions made on an algorithmic basis. Consent rules have been tightened: consent must be explicit for both the collection and use of data, with parental consent needed for children below the age of 16 (member states may lower this threshold to 13). In the event of a cyberattack or data leak, insurers are legally required to disclose the breach and can face substantial fines.
Big Data infrastructure and interoperability
The sophistication and reliability of predictive models depend on the richness and quality of the data. Vendor positioning in this space can do much to help incumbent carriers ensure their analytics models are optimized – particularly to support the increasing range of business partners contributing to a carrier’s data platform. A critique of the relevance and value of historical data sitting across disparate legacy systems is therefore paramount. Culturally, incumbents can lean on partners to refine subsets of customer data, and collaboration on best practice can accelerate the rate at which providers progress along the predictive route. With the sector now geared towards IoT, considerable opportunities as well as considerable challenges are emerging. Hence, several fundamentals need to be addressed:
Masses of internal data – such as claims files, policy slips, legal reports, and loss adjusters’ reports – are still held in paper and PDF form.
Digitization of these documents into valuable data remains a top priority among traditional insurers as they look to emulate insurtechs. Vendors are critical for the provision of digital document management systems and text mining tools.
Filtering external data so that useful information supporting tactical insight is highlighted. As in other vertical sectors, identifying and incorporating meaningful data has proven problematic for insurers.
Meaningful use dictates robust data governance, which will increasingly involve regulatory compliance. Insurers spend too much time on data preparation, and the quality and consistency of the data extracted from different systems varies. Extracting data from policy administration and claims systems, then merging the different sources while building the predictive model(s), is a very time-consuming process.
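As an illustration of the extraction-and-merge step described above, the sketch below joins hypothetical policy administration and claims extracts into a single modeling table. All column names and values are invented for illustration; real extracts would require far more cleaning and validation:

```python
import pandas as pd

# Hypothetical extract from a policy administration system
policies = pd.DataFrame({
    "policy_id": ["P001", "P002", "P003"],
    "annual_premium": [820.0, 1150.0, 640.0],
    "region": ["north", "south", "north"],
})

# Hypothetical extract from a claims system (one row per claim)
claims = pd.DataFrame({
    "policy_id": ["P001", "P001", "P003"],
    "claim_amount": [1200.0, 300.0, 450.0],
})

# Aggregate claims to one row per policy before merging
claim_totals = claims.groupby("policy_id", as_index=False).agg(
    total_claims=("claim_amount", "sum"),
    claim_count=("claim_amount", "count"),
)

# Left join keeps claim-free policies; fill their totals with zero
model_table = policies.merge(claim_totals, on="policy_id", how="left")
model_table[["total_claims", "claim_count"]] = (
    model_table[["total_claims", "claim_count"]].fillna(0)
)

# Simple loss-ratio feature for a predictive model
model_table["loss_ratio"] = (
    model_table["total_claims"] / model_table["annual_premium"]
)
```

Even in this toy case, the join logic (aggregating before merging, handling policies with no claims) is where inconsistencies between source systems typically surface.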
The industry is suffering from personnel shortages
As with other sectors, insurance is suffering from a dearth of IT talent. Data scientists and analysts are in even shorter supply, and line-of-business experience is rare. Predictive analytics vendors should therefore bear in mind that their solutions do not necessarily fully support carriers’ strategic planning, and should encourage carriers to work with developers to train algorithms so that the delivered outcomes can be interpreted to support line-of-business planning.
Commitment from the top needs improvement
We have already touched upon the importance of data analytics skills across insurance. Scaling high-value predictive algorithms requires trusting predictive analytics platforms and tools: understanding how they work, using them consistently, and integrating them into workflows. Frontline decision-makers such as the underwriter, marketing manager, or claims adjuster should utilize predictive analytics to help unleash their business value. But getting decision-makers to use predictive analytics tools has proven difficult due to a lack of trust in ML algorithms.

Predictive analytics is a catalyst for cultural change within an organization and must be pushed by leadership. Day-to-day, continuous education of employees on the importance of fully integrating predictive analytics tools is equally vital. Despite these challenges, some companies are embarking along this route, and there is a growing aspiration to invest in predictive analytics over the coming years. If investment is to be unlocked, vendors should help IT teams build a business case and work to cultivate stronger appreciation among the C-suite. It is not surprising that the early adopter incumbents are those with strong leadership.

The key reasons CEOs and CFOs hesitate to embrace predictive analytics are mainly perceptions around time and money: they are concerned about the time and cost of implementing predictive modeling. It is therefore important for vendors to build contextual solutions and to communicate the value of incorporating predictive analytics in ways that will secure buy-in. This requires the ability to translate IT goals into language the broader business can understand. Implementing a predictive analytics solution is just one piece of the puzzle – insurers will have to incorporate predictive analytics into a broader strategy with measurable goals to substantially outperform the market.
Insurers demand contextual solutions
The insurance sector lags far behind verticals such as retail and manufacturing, where predictive analytics is a core business function. Insurance carriers have undoubtedly become more receptive to the potential offered by predictive analytics, but life insurers in particular lag in how they plan and implement it, suffering from a shortage of predictive analytics skills, data silos, and legacy infrastructure that dilutes transformation initiatives and deters investment. Contextualized solutions tailored towards cost savings and shorter implementation lifecycles will see increasing demand.
Given that predictive analytics cuts across the entire insurance value chain and offers new tools to understand risk and sustain profitability, vendors and their partners can base their propositions on a variety of approaches:
Data digitization solutions help traditional insurers digitize and extract insights from paper and handwritten data residing in insurer or broker systems. Digital document management systems, text mining, and manual re-keying of specific data items into structured formats are crucial for building reliable predictive models.
Managed services are increasingly in demand yet remain under-resourced, primarily as insurance carriers develop their own data ecosystems. Data-as-a-service vendors will play an important role in providing insurers with filtered and sliced data from the web and IoT. This data feeds into predictive models and can provide a stronger understanding of risk profiling, while enhancing customer engagement through targeted marketing campaigns.
Cloud-based predictive analytics is gaining traction, particularly among SME carriers, which are arguably more appreciative of the pay-as-you-go model that accompanies the cloud. Clinging to onsite but disconnected systems makes it difficult to lower the total cost of ownership and develop analytical models collaboratively while sharing insights.
There is high demand for contributory data and innovative modeling, supported by a “forward and backward looking” approach that blends actuarial models, science advancements, financial demographics, and government data. Vendors must shore up their solutions with standardized industry databases and sophisticated tools that allow analysts and underwriters to analyze policies, segments, and entire portfolios against a common industry database.
Non-technical predictive analytics are mandatory capabilities across insurers’ business units, and hence require solutions that can support non-analytics experts, such as marketing managers. User interfaces with interactive visualization capabilities will support self-service, enabling collateral design and generating wider buy-in.
Embedded predictive analytics solutions: most insurance systems work in isolation, complicating any effort to make meaningful use of existing data sets to better support customer insights and profile building. Consequently, there is an emerging push to embed analytics capabilities within core systems such as underwriting and claims platforms. This in turn will accelerate the rate at which underwriters and adjusters can respond to high volumes of claims and settlements.
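A minimal sketch of what embedding a scoring step inside a claims workflow might look like. The fields and rule thresholds here are invented for illustration, and a production system would call a trained model rather than hand-written rules; the point is that the score is computed and acted on inside the claims process itself rather than in a separate analytics tool:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    amount: float
    prior_claims: int          # claims by the same policyholder in recent years
    days_since_inception: int  # age of the policy when the claim was filed

def triage_score(claim: Claim) -> float:
    """Toy risk score in [0, 1]; higher means more adjuster scrutiny."""
    score = 0.0
    if claim.amount > 10_000:
        score += 0.4
    if claim.prior_claims >= 3:
        score += 0.3
    if claim.days_since_inception < 30:  # claim on a very new policy
        score += 0.3
    return score

def route(claim: Claim) -> str:
    """Embedded in the claims system: fast-track low scores, escalate the rest."""
    return "fast_track" if triage_score(claim) < 0.5 else "adjuster_review"
```

A small, routine claim on a mature policy would be fast-tracked automatically, while a large claim on a new policy from a frequent claimant would be routed to an adjuster, which is the kind of acceleration the embedded approach aims for.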