There is an old adage in the insurance business that any risk can be profitable provided it is appropriately priced. This underscores the critical role pricing plays in profitability and market position. Price too low and fall victim to adverse selection. Price too high and lose revenue to competitors.
Given the internal business buy-in and external regulatory hurdles inherent in refreshing rate filings, speed to market is key, and the industry data speaks for itself. Take Allstate, which, after declaring a renewed focus on rate filings, showed sustained year-over-year improvement in loss ratio relative to the industry while holding market share nearly constant. The verdict is in: rate filings matter, and insurers must take a hard look inward to understand their bottlenecks.
If excellence in the rate filings process is so important, why do carriers struggle with it so much? The answer comes down to three major realities, the first of which is analytical complexity. The seemingly straightforward task of creating model-ready data eats into the time available for modeling decisions and limits the ability to iterate over different assumptions. And as machine learning techniques like Gradient Boosting become more prevalent, the traditional Generalized Linear Model (GLM) approach to pricing is becoming increasingly commoditized. Whether a given carrier chooses to augment its GLMs with automated feature engineering algorithms or embrace a broader ensemble of ML techniques, the net impact is rising computational demand.
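To make that trade-off concrete, here is a minimal sketch, assuming synthetic policy data and invented feature names, that fits a traditional Poisson GLM alongside a gradient-boosted model with a Poisson loss and compares them on out-of-sample deviance. It illustrates the technique only; it is not any carrier's actual pricing model.

```python
# Minimal sketch on synthetic data: traditional GLM vs. gradient boosting
# for claim frequency. All features and relationships are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.metrics import mean_poisson_deviance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 10_000
X = pd.DataFrame({
    "driver_age": rng.integers(18, 80, n),
    "vehicle_age": rng.integers(0, 20, n),
    "prior_claims": rng.poisson(0.3, n),
})
# Synthetic claim counts with a known log-linear relationship to the features.
lam = np.exp(-2.0 + 0.015 * (50 - X["driver_age"]).abs() + 0.4 * X["prior_claims"])
y = rng.poisson(lam)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Traditional approach: Poisson GLM with a log link; coefficients stay interpretable.
glm = sm.GLM(y_train, sm.add_constant(X_train), family=sm.families.Poisson()).fit()

# ML approach: gradient boosting with a Poisson loss captures non-linear effects,
# at the cost of heavier compute and a harder explainability story.
gbm = HistGradientBoostingRegressor(loss="poisson", max_depth=3, random_state=0)
gbm.fit(X_train, y_train)

# Compare out-of-sample fit on the same deviance scale.
print("GLM deviance:", mean_poisson_deviance(y_test, glm.predict(sm.add_constant(X_test))))
print("GBM deviance:", mean_poisson_deviance(y_test, gbm.predict(X_test)))
```

Every additional model, feature set, or assumption multiplies runs like this one, which is exactly where compute and iteration time get consumed.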
The second hurdle is competitive intelligence, notorious for being a tedious, laborious activity that requires sifting through mountains of competitor rate filings (yes: one impediment to a carrier getting out its own rate filings is timely and adequate analysis of competitors' filings). Despite common industry solutions, stakeholders are still left manually combing through tens of thousands of pages looking for the few nuggets most relevant to pricing a given line of business and coverage. Analytically driven operational improvements are required to automatically propagate human-derived observations on a select few filings across the broader set of competitive filings.
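As one sketch of what propagating an analyst's observation could look like, the snippet below uses TF-IDF cosine similarity to rank a corpus of filings against a single filing a human has already flagged as relevant. The filing summaries are placeholders, and the approach is a generic illustration rather than any particular vendor's method.

```python
# Sketch: propagate an analyst's finding across a corpus of filings.
# A human flags one filing; cosine similarity over TF-IDF vectors
# surfaces look-alike filings for review. Texts are stand-ins.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

filings = [
    "Homeowners rate revision: base rate +4.9%, wind/hail deductible changes",
    "Private passenger auto: new telematics discount tier introduced",
    "Homeowners: roof age rating factor added, base rate +5.2%",
    "Workers compensation loss cost multiplier update",
]
tagged_idx = 0  # the filing an analyst already reviewed and flagged

tfidf = TfidfVectorizer(stop_words="english")
vectors = tfidf.fit_transform(filings)
scores = cosine_similarity(vectors[tagged_idx], vectors).ravel()

# Rank the remaining filings by similarity to the tagged one.
for i in scores.argsort()[::-1]:
    if i != tagged_idx:
        print(f"{scores[i]:.2f}  {filings[i]}")
```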
The third hurdle is developing the requisite trust with internal decision-makers and, subsequently, earning state-level regulatory approval. Trust is a prerequisite to overcoming inherent skepticism of analytics, machine learning, and artificial intelligence, and it rests on three pillars:
- Explainability: There is an optimal point beyond which incremental model complexity stops producing additional statistical lift, and stakeholders need to see where that point lies. This is the essence of explainability: providing sufficient visibility into and understanding of models while preserving strategic differentiation (see the sketch after this list).
- Fairness: A construct that varies by jurisdiction – be it at the state, provincial, or national level – typically demanding that pricing not be unfairly discriminatory. As emerging data sources expand the landscape of possible pricing parameters and risk acting as proxies for otherwise verboten variables, fairness is central to achieving internal buy-in and regulatory approval.
- Lineage: Users of any analytical, model, or AI output must have a clear line of sight into the data used to generate the desired outputs and action-supporting insights. Results must be human-interpretable.
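As a concrete, if deliberately simplified, example of the explainability pillar, the sketch below fits a boosted model to synthetic data and uses permutation importance to show stakeholders which inputs actually drive predictions. The feature names, including the hypothetical proxy-risk variable, are invented, and real fairness and lineage reviews would demand far more than this single diagnostic.

```python
# Sketch: one explainability diagnostic (permutation importance)
# on a boosted model over synthetic pricing data. Feature names
# are illustrative only.
import numpy as np
import pandas as pd
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 5_000
X = pd.DataFrame({
    "driver_age": rng.integers(18, 80, n),
    "credit_tier": rng.integers(1, 6, n),   # hypothetical proxy-risk variable to watch
    "vehicle_age": rng.integers(0, 20, n),
})
y = 100 + 2.0 * X["credit_tier"] - 0.5 * X["vehicle_age"] + rng.normal(0, 5, n)

model = HistGradientBoostingRegressor(random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

# Show which inputs actually drive the model's predictions.
for name, imp in sorted(zip(X.columns, result.importances_mean),
                        key=lambda t: -t[1]):
    print(f"{name:12s} importance={imp:.3f}")
```

Output like this gives internal reviewers and regulators a starting point for the conversation about which variables matter and why.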
To realize the profitability and market share benefits of more agile pricing processes, insurers must have the appropriate pricing platform in place, with components that directly address the key challenges above. As the volume and variety of available data grow at an exponential clip, a data platform with linear scalability is critical to orchestrating the model-ready data sets that R&D actuaries and data scientists demand.
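For a flavor of what "model-ready" means in practice, here is a toy sketch, with invented table layouts, that rolls claims up to the policy grain and joins them onto exposures to produce a pure-premium target. At production scale this aggregation would be pushed down to the data platform rather than done in-memory.

```python
# Sketch: assemble a model-ready dataset from policy and claims extracts.
# Table layouts and values are invented for illustration.
import pandas as pd

policies = pd.DataFrame({
    "policy_id": [1, 2, 3],
    "earned_exposure": [1.0, 0.5, 1.0],
    "driver_age": [34, 52, 27],
})
claims = pd.DataFrame({
    "policy_id": [1, 1, 3],
    "incurred_loss": [1200.0, 300.0, 4500.0],
})

# Aggregate claims to the policy grain, then join onto exposures.
losses = claims.groupby("policy_id", as_index=False)["incurred_loss"].sum()
model_ready = policies.merge(losses, on="policy_id", how="left")
model_ready["incurred_loss"] = model_ready["incurred_loss"].fillna(0.0)
model_ready["pure_premium"] = model_ready["incurred_loss"] / model_ready["earned_exposure"]
print(model_ready)
```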
Acknowledging the need for multi-genre analytics, a robust text analytics capability is required to digest and leverage competitor rate filings. The Property & Casualty segment of the insurance industry alone generates 20,000 rate filings per month, and while there is no shortage of noise in this mass of documentation, tremendous value hides in plain sight once the relevant competitor filings are found.
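As a simple illustration of separating signal from noise, the sketch below triages a batch of incoming filings down to a reviewable shortlist using scope metadata and a handful of pricing keywords. The field names, sample filings, and keyword list are all hypothetical.

```python
# Sketch: triage a month's filings down to a reviewable shortlist.
# Field names, sample filings, and keywords are hypothetical.
from dataclasses import dataclass

@dataclass
class Filing:
    state: str
    line_of_business: str
    text: str

inbox = [
    Filing("OH", "Homeowners", "roof age rating factor added, base rate +5.2%"),
    Filing("TX", "Personal Auto", "new telematics discount tier"),
    Filing("OH", "Homeowners", "wind/hail deductible schedule revision"),
    Filing("OH", "Workers Comp", "loss cost multiplier update"),
]

KEYWORDS = {"base rate", "deductible", "rating factor"}

def relevant(f: Filing, state: str, lob: str) -> bool:
    """Keep in-scope filings that mention at least one pricing keyword."""
    in_scope = f.state == state and f.line_of_business == lob
    return in_scope and any(k in f.text.lower() for k in KEYWORDS)

shortlist = [f for f in inbox if relevant(f, "OH", "Homeowners")]
print(f"{len(shortlist)} of {len(inbox)} filings need analyst review")
```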
To deliver the requisite transparency and build trust with stakeholders, data movement must be reduced, if not eliminated outright. Chronic data movement erodes faith in data quality while soaking up time and valuable resources on data validations and checks. A unified pricing platform reaches the data where it lives while minimizing replication, so that most of the effort goes into business strategy rather than underlying mechanics.
The bottom-line question is: are you ready to deploy the technology that will move the company well into the 21st century?
Tim is an Insurance Industry Consultant at Teradata. He works across all major aspects of the insurance business value chain to derive business value with data and analytics. Having started his career as a reserving actuary in the Big 4, Tim is constantly straddling the lines between high-level strategy and the minutiae of data. Tim engages clients to improve core operations such as marketing, underwriting, claims, actuarial, and finance.
Tim started with Teradata in late 2018 and over the course of his tenure has worn the hats of both Industry Consultant and Business Consultant. While his focus is on insurance clients, the lines often blur, and he frequently finds himself working with clients across the broader FSI, Healthcare, and Government industries.