The Future of Actuarial Modeling

  • Blog
  • January 05, 2024

Yusuf Abdullah

Director, Risk Modeling Services, PwC US


The introduction of the computer and the technological advancements of the 1990s allowed actuaries to move from basic commutation factors to full cash flow modeling and, eventually, stochastic modeling. A race between technology improvements and the complexity of actuarial modeling (e.g., volume, precision, complexity, accounting changes) has played out over the last few decades, and we expect this cycle to continue as expanding computing power (i.e., cloud, GPU and quantum computing) enables even greater modeling complexity.

In this article we discuss the current state of actuarial modeling and recent technological trends, and highlight how different vendors are incorporating this technology in their modeling solutions.


Current state of actuarial modeling

Large vendors dominate the market

The actuarial modeling market has historically been dominated by a handful of established vendors that gained popularity through first-mover advantage and continued strong core projection capabilities. They generally offer standard libraries that support a range of product sets and accounting and internal reporting bases for large developed markets such as the US, the UK and continental Europe. A mixture of open and closed architectures has emerged as a result of insurers' differing preferences for vendor-maintained solutions versus the flexibility to adapt models to their own products, methodologies and approaches. However, the complexity of these systems and the cost and timeline of change have created a barrier to entry for new entrants and made it harder for insurers to adopt the latest technological innovations as readily as less complex industries have.

But as in prior industrial revolutions, the status quo is being challenged: a new wave of actuarial platform solutions is leveraging the latest technology to deliver cloud-native solutions, provide greater flexibility, build on more intuitive coding languages and adopt modern software development practices. These solutions can provide greater computational speeds, the freedom to choose cloud providers and the ability to integrate more easily with modern finance and data technology solutions, often at a competitive price point relative to the dominant vendor solutions.

Companies maintain more models than necessary

Despite considerable consolidation activity over the last few years, largely driven by mandated accounting changes, many companies missed the opportunity to modernize and still operate multi-model environments. As a result, they maintain multiple teams of modelers, sometimes including key individuals who look after legacy models written in APL or COBOL, or in more recent actuarial software that has since been sunset. Multiple models also create inefficiencies in consolidating output across a variety of formats before posting results to the ledger, an effort that compounds with the need to report on multiple accounting, tax and capital bases and to deliver improved planning and forecasting. Many insurers therefore intend to rationalize their model estates, with a particular drive in the US toward converging disparate valuation and projection models. Consolidated models promote consistency from input through to output, lower cost, streamline model maintenance and build greater trust in the results produced.

Overcoming barriers to model conversion

Despite some model consolidation over the last five years, many insurers still have further work to do to reach their desired target state. To get there, companies will need to overcome high model conversion costs, manage disruption to business processes and upskill resources in new ways of working. For new market entrants, success depends on convincing insurers that conversion efforts can deliver significant efficiencies and enhanced actuarial capabilities that support the business's strategic objectives, while also complying with the firm's IT strategy and technology stack.

Market outlook

We believe cloud usage, modern versioning, auditability and error handling are already taken for granted in actuarial models, and insurers can expect actuarial modeling to harmonize with their overall tech strategy, architecture and modern tech capabilities, such as CI/CD (Continuous Integration and Continuous Delivery), code repository versioning and automated testing.

In addition, parts of the core model and accompanying processes are likely to move to other technology over time. These could include low/no-code solutions that increase the transparency of calculations, open-source platforms that reduce reliance on single vendors, or model components that can be processed independently or off-cycle (e.g., some decrement models). Finally, the technological backbone of the model may vary significantly: certain components may run best on a GPU grid, a bank of inexpensive CPUs or even quantum computing. The cost/benefit optimization of each will depend on the types of calculations, the products and other company-specific details.

Holistic platform migrations are often effort-intensive multi-year projects, but they also bring significant long-term benefits. Many undertaken to date have been tactical (e.g., the result of mergers or the need to migrate off legacy platforms), but we believe the next wave will be more strategic, designed to enable the company's vision for 5-10 years' time (e.g., more holistic migrations covering FP&A and capital management requirements). We believe actuarial model vendors will differentiate their software primarily by:

  • Demonstrating exceptional runtime performance through, for example, vectorization, GPU computing and scalable cloud environments
  • Providing an effortless user experience and versatile models in universally understood interfaces
  • Enabling flexible ecosystem integration
  • Offering transparent pricing models

Performance

Regulations continue to become more complex and computationally demanding — e.g., nested stochastic modeling for US Stat PBR, or multi-step reporting runs and risk adjustment calculations for IFRS 17. This, coupled with the need to confidently project future financial positions, is a catalyst for more capable modeling solutions. At the same time, insurers continue to look for ways to reduce the length of the valuation cycle and the time required for analysis. Vendors are differentiating their model performance by:

  • Accelerating calculations through mechanisms such as vectorization and parallel processing
  • Employing the latest technologies, such as GPU computing, for modeling tasks that involve heavy computation and can be parallelized. We believe vendor adoption of GPUs will increase because GPUs are architecturally better suited than CPUs to parallelizing certain calculations, owing to (i) thousands of smaller cores rather than fewer, more powerful cores and (ii) higher-bandwidth, faster memory. Where software vendors can bridge the gap between specialized GPU programming languages and actuarial model developers and users, GPUs could become the new norm

There is not necessarily a single solution to performance, but to compete with future runtime standards, we believe vendors should offer at least some of these performance enablers that are now available.
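
To make the vectorization point concrete, below is a minimal Python/NumPy sketch, not any vendor's implementation: it values discounted cash flows for a block of policies in one array expression rather than a policy-by-policy loop. All figures and assumption names are invented for the illustration.

    # Minimal, illustrative vectorization sketch (assumed inputs, not vendor code):
    # value level cash flows for 100,000 policies in one array expression.
    import numpy as np

    rng = np.random.default_rng(seed=42)
    n_policies, n_years = 100_000, 30
    annual_cash_flow = rng.uniform(500.0, 5_000.0, size=(n_policies, 1))  # illustrative benefit
    lapse_rate = rng.uniform(0.01, 0.10, size=(n_policies, 1))            # flat lapse assumption
    discount_rate = 0.04

    years = np.arange(1, n_years + 1)           # shape (n_years,)
    survival = (1.0 - lapse_rate) ** years      # broadcasts to (n_policies, n_years)
    discount = (1.0 + discount_rate) ** -years  # shape (n_years,)

    # One vectorized expression replaces a double loop over policies and years.
    present_value = (annual_cash_flow * survival * discount).sum(axis=1)
    print(f"Total portfolio PV: {present_value.sum():,.0f}")

The same array-oriented structure is what makes such calculations amenable to GPU execution: each policy's arithmetic is independent, so it parallelizes naturally.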

Effortless user experience and model versatility

In addition to changing regulations, insurance products are generally becoming more tailored, with more nuanced reinsurance structures and more intricate riders. Insurers should be able to consistently model their unique insurance and reinsurance contracts for a range of valuation, pricing and projection use cases, without relying on specific and potentially scarce skill sets to maintain models. This requires platforms that are intuitive, versatile and based on widely used coding languages, where users can view and edit the model mechanics through a straightforward interface, without compromising controls, governance and auditability. Some newer vendors are also distinguishing themselves by offering functionality for more automated model conversion from existing platforms. GenAI is already being used to speed up code conversion between systems and may further reduce conversion efforts and timelines.
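
As a simple illustration of what viewing and editing the model mechanics in a widely used language could look like, here is a toy Python decrement assumption; the function names, rates and surrender-charge shock are invented for the example and do not come from any platform.

    # Toy, hypothetical example of transparent model mechanics: the actuary can
    # read and edit the lapse assumption directly rather than configure a black box.
    def lapse_rate(policy_year: int, base_rate: float = 0.05,
                   shock_year: int = 10, shock_rate: float = 0.25) -> float:
        """Annual lapse rate: flat base rate with a one-off shock at the end of
        an assumed surrender charge period (all parameters illustrative)."""
        return shock_rate if policy_year == shock_year else base_rate

    # The projection logic reads like the actuarial method it implements.
    def in_force(years: int) -> list[float]:
        survivors, path = 1.0, []
        for t in range(1, years + 1):
            survivors *= 1.0 - lapse_rate(t)
            path.append(survivors)
        return path

    print(in_force(12))  # in-force fractions for policy years 1-12

Because the model logic is ordinary code, it can also sit under the same version control and review workflow as any other software asset.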

Additionally, actuarial modeling platforms can be expected to conform to standard IT deployment practices such as CI/CD (Continuous Integration and Continuous Delivery) and automated testing, applied both to platform upgrades and to modeling and coding changes. Integration with code repositories, check-in/check-out functionality and support for multiple simultaneous users are also likely to be highly desired.
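
For instance, an automated regression test gating model changes could look like the following pytest-style sketch; project_reserves and its expected value are illustrative stand-ins for a real model calculation, with the test running in the CI pipeline on every check-in.

    # Hypothetical regression test: lock in a known-good result so that an
    # unintended change to the model code fails the automated build.
    import pytest

    def project_reserves(premium: float, discount_rate: float, years: int) -> float:
        """Toy stand-in for a model calculation under test."""
        return sum(premium / (1.0 + discount_rate) ** t for t in range(1, years + 1))

    def test_reserve_regression():
        # 10-year annuity factor at 3% times 1,000, precomputed as the baseline.
        assert project_reserves(1_000.0, 0.03, 10) == pytest.approx(8_530.20, abs=0.01)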

Flexible integration

Insurance companies do not want manual steps in their actuarial ecosystem, and they no longer need them. In our experience, workflow management solutions should reduce the need for users to go into model user interfaces to perform runs, ultimately freeing actuaries to focus on analyzing data and making informed decisions. It is also increasingly important that near real-time model output is available to feed downstream financial dashboards. Subledgers and similar accounting rules engines should be updated automatically with actuarial output, allowing finance and tax teams to perform their functions earlier in the process and potentially shortening reporting timelines. Future finance transformation activity (above and beyond actuarial functions) also becomes easier when only a single actuarial model with one output format needs to be considered. All of this is driving the need for end-to-end integrated actuarial ecosystems.
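
To illustrate the shape of such a hands-off flow, here is a hypothetical Python sketch; every function is a stub standing in for the insurer's own components (scenario store, model grid, subledger), and none of the names refer to a real product API.

    # Hypothetical end-to-end orchestration sketch; all functions are stubs.
    def load_scenarios(reporting_date: str) -> list[dict]:
        return [{"scenario": i, "date": reporting_date} for i in range(1_000)]  # data lake pull

    def submit_model_run(scenarios: list[dict]) -> str:
        return f"run-{len(scenarios)}"  # hand scenarios to the model grid

    def await_results(run_id: str) -> dict:
        return {"run_id": run_id, "reserves": 1.25e9}  # poll until the grid finishes

    def post_to_subledger(results: dict) -> None:
        print(f"Posted {results['reserves']:,.0f} from {results['run_id']} to subledger")

    def run_valuation_pipeline(reporting_date: str) -> None:
        """Scheduled by the workflow tool; no user opens a model UI."""
        results = await_results(submit_model_run(load_scenarios(reporting_date)))
        post_to_subledger(results)

    run_valuation_pipeline("2023-12-31")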

As part of this, model vendors should enable seamless integration with the upstream and downstream components of each insurer's choice. In our view, the vendors of the future will not only offer native solutions but also accommodate a mix of data lakes, file transfer systems, data science programming languages, workflow management applications and reporting solutions. While stability and reliability should be a given, end-to-end runtime can differentiate vendors, which requires strong performance in transfer speeds and bandwidth.

While actuarial models sit at the center, they are still just one piece of a modernized ecosystem puzzle. If the models cannot be flexibly integrated with the broader end-to-end actuarial ecosystem and the insurer's choice of solution components, implementation obstacles can arise that negate the gains from improved model performance and user experience.

Software licensing and costs

As modeling capabilities expand, so do vendor pricing options. Insurers may be paying a myriad of fees: per user, per run, per country, per ticket, per model change, per solution, and so on. Among these costs, cloud hosting charges have grown as run volumes have increased, both in model points and in number of runs. Many vendors offer, and in practice often require, proprietary cloud environments for hosting models, charged at a premium over standard cloud costs. There is mounting pressure against this hosted-cloud model, particularly because of the associated cost and the difficulty of integrating with the insurer's own cloud.

In our view, insurers prefer a simplified pricing structure, e.g., per run or per use case. Other support, such as the tooling suite and coding and technical assistance, including version upgrades and backwards-compatibility testing, should be built in rather than priced separately.

Conclusion

New technologies and emerging vendors are changing the actuarial modeling space. Increased competition in recent years will likely shift vendors' relative market shares and bring actuarial models up to the standards of other modern coding and technology solutions. There is a growing focus on (i) using next-gen technologies to stay ahead of the curve on run speed and stability; (ii) providing an effortless user experience with well-controlled model versatility; and (iii) allowing seamless integration of actuarial models into an end-to-end financial reporting ecosystem.
