“…the insurance business is perhaps the purest example of an ‘information-based’ industry - that is, an industry whose sole activity consists of gathering, processing, and distributing information.” - Martin Campbell-Kelly, writing about the Prudential in the Victorian Era.[1]
The insurance industry: yesterday, today, and tomorrow
It might be odd to say that technology and its use in insurance run on a one-hundred-year cycle, but that seems to be the case.
130 years ago, actuaries crowded into a room at a meeting of the Actuarial Society of America to watch a demonstration that would revolutionize the industry: Herman Hollerith’s tabulating punch-card machine.[2]
For the next half-century, increasing automation — from tabulating machines to early mainframes and computers — was a critical competitive differentiator. Companies like Prudential, MetLife, and others partnered with technology companies on the development of hardware and software.[3]
The dramatic embodiment of this information-driven cycle appeared in the infamous film The Billion Dollar Bubble, which showcased the power and abstraction of the computer to commit millions of dollars of fraud by creating and maintaining fake insurance policies.
The movie also hints at the oscillation away from the technological-competitive focus of insurance companies. I argue that this focus on technology was lost over the last 50 years with the rise of Wall Street finance, investment-oriented life insurance, industry consolidation, and the explosion of financial structuring such as derivatives, reserve financing, and other advanced forms of reinsurance.
Value-add came from the C-suite, not from the underlying business processes, operations, and analysis. The result is, for example, ever-more-complicated reinsurance treaties layered onto mainframes and admin systems older than most of the actuaries interfacing with them.
The pace of strategic value-add isn’t slowing, though it must stretch further (in complexity and risk) to find opportunities comparable to those of the past. More agile, data-oriented operations enable companies to react to and implement those opportunities. Technological value-add can improve a company’s bottom line through lower expenses and higher top-line growth, often with a more favorable risk profile than some of the “strategic” opportunities.
Today, a trend of reverting to technological value-creation is evident across many traditional sectors. Tesla claims that it’s a technology company; Amazon is the #1 product retailer because of its vehement focus on internal information sharing;[4] airlines are so dependent on their systems that the skies become quieter on the rare occasions that their computers give way.
Why is it that companies so involved in physical things (cars, shopping) and physical services (flights) are so much more focused on improving their technological operations than insurance companies, whose very business is ‘information-based’? The market has rewarded those who have prioritized internal technological solutions.
Commoditized investing services and a low-yield environment have reduced insurance companies’ comparative advantage in “managing money”. Yield compression and the explosion of consumer-oriented investment services shift the competitive focus to managing the entire policy lifecycle efficiently (digitally), performing more real-time analysis of experience and risk, and handling growing product and regulatory complexity.
These are problems that have technological solutions and are waiting for insurance company adoption.
Companies that treat data like coordinates on a grid (spreadsheets) will get left behind. Two main hurdles have prevented technology companies from breaking into insurance:
- High regulatory barriers to entry, and
- Difficulty in selling complex insurance products without traditional distribution.
Once those two walls are breached, traditional insurance companies without a strong technology core will struggle to keep up. The key to thriving is not just adding “developers” to an organization; it’s making domain experts like actuaries an integral part of the technology transformation.
What’s coding got to do with this?
Everything. Programming is the optimal interface between actuary and computer — and, importantly, between computer and computer. Programming is the actionable expression of ideas, math, analysis, and information. Think of programming as the 21st-century leap in the actuary’s toolkit, just as spreadsheets were over the preceding 40 years. Compared with a spreadsheet-oriented workflow, code offers:
- More natural automation of, and between, processes
- Better reproducibility
- Scaling to fit any size dataset and workload
- Statistics and machine learning capabilities
- Advanced visualizations to garner new views into your data
This list isn’t comprehensive, and some benefits are subtle: when you are code-oriented instead of spreadsheet-oriented, you tend to structure your data in a portable, shareable way — for example, relying on data warehouses instead of email attachments. This, in turn, enables data discovery and insights that otherwise wouldn’t surface. Investing in a code-oriented workflow is playing the long game.
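As a minimal sketch of what "data warehouse instead of email attachments" means in practice, the snippet below stands up an in-memory database and exposes one shared, re-runnable query. The table and column names (`policies`, `product`, `face_amount`) are purely illustrative, not drawn from any real admin system:

```python
import sqlite3

# Hypothetical in-memory "warehouse" standing in for a shared policy database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (id INTEGER, product TEXT, face_amount REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?, ?)",
    [(1, "TERM", 100_000.0), (2, "TERM", 250_000.0), (3, "UL", 500_000.0)],
)

def face_amount_by_product(conn):
    """One query that anyone with access can re-run -- no email attachments."""
    rows = conn.execute(
        "SELECT product, SUM(face_amount) FROM policies GROUP BY product"
    ).fetchall()
    return dict(rows)

print(face_amount_by_product(conn))  # {'TERM': 350000.0, 'UL': 500000.0}
```

Because the query lives in code rather than in a saved-off spreadsheet, it is versionable, reviewable, and yields the same answer for every consumer.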
The actuary of the future needs coding as one of their core skills. Already today, advances in business processes, insurance products, and financial ingenuity are written in lines of code — not spreadsheets. Not being able to code necessarily means following what others are doing, not leading.
It’s commonly accepted now that gathering insights from your data requires knowing how to code. Like your data, your business architecture, modeling needs, and product peculiarities are often better suited to customized solutions. Why stop at data science when you can learn how to solve problems with a computer?
The 10x Actuary
As we swing back to a technological focus, we do not leave the finance-driven complexity behind. Increasingly complex business needs will highlight a large productivity gap between an actuary who can code and one who can’t — simply because the former can react, create, synthesize, and model faster than the latter. From transforming administration extracts and summarizing and aggregating valuation output to analyzing claims data in ways that spreadsheets simply can’t handle, you can become a “10x Actuary”.[5]
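To make the claims-analysis point concrete, here is a sketch of a one-pass aggregation over a claims extract. The records and field names (`cause`, `amount`) are hypothetical; the same loop would stream over a file of millions of rows, far beyond a spreadsheet's row limit:

```python
from collections import defaultdict

# Hypothetical claims extract; in practice this would stream from an admin system.
claims = [
    {"cause": "accident", "amount": 12_000.0},
    {"cause": "illness",  "amount": 45_000.0},
    {"cause": "accident", "amount": 8_000.0},
    {"cause": "illness",  "amount": 30_000.0},
]

def summarize(claims):
    """Count and total claim amounts by cause in a single pass."""
    totals = defaultdict(lambda: {"count": 0, "amount": 0.0})
    for claim in claims:
        bucket = totals[claim["cause"]]
        bucket["count"] += 1
        bucket["amount"] += claim["amount"]
    return dict(totals)

print(summarize(claims))
# {'accident': {'count': 2, 'amount': 20000.0}, 'illness': {'count': 2, 'amount': 75000.0}}
```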
Flipping switches in a graphical user interface versus being able to build models is the difference between having a surface-level familiarity and having full command over the analysis and the concepts involved — with the flexibility to do what your software can’t.
Your current software might perform the first layer of analysis but be at a loss when you want to visualize results, run sensitivity tests, apply statistics, perform stochastic analysis, or automate the process. Done programmatically, these are often just a few additional lines of code.
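The sensitivity-analysis case really is a few lines. Below is a deliberately toy model (the one-year discounted expected death benefit, with made-up base assumptions) plus the handful of lines needed to shock each assumption and compare against baseline:

```python
# Toy model for illustration: discounted expected death benefit over one year.
def expected_pv(q, i, face=100_000.0):
    """face * q / (1 + i): expected claim of `face` at rate `q`, discounted at `i`."""
    return face * q / (1 + i)

base = {"q": 0.01, "i": 0.03}  # hypothetical base mortality and interest assumptions

# Sensitivity analysis: shock each assumption and measure the change vs. baseline.
shocks = {
    "q +10%":    {**base, "q": base["q"] * 1.10},
    "i +100bps": {**base, "i": base["i"] + 0.01},
}

baseline = expected_pv(**base)
results = {name: expected_pv(**shocked) - baseline for name, shocked in shocks.items()}

for name, delta in results.items():
    print(f"{name}: {delta:+.2f}")
```

The same pattern scales unchanged from this toy formula to a full projection model: swap `expected_pv` for the real model function and extend the `shocks` dictionary.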
Do I advocate dropping the license for your software vendor? No, not yet anyway. But the ability to supplement and break out of the modeling box has become an increasingly important part of most actuaries’ work.
Additionally, code-based solutions can leverage the entire technology sector’s progress to solve problems that are otherwise hard: scalability, data workflows, integration across functional areas, version control, model change governance, reproducibility, and more.
30-40 years ago, there were no vendor-supplied modeling solutions, so you had no choice but to build models internally. That shifted with the advent of vendor systems. Today, there has never been a better time for companies to leverage open source to support their custom modeling, risk analysis and monitoring, and reporting workflows.
Risk Governance
Code-based workflows are highly conducive to risk governance frameworks as well. If a modern software project enjoys all of the following benefits, why not a modern insurance product and its associated processes?
- Access control and approval processes
- Version control, version management, and reproducibility
- Continuous testing and validation of results
- Open and transparent design
- Minimization of manual overrides, intervention, and opportunity for user error
- Automated trending analysis, system metrics, and summary statistics
- Continuously updated, integrated, and self-generating documentation
- Integration with other business processes through a formal boundary (e.g. via an API)
- Tools to manage collaboration in parallel and in sequence
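One item from the list above, continuous testing and validation of results, can be sketched in a few lines: a check that fails loudly whenever results move more than a tolerance between runs, of the kind that could run automatically on every model or input change. The function name, figures, and 10% tolerance are all illustrative assumptions:

```python
# A minimal sketch of automated result validation, as might run on every model change.
def validate_reserves(prior: float, current: float, tolerance: float = 0.10) -> None:
    """Raise an error if reserves moved more than `tolerance` between runs."""
    change = abs(current - prior) / prior
    if change > tolerance:
        raise ValueError(
            f"Reserve moved {change:.1%}, exceeding the {tolerance:.0%} tolerance"
        )

# A 4% movement is within tolerance: passes silently.
validate_reserves(prior=1_000_000.0, current=1_040_000.0)
```

Wired into a continuous-integration pipeline, checks like this catch surprising movements before results are reported, replacing manual tie-outs with an always-on control.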
Managing and Leading the Transformation
The ability to understand the concepts, capabilities, challenges, and lingo is not a dichotomy; it’s a spectrum. Most actuaries, even at fairly high levels, are still often involved in analytical work. Even above that level, it’s difficult to lead something you don’t understand.
Conversely, the skill and practice of coding enhances managerial capability. Pulling apart a problem or process into its constituent parts and designing the optimal solution is a core attribute of leadership: having a vision of where the organization should be instead of thinking only about where it is now.
Nor does the skillset described here limit any other aspect of career development any more than mathematical ability, project collaboration, or financial acumen — just to name a few.
Outlook
Modernization will be increasingly essential for companies to remain competitive. It won’t be built with big black-box software packages; it will be built by domain experts who can translate their expertise into new forms of analysis - doing it faster and more robustly than the competition.
SpaceX doesn’t just hire rocket scientists - they hire rocket scientists who code.
Be an actuary who codes.
Future Articles
The forthcoming series of articles will illustrate what this can look like in practice: examining the business case, becoming acquainted with ways of engaging problems beyond spreadsheets and dataframes, and approaching work in a way that removes the boring parts and focuses on the concepts and insights.
The next article in this series will discuss what tools enable an actuary to deliver on the vision outlined in this article. In particular, it will highlight the Julia programming language, a tool well positioned to enable actuaries to develop the analysis, systems, and models of tomorrow.
Footnotes
1. Co-evolution of Information Processing Technology and Use: Interaction Between the Life Insurance and Tabulating Industries
2. Co-evolution of Information Processing Technology and Use: Interaction Between the Life Insurance and Tabulating Industries
3. From Tabulators to Early Computers in the U.S. Life Insurance Industry
4. Have you had your Bezos moment? What you can learn from Amazon