Telematics Data – Changing The Insurance Underwriting and Actuarial Environment

Telematics, and specifically the usage-based data it generates, significantly improves the ability to rate and price automobile insurance by adding a deeper level of granularity to the data commonly used today.

Companies at the forefront of using telematics data are beginning to understand the value of its many indicators as they relate to policyholder driving behavior, and how that behavior, positive or negative, directly affects overall policy administration cost.

This advantage, though, comes with a possible disadvantage: higher volumes of data being added to already burdened processing resources. A single vehicle generates approximately 2.6 MB of data per week. If 50,000 auto policies are on the books, accumulating that data results in roughly 6.8 TB per year.
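The arithmetic behind that estimate is easy to check. A minimal sketch, using the 2.6 MB/week and 50,000-policy figures quoted above and assuming one vehicle per policy and decimal units (1 TB = 1,000,000 MB):

```python
MB_PER_VEHICLE_PER_WEEK = 2.6   # figure quoted above
WEEKS_PER_YEAR = 52
POLICIES = 50_000               # assumes one vehicle per policy

# Total telematics data accumulated across the book in one year, in MB
yearly_mb = MB_PER_VEHICLE_PER_WEEK * WEEKS_PER_YEAR * POLICIES

# Convert to terabytes using decimal units
yearly_tb = yearly_mb / 1_000_000
print(f"{yearly_tb:.2f} TB per year")  # 6.76 TB, i.e. roughly 6.8 TB
```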

Pay How You Drive Data

Given that the use of telematics data from automobiles is on the rise in insurance companies, to be followed by telematics data generated by wireless sensors in personal and commercial use, a solution for processing huge volumes of data quickly is needed.

Most likely that solution is based on SAP HANA, which processes the data and analytics together in main memory and gives underwriters and actuaries a technological advantage in their business: real-time rating and pricing, a capability that doesn't exist with traditional methods.

Jim Janavich


The Big Data Challenge – What’s Your Point?

Recently I attended a seminar at MIT's Enterprise Forum on data and analytics (Big Data) in the pharmaceutical and health care industries, and I learned a thing or two.  The panel included an investor and leaders in research and IT from Pfizer, AstraZeneca, and a joint effort between Harvard University and MIT called The Broad Institute.

The moderator described a number of success stories relating to the harvesting of previously unknown or unmanaged data.  Still, there are many technical, human, and organizational challenges to widespread success. In his view many are unwisely sitting on the sidelines, waiting for the clear path to be sorted out instead of participating. In a bit of hyperbole he said, “If you don’t like change, you’re going to really hate being irrelevant.”

Everyone agreed that we are in the early days of big data and related analytics.  Whatever we think of the volume, velocity, and variety of the data we’re dealing with, our knowledge of what it is and what to do with it is in its infancy.  That said, in the past few years of trying, the panelists have learned to distinguish between the technical issues of data volume and velocity and the human capital issue of data variety.  They believe that the large, constantly changing data universe will become increasingly manageable as our technologies try to catch up.

An Industry Note

Within the pharmaceutical and healthcare spheres in particular, there are challenges involving creating knowledge and data about drugs or genomes and the willingness of others to pay for access to that knowledge, which comes at a cost. This is especially true with genomic testing. Normally one pays each time one runs a blood test or a CAT scan… each test is different and the analysis relates to that test. With genomes, how do we create policy around data where you test once and the data are used and viewed many times by others?

The variety of data is something that must be dealt with by people. It comes in different forms (one example: structured vs. unstructured) and from different sources, some from the analysis you just invented, and the uses and potentials are constantly changing. The panelists believe that our ability to understand, examine, and use the variety of data is limited mostly by human skills, insight, experience, and knowledge.

There was an extended discussion about the volume of data that got me thinking.  All agreed that we have more data available than we know what to do with, and each time we do an analysis we create more.  Data volumes are increasing faster than our abilities to store, manage, query, and analyze them can grow.

For all practical purposes this means that data volumes are effectively infinite.  Whatever our skill and technology scope, the volume of data exceeds it today and will do so for some time.  We have to keep trying to catch up, but understanding and analyzing all of our data will never be a productive goal.

The strategic differentiation in analytics will come from what my colleague Allan Frank describes as “answering outcome-based questions.” In the context of the panel’s observations, the skills and insights needed to address big data may well include technical data scientists and writers of algorithms and more. But, success will certainly hinge on the ability to distill what business outcomes you want, why, and what you need to know in order to service those outcomes. Our friend Bruce Rogow puts it perhaps more emphatically.  He associates success with “defining your purpose.”

If you want to have strategic success in the area of big data and analytics, we recommend some familiar frameworks applied to this space:

  1. Whether you’re responsible for a small business unit or for an enterprise, understand your business vision.  If one is already prepared, get a copy.  Break it down into its strategic vectors and “do-wells” (or, if you prefer, your critical success factors), and describe the business capabilities needed to succeed and the technology ecosystem, in this case the data and analytical ecosystem, needed to support them.
  2. Start organizing and iterating in 6-12 week cycles that the scaled agile world calls “release trains.”  Take a subset of the business narratives and the related segment of the ecosystem to the next level: designed and built.  At the end of each cycle you have a working environment that examines real data and produces real results.
  3. Determine what about the effort was successful and what needed help, more data, more analysis, or a better-defined business purpose.  Define another analytical release train. Do it again.

Doug Brockway
