Watching as legislators, regulators, and policymakers consider what changes, if any, are necessary for insurers' use of big data and algorithmic tools is like being visited by the Ghosts of Christmas Past, Present, and Future. Insurance is all about data – data collected to determine whether to issue an insurance policy, how to service the policy, and whether to pay claims on the policy. However, as more data on insured risks and more algorithmic tools become available, legislators, regulators, and policymakers seek to ensure that insurers do not become Ebenezer Scrooge.

Like a visit from the Ghost of Christmas Past, the National Association of Insurance Commissioners (NAIC) Big Data (EX) Working Group (Big Data WG) reviewed existing models for property and casualty insurance, some of which have been around for decades, to determine if revisions are needed to cover insurers' use of big data and algorithmic tools. Similarly, the U.S. Senate Committee on Banking, Housing, and Urban Affairs touched on this subject as part of its Examining the FinTech Landscape hearings, which included testimony from the U.S. Government Accountability Office on the regulation and oversight of alternative data use.

Recent visits from the Ghost of Christmas Present include the New York Department of Financial Services (NY DFS), which issued a Section 308 letter to insurers doing business in New York. As reported in our July 7 alert, the NY DFS is seeking information about the use of external consumer data or information sources in connection with accelerated or algorithmic underwriting programs that may supplement traditional medical underwriting. In addition, the Big Data WG met at the recently concluded 2017 Fall National Meeting, where it discussed issues haunting consumers, industry, and regulators, including:

  • Consumers' rights to access the data used, to be notified that it is being used, and to correct it;
  • Data points that should not be used;
  • The level of correlation and/or causality necessary for data points to be used; and
  • Additional regulation over data vendors.

NY DFS personnel have also played the role of the Ghost of Christmas Future, providing insight into what the future might hold. While the NY DFS has not yet foretold what is to come and is not trying to stifle innovation, several apparitions are circling, including whether:

  • Using purchasing data is appropriate;
  • The data points used are predictive;
  • Consumers have been given adequate disclosure; and
  • Insurance scores constructed by third parties should be permitted.

While regulators seek transparency, some academic spirits have warned that transparency in algorithmic tools may not always be desirable, as it may prevent society from fully using new technologies that could provide societal benefits. Unlike Ebenezer Scrooge, we have not yet ended our journey with the Ghost of Christmas Future. We may yet encounter a regulatory headstone or two before we wake to enjoy all the benefits that big data and algorithmic tools may bring.
