While fintech companies have received considerable attention in recent years for using advanced data analytics techniques to improve and streamline credit underwriting, insurtech companies have also been active in improving their industry's underwriting methods. These businesses work with traditional insurers to improve risk assessment and pricing through automation, predictive analytics, and big data. However, insurtechs engaged in developing underwriting products should be aware of the compliance risks that may arise when receiving, combining, and evaluating personal and health-related information about individuals and providing reports, scores, and other guidance based on that information. In this article, we discuss two major federal laws governing these activities, the Fair Credit Reporting Act (FCRA) and the Health Insurance Portability and Accountability Act (HIPAA).
Despite its name, FCRA governs the use of consumer information in connection with eligibility decisions not only for credit but also for insurance. A "consumer report" under FCRA includes any communication of information by a "consumer reporting agency" bearing on an individual's creditworthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living that is used, expected to be used, or collected for the purpose of serving as a factor in establishing the individual's eligibility for credit, insurance, employment, or certain other purposes. A "consumer reporting agency" (CRA) is a company that, for compensation or on a cooperative basis, regularly engages in the practice of assembling or evaluating information about individuals for the purpose of furnishing consumer reports to third parties.
In developing data products and services to assist insurance companies in reviewing applications and determining pricing, an insurtech company may meet the definition of a CRA and become subject to FCRA's requirements. By sourcing large data sets of personal information; de-duping, cleaning, and combining them; and producing various outputs through predictive models, an insurtech could be deemed to be "assembling" and "evaluating" information about individuals. If the report, score, or other communication that is produced is used, or is expected to be used, by an insurance company to determine an applicant's eligibility for insurance, the insurtech has created a "consumer report." If the insurtech regularly provides such consumer reports to one or more insurance companies for use in underwriting, it meets the definition of a CRA.
Under FCRA, CRAs are subject to a number of different compliance requirements. CRAs must have policies and procedures for ensuring that they provide consumer reports only for a permissible purpose under FCRA and for ensuring the maximum possible accuracy of the information used in preparing consumer reports. In addition, a CRA is required to develop a significant number of consumer-facing functions, including blocking information identified as resulting from identity theft, placing alerts or freezes on a consumer's file upon request, providing consumers access to the information about them in the CRA's databases, and investigating and resolving disputes consumers may raise over that information. Violations of FCRA are subject to significant civil penalties, and the law provides for a private right of action.
Although FCRA's coverage is broad, there are various options available to insurtech companies to keep their activities from meeting the definition of a CRA. For example, prior regulatory guidance has stated that "information that does not identify a specific consumer" (e.g., group or properly anonymized data) does not constitute a "consumer report," even if used to determine eligibility. In addition, past guidance has stated that a company is not "assembling or evaluating" information, and is therefore not a CRA, where its activity is limited to selling a software package that enables the purchaser to produce a score or other analysis by itself.
Insurtech companies active in the underwriting space should carefully consider whether to comply with FCRA as a CRA, or to develop products and services that will not bring them within the scope of the law.
Health insurance plans may use individually identifiable health information, known under HIPAA as "protected health information" (PHI), for underwriting purposes, with the exception of genetic information, and they may engage insurtech companies to assist with that underwriting. Under those circumstances, the insurtech companies are themselves subject to HIPAA as "business associates" of the health insurance plans. As part of the underwriting process, the health insurance plans or the insurtech companies assisting them may also incorporate social determinants of health, such as purchasing habits and whether the individual files income taxes and votes.
Importantly, however, nondiscrimination rules prevent health insurance plans from using health status when setting premiums for a group health plan: individuals in the group cannot be charged different amounts based on their health status. In the individual market, the Affordable Care Act prohibits health insurance carriers from considering any factors other than location, age, tobacco use, plan category, and dependent coverage in setting premiums.
Insurtech companies assisting these health insurance plans may obtain the right from the plans to deidentify the PHI and to use the deidentified data for their own purposes. HIPAA provides two methods of deidentification: (1) the safe harbor method, under which 18 categories of identifiers, such as name, address, and date of birth, are removed and the company has no actual knowledge that the remaining information could be used, alone or in combination with other information, to identify an individual who is a subject of the information; and (2) the expert determination method, under which an expert determines that the risk is very small that the information could be used, alone or in combination with other reasonably available information, by an anticipated recipient to identify an individual who is a subject of the information. In addition, insurtech companies should confirm that the resulting deidentified data sets also satisfy the deidentification requirements of the California Consumer Privacy Act and other evolving state privacy laws. Insurtech companies can use this deidentified data to improve their automation and predictive modeling.
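For engineers implementing the safe harbor method, the mechanics can be sketched in code. This is a minimal illustration, not legal advice: the field names and record layout below are hypothetical, and a production pipeline must cover all 18 identifier categories (phone and fax numbers, email addresses, Social Security and medical record numbers, device identifiers, IP addresses, biometric identifiers, and so on) as well as the "no actual knowledge" review that the rule requires.

```python
# Sketch of HIPAA Safe Harbor deidentification (45 CFR 164.514(b)(2)).
# Field names are hypothetical; only a subset of the 18 identifier
# categories is shown.
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "ip_address", "device_id",
}

def deidentify(record: dict) -> dict:
    """Strip direct identifiers and generalize quasi-identifiers."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Dates tied to the individual: Safe Harbor permits retaining the year.
    if "birth_date" in out:
        out["birth_year"] = out.pop("birth_date")[:4]

    # Ages over 89 must be aggregated into a single "90 or older" category.
    if out.get("age", 0) > 89:
        out["age"] = "90+"

    # ZIP codes: keep only the first three digits; sparse ZIP3 areas
    # (population 20,000 or fewer) must be recoded to "000" -- the
    # population lookup is omitted here.
    if "zip" in out:
        out["zip"] = out["zip"][:3]

    return out

record = {
    "name": "Jane Doe", "ssn": "123-45-6789", "birth_date": "1950-06-01",
    "age": 92, "zip": "02138", "diagnosis_code": "E11.9",
}
print(deidentify(record))
# {'age': '90+', 'zip': '021', 'diagnosis_code': 'E11.9', 'birth_year': '1950'}
```

Note that even a correct field-removal pass does not by itself complete the safe harbor: the company must also lack actual knowledge that the remaining data could identify an individual, which is a judgment outside the code.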
* * * * *
Insurtech companies supporting insurance underwriting need to remain mindful of both FCRA and HIPAA compliance. For more information on FCRA and HIPAA and their impact on the insurtech industry, and for assistance with establishing FCRA and HIPAA compliance programs, please contact any of the authors.