Developments in Law and Policy from Venable's eCommerce, Privacy, and Cybersecurity Group
In this issue, we highlight the Senate Commerce Committee's Paper Hearing on Big Data and the Coronavirus and also explore letters written by Senators related to video conferencing. We examine the Consumer Financial Protection Bureau (CFPB) Taskforce's request for information to assist the Taskforce on Federal Consumer Financial Law and the Federal Trade Commission's announcement of its Data Portability Workshop and its request for comment. In the states, we discuss Washington's Facial Recognition Bill and State Attorneys General urging the CFPB to withdraw its Credit Reporting Enforcement Statement. Across the pond, we address the European Data Protection Supervisor's March 2020 Newsletter.
Heard on the Hill
Senate Commerce Committee Holds Paper Hearing on Big Data and the Coronavirus
The Senate Committee on Commerce, Science, and Transportation (Committee) held a paper hearing entitled "Enlisting Big Data in the Fight Against Coronavirus." According to the Committee's website, the Committee developed a "paper hearings" process "to continue investigative and oversight responsibilities while adhering to public health guidelines during the coronavirus pandemic." Procedurally, the Committee website explains that, at the start of the paper hearing, the Committee posts the Chairman's and Ranking Member's opening statements and witness testimony. Member questions are then sent to witnesses by close of business on the day of the paper hearing, and witnesses have 96 business hours to answer member questions. The Committee posts the questions and witness responses on its website after they are received and will provide an official transcript for the record. Additionally, all paper hearing documents are made available online at the hearing website.
Following the procedure described above, opening statements by Chairman Roger Wicker (R-MS) and Ranking Member Maria Cantwell (D-WA) were published on April 9, 2020, and the paper hearing concluded on April 15, 2020, with the publication of witnesses' written responses, spanning 100 pages, to 362 questions from 24 Committee Members. Witnesses included individuals from various trade associations, think tanks, a company that sells "smart" devices, and an academic institution.
Among other data privacy topics, participants addressed the following: (1) development of federal privacy legislation, for which each witness indicated support; (2) specific privacy law proposals; (3) Privacy for America's new paradigm for privacy legislation; (4) the Health Insurance Portability and Accountability Act (HIPAA); (5) accountability for contact tracing apps implemented to combat COVID-19; (6) government entities' data use; (7) balancing privacy and COVID-19 tracking; (8) anonymized, aggregated, and de-identified data; (9) companies' data practices; (10) online advertising; (11) data purpose limitations; (12) how data affects underrepresented communities; (13) data security; (14) specific data controls; and (15) children's privacy.
During opening statements, Chairman Wicker stated that "big data" may help efforts to slow the spread and mitigate the impact of COVID-19. He added that location data is particularly sensitive and noted that the conversation around using data to combat COVID-19 reflects the need for a federal data privacy law. Ranking Member Cantwell emphasized the importance of ensuring consumer privacy protections while expressing support for using data to aid with public health messaging and research.
During questioning, Committee Members and witnesses expressed support for federal data privacy legislation. Witnesses discussed types of data, privacy practices, and accountability measures that they believe should be considered in the government's efforts to use data to fight the COVID-19 pandemic. Committee Members and witnesses also noted new issues that have arisen during the pandemic, such as student privacy in distance learning and relaxed HIPAA requirements. Several Committee Members and witnesses emphasized that effective measures against the coronavirus are not incompatible with privacy.
On May 7, 2020, Chairman Wicker and Senators John Thune (R-SD), Deb Fischer (R-NE), Jerry Moran (R-KS), and Marsha Blackburn (R-TN) announced the formal introduction of the COVID-19 Consumer Data Protection Act. The bill would direct subject companies to inform consumers how "their data" is handled, retained, and shared and, among other provisions, would require that the subject companies: (1) enable consumers to opt out of the collection, processing, and sharing of health and location data; (2) provide "transparency reports" to the public describing their data practices with respect to COVID-19 mitigation; and (3) delete or "de-identify" personally identifiable information ("PII") after such data "is no longer being used" for COVID-19 mitigation efforts. The press release also stated that the bill will establish data security and minimization standards for PII collected by subject companies as well as definitions for "aggregated" and "de-identified" data.
Senators Urge Company and FTC to Ensure That Privacy is Protected on Video Conferencing Apps
In recent weeks, Senate Committee on Commerce, Science, and Transportation (Senate Commerce Committee) Members Richard Blumenthal (D-CT) and Ed Markey (D-MA) have voiced concern about video conferencing apps' data practices as a result of individuals' increased use of such services during the COVID-19 pandemic.
On April 8, 2020, Sen. Markey sent a letter to the five current Federal Trade Commission (FTC) Commissioners urging them to develop privacy and data security guidelines for providers of video conferencing services. Citing individuals' growing use of video conferencing services, Sen. Markey requested that potential FTC video conferencing guidelines include: (1) limits on data collection; (2) guidance on authentication and data security; and (3) guidance on privacy policies for users of video conferencing apps, among other topics. Sen. Markey also encouraged the FTC to develop best practices for individuals using video conferencing apps and to investigate various conferencing apps' privacy and security practices.
Around the Agencies
CFPB Announces Request for Information on Consumer Financial Services Markets
On March 27, 2020, the Consumer Financial Protection Bureau (CFPB) issued a request for information (RFI) seeking comments from the public on which areas of the consumer financial services markets function well (i.e., areas that are fair, transparent, and competitive) and which areas may benefit from regulatory changes to facilitate competition and increase consumer welfare. The RFI states that comments will assist the Taskforce on Federal Consumer Financial Law (Taskforce), which is an independent body within the CFPB reporting to the Director of the CFPB. Comments are due sixty (60) days after the date of the RFI's publication in the Federal Register.
According to the RFI, the CFPB particularly seeks comments regarding certain markets, such as automobile financing, credit cards, consumer reporting, debt settlement, electronic payments, money transfers, and student loans. Among other topics for comment, the RFI lists twenty-three (23) multi-part questions. The questions are broken into five categories: (1) expanding access to financial services and obstacles to inclusion; (2) protection and use of consumer data; (3) regulations the CFPB writes and enforces; (4) costs and benefits of overlapping enforcement authorities between federal and state agencies; and (5) improving consumer protection. Below are samples of question topics from each category.
Expanding Access to Financial Services
- What role should the CFPB play in regulating the use of "alternative" data for credit underwriting?
Protection and Use of Consumer Data
- Are the protections in the Gramm-Leach-Bliley Act and its implementing Regulation P, and the Fair Credit Reporting Act (FCRA) and its implementing Regulation V, sufficient to protect consumer personal information?
- Are the provisions in the FCRA and its implementing Regulation V sufficiently designed to ensure the accuracy of consumer report data?
- Is it desirable to have federal legislation, regulation or guidance, or uniform national standards on data breaches and related obligations?
CFPB Regulations
- How can the CFPB or Congress balance facilitating innovation by financial technology companies with consumer protection?
- Are there places where regulations have not kept up with changes in consumer financial services markets?
Federal and State Coordination
- Should there be changes to the current model of shared jurisdiction over financial institutions?
- What are the costs and benefits to consumers and financial institutions of overlapping enforcement powers under state and federal law?
Improving Consumer Protection
- Which markets for consumer financial products or services are functioning well (i.e., fair, transparent, and competitive)?
- How should the CFPB determine appropriate remedies for legal violations, considering the need for correction and deterrence without creating adverse or unintended effects?
Instructions for submitting comments, including supporting materials, appear in the CFPB's RFI. The RFI provides that all submitted comments, along with supporting materials, will be posted without change and will be available for public inspection.
FTC Schedules Data Portability Workshop; Issues Request for Comment
The Federal Trade Commission (FTC) is set to analyze the potential benefits and challenges associated with data portability by hosting a public workshop on September 22, 2020. The workshop seeks to bring together industry representatives, economists, consumer advocates, regulators, and other stakeholders for a public discussion on issues raised by data portability.
In its announcement of the workshop, the FTC noted that data portability gives consumers more control over data related to them and allows consumers to move data from one service to another or to themselves. According to the FTC, data portability may also promote competition by allowing new entrants to the market to access data that would otherwise be unavailable to them.
Noting that the topic of data portability has "gained interest" with the implementation of the European Union's General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), the FTC seeks to address various questions related to data portability during the workshop. Specifically, the workshop will address the potential benefits to consumers and competition of data portability; the potential risks to consumer privacy and how those risks might be mitigated; the potential impact of mandatory data access or data sharing on companies' incentives to innovate; how to best ensure the security of personal data that is being transmitted from one business to another; and the merits and challenges of interoperability and who should be responsible for ensuring interoperability.
In anticipation of the workshop, the FTC is seeking comment on a range of issues, including:
- How companies are currently implementing data portability
- The benefits and costs of data portability
- The extent to which data portability has increased or decreased competition
- If data portability works better in some contexts than others, as well as if data portability works better for particular types of information
- Who should be responsible for the security of personal data in transit between businesses
- How companies verify the identities of requesting consumers
- The lessons learned and best practices from the implementation of data portability requirements established in the GDPR and the CCPA
Comments may be submitted until August 21, 2020, to DataPortability@ftc.gov or by mail.
In the States
Washington State Enacts Facial Recognition Bill
On March 31, 2020, Governor Jay Inslee signed a bill into law that will govern state and local government agencies' use of "facial recognition services." Gov. Inslee vetoed one section of the bill, which would have mandated the establishment of a task force to consider issues related to government use of facial recognition technology and provide recommendations to address these issues. Gov. Inslee stated that this task force, while important, was not funded in the state's budget. He approved the rest of the bill.
The new law imposes reporting and testing requirements on any state or local government agencies that use "facial recognition services," defined as "technology that analyzes facial features and is used by a state or local government agency for the identification, verification, or persistent tracking of individuals in still or video images." Under the law, any Washington state or local government agency planning to use facial recognition services must file a notice of intent with a legislative authority and follow up with an accountability report, including the name and general capabilities and limitations of the service to be used, the data used by the service, the purpose for using the service, a data management plan, the agency's testing procedures, error rates, potential impacts on civil rights and liberties, and the agency's procedures for receiving feedback. The law also sets forth requirements for a comment period and other procedures that an agency must complete prior to finalizing its accountability report.
The law also requires any agency that will use a facial recognition service to make decisions that will have a legal or similarly significant effect to "ensure those decisions are subject to a meaningful human review[,]" to test the service under "operational conditions[,]" and to "take reasonable steps to ensure best quality results" by following guidance from the service's developer.
Agencies must require facial recognition service providers to allow the agency to test the accuracy and fairness of the service via an application programming interface (API) or similar technology. Agencies must also maintain records regarding their use of the service and provide periodic training for any individual who operates a facial recognition service or processes data obtained from it.
For criminal defendants, agencies must disclose the fact that facial recognition services were used "in a timely manner prior to trial." Similarly, judges who issue warrants for the use of facial recognition services must make a report to the administrator for the courts regarding the application for the warrant and the judge's decision regarding the warrant.
Among other restrictions, the law prohibits the use of facial recognition services absent a warrant, exigent circumstances, or a court order issued in limited circumstances. The law also prohibits use of facial recognition services based on an individual's religion, race, or other protected characteristics or to create a record of an individual's exercise of their First Amendment rights.
The law includes limited exceptions, including use of facial recognition services under a federal mandate or in association with a federal agency to verify traveler identities at airports and seaports.
State AGs Urge CFPB to Reverse Statement of Intent to Ease FCRA Enforcement During COVID-19
Attorneys General from 21 states, the District of Columbia, and Puerto Rico (State AGs) have issued a letter to the Director of the CFPB requesting that the CFPB reverse its recently stated intent to ease enforcement of certain aspects of FCRA. Specifically, the State AGs oppose the CFPB's April 1, 2020 Statement on Supervisory and Enforcement Practices Regarding the Fair Credit Reporting Act and Regulation V in Light of the CARES Act (Statement), in which the CFPB states that it does not intend to bring enforcement actions (1) against entities providing information to CRAs for failure to comply with recent amendments to FCRA (passed under the Coronavirus Aid, Relief, and Economic Security Act (CARES Act)) that require these companies to report as current credit obligations for which payment accommodations were made to consumers affected by COVID-19; and (2) against CRAs that take longer than the required 30 days to investigate consumer credit disputes. In explaining its decision to limit enforcement of FCRA amendments, the CFPB states that it "expects furnishers to comply with the CARES Act" and "supports furnishers' voluntary efforts to provide payment relief." With respect to its decision to limit enforcement of FCRA's timely investigation requirement, the CFPB cites the COVID-19 crisis and concerns that "CRAs and furnishers may experience significant reductions in staff, difficulty intaking disputes, or lack of access to necessary information" to investigate and resolve disputes.
In opposing the CFPB's stated intent to limit enforcement of FCRA amendments under the CARES Act, the State AGs express concern that consumers will be "discourage[d] from taking advantage of the forbearances and other accommodations that lenders are offering" in response to economic hardships resulting from the COVID-19 pandemic. The State AGs also opine that the CFPB's enforcement position means that "[l]enders that work to comply with the CARES Act will be at a competitive disadvantage to those that flout its furnishing requirements, harming both honest businesses and consumers." The State AGs further take exception to the CFPB's easing of enforcement of FCRA's timely credit dispute investigation requirements, stating "[a]t a time of significant economic uncertainty, it is incumbent upon both CFPB and the CRAs to be even more vigilant in ensuring that American consumers are protected against false and incorrect information on their credit reports[.]" As additional impetus for robust FCRA enforcement, the State AGs cite "the ubiquity of COVID-19 (and stimulus) scams that are quickly increasing in prevalence," noting "thousands of complaints [to federal bodies] relating to phishing and other scams designed to gather sensitive personal and financial information."
Ultimately, the State AGs note that the COVID-19 pandemic requires more robust FCRA enforcement against CRAs, not less, stating "[n]ow is not the time to let them fall asleep at the switch."
European Data Protection Supervisor Issues March 2020 Newsletter
On March 25, 2020, the European Data Protection Supervisor (EDPS), the independent data protection authority for the European Union (EU), released its monthly newsletter (Newsletter). Each newsletter provides updates on EDPS activities over the previous month. The March Newsletter covered several topics, including (1) the EDPS 2019 Annual Report; (2) a data protection guide to using photo booths; (3) Artificial Intelligence (AI) and Facial Recognition; and (4) an EDPS opinion on the opening of negotiations for a new partnership with the United Kingdom (UK).
- Annual Report. The EDPS 2019 Annual Report (Report) was issued on March 18, 2020. The Report provides insight into EDPS activities over the previous year. In particular, the EDPS discussed its efforts to ensure new EU rules on data protection are put into practice, particularly within EU institutions themselves, by conducting training and issuing guidelines.
- Photo booth Guidance. The EDPS issued guidance on the use of photo booths (Guidance). The EDPS explained that photo booths process personal information when they take images because the images can be used to identify individuals. The Guidance is intended to help data controllers identify the rules with which they need to comply when using photo booths.
- Workshop on AI and Facial Recognition. On February 13, 2020, the EDPS organized a workshop (Workshop) to discuss how to improve machine learning algorithms in AI and how those algorithms interact with Facial Recognition technologies. The Workshop brought together researchers, practitioners, regulators, and others to share their experiences and evaluate possible legal, political, and regulatory responses. The discussions from the Workshop will help the EDPS plan future activities related to AI and Facial Recognition.
- Opinion on Partnership with the UK. On February 24, 2020, the EDPS issued an opinion on the opening of negotiations for a new partnership with the UK (Opinion) pursuant to an EU Commission recommendation regarding the same. According to the Opinion, the EDPS supports developing a partnership with the UK to ensure that EU data protection rules are respected and personal data is protected once the Brexit transition period ends in December 2020. In the Opinion, the EDPS outlined several conditions that would need to be met for a partnership with the UK, including (1) a shared commitment to respect fundamental rights; (2) the prioritization of matters that can be handled by public authorities rather than law enforcement; and (3) an evaluation of the onward transfer of personal data.