June 2021

The Download

Developments in Law and Policy from Venable's eCommerce, Privacy, and Cybersecurity Group

In this issue, we highlight recent developments in state privacy legislation and the patchwork of requirements emerging across the country. We cover the latest on biometric laws, examining recent legislation and enforcement. In international developments, we discuss current efforts to provide legal mechanisms for cross-border data transfers. We conclude with a look at the renewed interest in expanding laws related to children's and teens' privacy.

Venable’s eCommerce, Privacy, and Cybersecurity Group Named Chambers USA’s Privacy and Data Security Law Firm of the Year

Venable LLP is pleased to announce that it has been named Chambers USA’s Privacy and Data Security Law Firm of the Year. This award recognizes a law firm’s achievements over the past 12 months, including outstanding work, impressive strategic growth, and excellence in client service. Venable was honored during a virtual ceremony on May 27, 2021. This is the second time the group has received the award.

As States Consider and Pass Privacy Legislation, No Single Model Emerges

A growing number of states have recently considered or passed privacy legislation. While some common themes have emerged, many of these bills contemplate different, and sometimes conflicting, methods of providing rights and regulating entities. The state bills take varying approaches to consumer rights; to the types of permission required to collect, process, and share personal information; and to enforcement and regulatory mechanisms, to name a few of the more common points of divergence. Below, we discuss differences across existing state privacy laws, key themes in the state privacy bills considered by legislatures in 2021, and the prospects for passage of federal privacy legislation in light of the complex privacy requirements developing in the states.

I. Existing State Privacy Laws

In 2018, California became the first state to enact a general privacy law, known as the California Consumer Privacy Act (CCPA). The following year, two more states followed in California's footsteps, with Maine and Nevada each passing privacy legislation. In 2020, several states introduced privacy bills, but those efforts were largely sidelined by the COVID-19 pandemic. The exception to this general trend was in California, where voters approved Proposition 24 during the November 2020 general election. Proposition 24, also known as the California Privacy Rights Act (CPRA), materially amends the CCPA and establishes a new state agency to regulate and bring enforcement actions related to data privacy in California. Now, in 2021, at least 23 states have refocused their attention on privacy legislation. Virginia passed a new omnibus data privacy law, the Virginia Consumer Data Protection Act (CDPA), on March 2, 2021, while Nevada passed SB 260, which amends the state's existing privacy statute, on June 2, 2021.

Although certain high-level similarities are present in existing state privacy laws, specific obligations under those laws differ significantly. For example, a general similarity among the CCPA, CPRA, and CDPA is that each law takes a rights-based approach to privacy legislation, meaning that the laws provide state residents with the ability to exercise certain rights with respect to the data that "businesses" or "controllers" maintain about them. Among other rights, the CCPA, CPRA, and CDPA enable consumers to access, delete, and opt out of certain transfers of "personal information" or "personal data." However, key details of each of those rights, as well as many other facets of the laws, are inconsistent.

For instance, the CCPA provides Californians with a right to opt out of sales of personal information, and “sale” is defined to mean a transfer of personal information for monetary or other valuable consideration. The CPRA extends the CCPA’s opt-out right to also apply to “sharing” personal information, which is defined as any transfer of personal information for cross-context behavioral advertising purposes. The CDPA gives Virginians the right to opt out of sales of personal data, “targeted advertising,” and “profiling,” as defined. However, the CDPA’s definition of “sale” differs from the definition of the same term under California law. Virginia defines “sale” to mean transfers of personal data for monetary consideration only, and Nevada’s bill to amend the state’s existing privacy law brings its definition of “sale” more in line with Virginia’s definition. The inconsistency in opt-out rights across California, Virginia, and Nevada is just one example of a variation across existing state privacy laws.

II. State Privacy Bills Considered in 2021

Differences across state privacy laws could become even more pronounced if additional states pass privacy legislation. For example, the privacy laws in California and Virginia generally take an opt-out approach to transfers of data associated with consumers. Similarly, Colorado SB 21-190, which has passed both chambers of Colorado’s legislature and now awaits the governor’s signature, would give consumers the right to opt out of the sale of personal data, targeted advertising, and profiling. However, bills like New York SB 6701 would have required opt-in consent for any processing of personal data associated with a New Yorker. New York is not the only state that considered an opt-in bill this year. Hawaii nearly enacted a bill that would have required opt-in consent for disclosures of “internet browser information” and “geolocation information,” as defined. Additionally, the Oklahoma legislature considered a bill that would have required opt-in consent for any collection of personal information from a consumer.

Another key difference among the many state privacy proposals considered in 2021 centered on enforcement. For example, a number of states, including New York, Florida, Washington, and New Jersey, considered legislation that would provide a broad private right of action for any violation. Such a broad private right of action differs from the enforcement approach taken in existing state privacy laws. Under the CCPA and CPRA, state agencies are empowered to bring enforcement actions for violations of the laws' provisions, and only a limited private right of action related to certain data breaches is available. Similarly, the CDPA and Colorado's bill vest enforcement responsibility with the state attorney general alone. In some states, however, the issue of enforcement became a sticking point that ultimately caused privacy bills to fail. Washington, for example, failed to pass SB 5062 for the third year in a row because of disagreement between the state House and Senate over the appropriate enforcement approach.

Myriad other variations exist among the state bills as well. For instance, Texas HB 3741 proposed to sort data elements into certain categories ("category one," "category two," and "category three") and would have placed different requirements on entities depending on which category of information is being processed. New York SB 6701 would have introduced the concept of fiduciary duties into privacy regulation by requiring controllers, processors, and third parties that process personal data associated with New Yorkers to take on duties of care and loyalty to consumers. Other bills, like Florida HB 969, would have added new rules regarding data retention to the mix. And bills in Montana and New Hampshire, if enacted, would have placed limits on the transfer and processing of location information absent opt-in consent. These are only a few examples of the approaches to privacy legislation that state legislatures proposed during their 2021 sessions.

III. Prospects for Passage of Federal Privacy Legislation

The specific requirements of existing state privacy laws and pending bills are complex and constantly evolving, and as a result, the business community has expressed concern that a patchwork of privacy requirements is developing in the states. Without guidance at the federal level, states have explored different frameworks for privacy rights, including different approaches to enforcement of those rights. The resulting uncertainty surrounding obligations has increased pressure on federal legislators to pass a privacy law preempting state legislation. In response to the growing call for preemptive federal privacy legislation, a group of trade associations, companies, and organizations came together to form Privacy for America, a coalition that advocates for a federal privacy framework that would clearly define prohibited data practices that make personal data vulnerable to breach or misuse, while preserving the benefits of responsible data use. With Privacy for America setting the stage by outlining a paradigmatic federal privacy framework, and with the states putting pressure on legislators in Washington, DC by considering and passing divergent privacy laws, members of the business community and consumer advocates alike believe the time is ripe for Congress to act.

Biometric Laws Take Center Stage

In 2008, Illinois became the first state to pass legislation regulating the collection, use, and disclosure of biometric data when it enacted the Illinois Biometric Information Privacy Act (BIPA). Texas and Washington followed by passing their own biometric laws in 2009 and 2017, respectively. The BIPA has garnered the most attention because of its private right of action, which has led to hundreds of lawsuits, including at least one class action that resulted in a $650 million settlement with a large social media platform; the attorneys general of Texas and Washington continue to monitor compliance with their states' laws as well. While these three state laws set the current standard for biometric information in the United States, below we discuss several developments in the area, including newly proposed state laws, federal enforcement, and international considerations in this rapidly evolving landscape.

In early 2021, New York and Maryland introduced legislation that would join Illinois, Texas, and Washington in regulating the collection, use, and disclosure of biometric data through specific biometric data laws. While neither proposal has been enacted, both bills are based on BIPA. The bills would place requirements on the collection, use, and disclosure of "biometric identifiers," defined as data such as face scans, retina scans, handprints, and other, similar data used to authenticate or identify an individual. Both bills would require a private entity to develop a publicly available policy regarding the collection, use, and retention of biometric identifiers. Such a policy would be required to establish a destruction timeline for covered information: the earlier of the time when the biometric identifier is no longer needed for the purpose for which it was collected, or three years after the individual's last interaction with the private entity. Both bills would also permit disclosure of a biometric identifier only after obtaining the written authorization of the individual or an authorized representative, and both would bar the sale of covered information for profit. Importantly, both bills would create private rights of action similar to the one established in Illinois, allowing any individual aggrieved by a violation to bring a civil action for the greater of $1,000 or actual damages, or $5,000 for reckless violations.

Other states have taken a different approach to the regulation of biometric information. South Carolina proposed the South Carolina Biometric Data Privacy Act in 2021. The bill, which is still under consideration, would broadly define covered biometric information to include not only the data covered by the BIPA but also data such as "keystroke patterns or rhythms, gait patterns or rhythms, and sleep, health, exercise data, or geolocation data that contain identifying information." The bill would also require consent for the collection, use, and disclosure of covered information, and, instead of a mandated three-year retention period like BIPA's, it would create a right of deletion. Additionally, the South Carolina bill would allow the sale of covered information subject to an opt-out right (although sales of covered information about individuals known to be under 16 years of age would be prohibited). The bill would include a private right of action similar to BIPA's, except that a reckless violation would be subject to a $10,000 penalty instead of $5,000.

In addition to these biometric-specific proposals, two states passed new comprehensive privacy laws that regulate biometric information. California passed the California Privacy Rights Act (CPRA) via ballot initiative in 2020, and Virginia passed the Consumer Data Protection Act (CDPA) in early 2021. Both laws take effect on January 1, 2023 and create new requirements for the collection, use, and disclosure of biometric information. The CPRA defines the "processing of biometric information for the purpose of uniquely identifying a consumer" as "sensitive personal information." Such information is subject not only to the CPRA's general rights of notice, access, deletion, and correction, but also to a consumer's request to limit its use and disclosure to a subset of authorized purposes, none of which would permit most advertising or marketing uses. The CDPA, on the other hand, requires consumers to opt in to the "processing" of sensitive data, which includes the collection, use, maintenance, and disclosure of biometric information. This means that businesses covered by the CDPA will be required to obtain consent before processing biometric data starting in 2023. Importantly, neither the CPRA nor the CDPA creates a general private right of action; both rely on government entities to enforce the laws.

On the federal front, the Federal Trade Commission (FTC) announced a settlement with Everalbum, Inc., the operator of an online photo album service, related to that company's use of biometric information. The FTC alleged that the company misled users by implying that facial recognition would not be applied to photos stored in the app without the user's affirmative permission, when the company had in fact automatically turned on the feature for all users except those in three states (Illinois, Texas, and Washington) and in the European Union. The FTC alleged that this activity violated Section 5 of the FTC Act, which prohibits unfair or deceptive practices. The FTC's settlement requires the company to obtain affirmative consent to the use of facial recognition technology on a user's photos. This represents the FTC's first settlement related to facial recognition and shows that the FTC will continue to exercise its authority to prevent alleged unfair or deceptive practices, even as the technology underlying those practices evolves.

Finally, on April 21, 2021, the European Commission published a proposed regulation to harmonize the regulation of artificial intelligence (AI) across EU member states. While the regulation would cover more than just biometric information, it includes proposed provisions related to the use of biometric identifiers. The proposed regulation would prohibit the use of "real-time" remote biometric identification systems, such as live facial recognition through closed-circuit television systems, in publicly accessible spaces for the purpose of law enforcement. The proposal includes various exceptions for law enforcement's use of real-time biometric identification systems. These proposed rules would supplement the requirements for the collection, use, and disclosure of biometric information already embodied in the General Data Protection Regulation.

Given the continued development of state, federal, and international regulations and enforcement actions related to biometric data, organizations should continue to map, assess, and monitor their activity related to such data.

EU Cross-Border Data Transfers After Schrems II

In light of last year's decision by the Court of Justice of the European Union (CJEU) in Data Protection Commissioner v. Facebook Ireland and Schrems (Schrems II), companies have found themselves with limited legal options on which to rely for the transfer of personal data to the United States from the European Economic Area (EEA). With the publication of new standard contractual clauses (SCCs) and continued efforts to replace the EU-U.S. Privacy Shield Framework (Privacy Shield), there may be relief on the horizon.

I. Schrems II Background and Impact

The General Data Protection Regulation (GDPR) restricts the transfer of EU personal data outside of the EEA unless the rights of individuals with respect to their personal data are protected by the laws of the receiving country. Companies are permitted to make cross-border data transfers only if certain mechanisms are in place. These mechanisms include SCCs, binding corporate rules, and, prior to Schrems II, the Privacy Shield, which was limited to transfers to the United States.

In Schrems II, the CJEU invalidated the Privacy Shield as a legal mechanism for the transfer of personal data out of the EEA. At the time, more than 5,000 companies, particularly small and medium-sized businesses, relied on the Privacy Shield to transfer personal data from the EEA to the U.S. In the Schrems II decision, the CJEU confirmed that companies may rely on SCCs for the transfer of data outside of the EEA. However, the CJEU cautioned that the validity of transfers using SCCs turns on whether the transferred data can be afforded a level of protection “essentially equivalent” to that guaranteed by the GDPR. As a result, data exporters and importers may be required to conduct transfer impact assessments and implement additional safeguards in order to transfer personal data from the EEA.

II. Privacy Shield Replacement Updates

To address the gap created by the Schrems II decision, the U.S. Department of Commerce and the European Commission are negotiating a new agreement to replace the Privacy Shield. EU Commissioner for Justice Didier Reynders and U.S. Secretary of Commerce Gina Raimondo issued a joint statement that the U.S. and the EU will "intensify negotiations on an enhanced EU-U.S. Privacy Shield" that complies with Schrems II. In addition, President Biden is reportedly hoping to secure a political agreement with European Commission President Ursula von der Leyen during the EU-U.S. Summit in Brussels in June 2021 to "lay the groundwork for a new transatlantic data transfer deal." The process of developing a Privacy Shield replacement will take significant time, and, in the meantime, companies are increasingly relying on SCCs for the legal transfer of personal data from the EEA to the U.S.

III. Use of SCCs Post-Schrems II and Publication of New SCCs

Absent a Privacy Shield replacement, companies wishing to transfer data between the EEA and the U.S. may still use SCCs. The SCCs are template data transfer agreements that allow data exporters to transfer personal data from the EEA to countries outside of the EEA that have not received an adequacy decision. Exporters using SCCs must evaluate the legal landscape of the recipient jurisdiction and take “supplementary measures” necessary to ensure that data is protected at the level required under the GDPR. The Schrems II decision provides little guidance on what exactly this assessment should entail. The European Data Protection Board (EDPB) issued draft recommendations in November 2020 that outline criteria to be considered when evaluating international transfers of personal data. The recommendations are expected to be finalized later this month, and supervisory authorities in EEA countries will develop their own guidance, in coordination with the EDPB, to ensure consistency in the application of EU data protection law.

On June 4, 2021, the European Commission (EC) published the final version of the new SCCs, which replace the previous set of SCCs that had been in force for more than a decade. The new SCCs, which were drafted to align with the GDPR and address the requirements of Schrems II, "combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains." Data importers and exporters may select the modules applicable to their particular data transfer scenario. The new SCCs cover four types of transfers: controller-to-controller, controller-to-processor, processor-to-controller, and processor-to-subprocessor. The new SCCs also build in provisions specifically tailored to address compliance with Schrems II. For example, the new SCCs impose an obligation on data exporters and importers to conduct an assessment of whether the data importer in the third country can guarantee an adequate level of protection for the transferred personal data. The EC's decision adopting the new SCCs will enter into force 20 days after its publication in the Official Journal of the European Union and will go into effect three months after that date. However, organizations that currently rely on the old SCCs for the cross-border transfer of data from the EU will have an additional 15-month grace period to transition to the new SCCs.

The UK Information Commissioner’s Office (ICO) is also developing new SCCs to facilitate transfers of personal data outside of the UK. The UK left the EU in 2020 and is no longer subject to the GDPR. The UK has its own data protection law that largely mirrors the GDPR, and the ICO permits companies to rely on the current EU SCCs for the international transfer of personal data. However, the new EU SCCs are not valid for use in the UK. Instead, the ICO intends to publish draft SCCs for data transfers from the UK for public consultation later this summer.

According to a report by the International Association of Privacy Professionals and FTI Consulting, 88% of companies that transferred data outside of the EEA in 2020 did so on the basis of the previous SCCs. The changes to the SCCs will have a significant impact on these businesses and may present technical and organizational challenges. In light of these developments, businesses are proactively reviewing their international data flows and keeping abreast of changes that may impact their cross-border data transfers.

Children’s Privacy Regulation: Sea of Cs (COPPA, Congress, CCPA/CPRA, CDPA)

Following a year lived largely online, concerns related to children’s online privacy have become even more salient in the eyes of certain lawmakers and regulators. While children’s privacy has long been a focus of lawmakers and enforcement authorities alike, actions at both the federal and state levels over the past year show signs of a renewed interest in expanding existing laws related to children and regulating teen privacy. Children’s privacy issues also continue to garner bipartisan interest and support within Congress, state legislatures, and the Federal Trade Commission (FTC or the Commission).

In Congress, lawmakers have emphasized children's and teens' privacy by introducing numerous pieces of legislation that would expand the existing Children's Online Privacy Protection Act (COPPA), including by amending the law to apply to data collection from teens, and by holding hearings related to children's activities online. For instance, in May 2021, Senators Markey (D-MA) and Cassidy (R-LA) introduced the Children and Teens' Online Privacy Protection Act (CTOPPA), a bipartisan legislative proposal to update and significantly broaden COPPA, including by requiring consent to collect personal information from users aged 13-15. Although a version of this bill has been introduced in previous sessions, Senator Markey highlighted that children and teens spent more time online over the past year and that "[i]t's time for Congress to" act. Lawmakers may also be influenced by the UK's Age Appropriate Design Code (AADC), which will be enforced starting in September 2021 following a 12-month implementation period. The AADC applies to online services that are "likely to be accessed by children" under 18, a significant departure both from previous European children's privacy laws, which focus on children and younger teens under 16, and from COPPA, which applies only to children under 13.

Furthermore, Congress held multiple hearings over the past few months to examine children’s and teens’ privacy in the context of increased screen time. Most recently, on May 18, 2021, the Senate Committee on Commerce, Science, & Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security convened a hearing, “Protecting Kids Online: Internet Privacy and Manipulative Marketing,” in which senators and witnesses discussed their concerns regarding data collection from and advertising to children, as well as the possibility of extending privacy protections to teens. The hearing included testimony from academics, who expressed concern that COPPA does not adequately address targeted advertising to children, and a member of the UK’s House of Lords who helped create the AADC.

Reflecting the significance of children's and teens' privacy, state legislatures have also proposed or enacted privacy legislation that addresses children's privacy. The recently enacted Virginia Consumer Data Protection Act (CDPA) establishes requirements for "sensitive personal data," including personal data collected from "a known child" under 13. The CDPA requires that "sensitive data concerning a known child" be processed in accordance with COPPA and requires that "controllers" conduct data protection assessments for all "sensitive personal data," including data collected from known children. Although not enacted, the proposed Washington Privacy Act included similar obligations with respect to personal information collected from known children under 13. Notably, however, Virginia and Washington declined to follow the examples set by the CCPA and the CPRA, which apply to a broader age range that includes younger teens. Specifically, the CCPA addresses children's privacy by creating rights and obligations specific both to children under 13 and to children between the ages of 13 and 15. The forthcoming CPRA places a higher priority on compliance with these requirements by establishing increased fines for intentional violations of the law that involve the personal information of children younger than 16.

While Congress and state legislatures continue to evaluate and advance issues related to children's privacy, the FTC has remained focused on actively updating and enforcing COPPA. The FTC commenced a review of the COPPA Rule in July 2019, citing rapid changes in technology; that review remains ongoing. More recently, in December 2020, the FTC issued orders to nine video streaming and social media companies seeking information about their information practices. According to the FTC, the purpose of the orders included helping the Commission understand how the companies' data practices affect children and teens. Importantly, all of the above actions occurred under a Republican-controlled Commission. Because many expect the FTC to take a more aggressive role in pursuing alleged privacy violations under Democratic control, further FTC scrutiny of children's online privacy is likely.

On the enforcement side, in mid-2020, the FTC announced two settlements for alleged COPPA violations. The first settlement involved a children's game developer, Miniclip. In its complaint against Miniclip, the FTC alleged that the company misrepresented its participation in a COPPA safe harbor program after the company's participation had lapsed several years earlier. The final settlement with Miniclip did not include monetary penalties, but it prohibits the company from misrepresenting its safe harbor participation and subjects the company to compliance and recordkeeping requirements. In the second settlement, the FTC alleged that HyperBeard, an app developer, allowed third parties to collect persistent identifiers from child-directed apps to serve targeted advertising. The proposed settlement includes a $4 million penalty, with all but $150,000 of that amount suspended. These settlements highlight not only that the FTC is pursuing various theories of liability under COPPA, but also that the FTC continues to actively enforce COPPA even while the COPPA Rule remains under review.

As the legal landscape regarding children’s privacy continues to evolve and shift, companies should stay apprised of new obligations and guidance, and maintain a strong compliance program with respect to personal information pertaining to children.