Developments in Law and Policy from Venable's eCommerce, Privacy, and Cybersecurity Group
In this issue, we highlight the Senate Committee on Commerce, Science, and Transportation's hearing on Federal Trade Commission (FTC) oversight. We examine the Federal Communications Commission's request for comments on Section 230 rulemaking and the FTC's request for comment on proposed Fair Credit Reporting Act changes. We also discuss the Department of Commerce's statement on the EU–U.S. Privacy Shield. In the states, we explore the Massachusetts Attorney General's announcement of a Data Privacy and Security Division and the New York State Department of Financial Services' first enforcement action under its Cybersecurity Regulation. Across the pond, we address the United Kingdom's Information Commissioner's Office statement on the Age Appropriate Design Code. In Brazil, we provide an update on the General Data Protection Law, whose sanction and enforcement provisions become effective August 1, 2021.
Heard on the Hill
Senate Commerce Committee Holds Federal Trade Commission Oversight Hearing
On August 5, 2020, the Senate Committee on Commerce, Science, and Transportation (Committee) convened a hearing on the oversight of the Federal Trade Commission (FTC). The witnesses were the five sitting FTC Commissioners: Chairman Joe Simons and Commissioners Noah Phillips, Christine Wilson, Rohit Chopra, and Rebecca Slaughter. The Committee discussed, among other items: data privacy and security, children's privacy during the ongoing pandemic, Section 230 of the Communications Decency Act (Section 230), and potential federal privacy legislation.
During opening statements, Committee Chairman Roger Wicker (R-MS) praised the FTC's efforts on data privacy, robocalls, and false advertisements, among other efforts. He expressed support for the U.S. having a "uniform" data privacy law. FTC Chairman Simons noted that a federal privacy framework should include civil penalties, APA rulemaking authority to account for constant innovation in the tech space, and enforcement authority over entities typically outside the FTC's reach, such as nonprofits. Commissioner Wilson agreed with Chairman Simons's proposed inclusions for a federal law. Commissioner Phillips added that the current U.S. privacy regime has "gaps," as privacy laws only cover certain types of individuals and information. Commissioner Slaughter echoed her colleagues' calls for a federal data privacy framework, adding that it should not hinder innovation in ways that the EU's General Data Protection Regulation (GDPR) does.
Data privacy and security and potential federal privacy legislation took precedence at the hearing. During questioning, senators expressed concern with the collection and sale of personal data in sensitive locations, such as churches and protests, as well as data collection by foreign apps. Commissioner Slaughter stated that current notice and consent practices for data collection and use do little to improve consumers' understanding of how their data is collected and used. In response to a question from Sen. Shelley Moore Capito (R-WV) regarding children's privacy and the Children's Online Privacy Protection Act (COPPA), Chairman Simons stated that the FTC has released guidance related to children's privacy amidst the pandemic and the surge in children's online activity.
Sens. Amy Klobuchar (D-MN), Jerry Moran (R-KS), Rick Scott (R-FL), and Marsha Blackburn (R-TN) asked whether federal data privacy legislation is necessary and, if so, what features would be beneficial. All five Commissioners stated that some form of federal privacy legislation is needed. International privacy was also a focus of the hearing. Chairman Simons stated that the FTC would continue to enforce EU–U.S. Privacy Shield compliance as it had prior to the inadequacy ruling, adding that the FTC is working with the Department of Commerce to move forward with European counterparts.
Online platforms were also in the spotlight. In response to a question from Committee Chairman Wicker related to President Trump's Executive Order on Preventing Online Censorship, Chairman Simons stated that the FTC does not have jurisdiction over political speech, only commercial speech. Commissioner Chopra added that he is concerned with Section 230 immunity, which grants online platforms protection from civil liability for content posted on such platforms, being extended to behavioral advertising and surveillance.
Around the Agencies and Executive Branch
Federal Communications Commission Seeks Comments on Section 230 Petition for Rulemaking
On August 3, 2020, the Federal Communications Commission (FCC) opened a comment period for responding to the National Telecommunications and Information Administration's (NTIA) Petition for Rulemaking Related to Section 230 of the Communications Act of 1934 (Section 230). NTIA's petition was prompted by President Donald Trump's Executive Order No. 13925: Preventing Online Censorship (Order 13925), issued on May 28, 2020. NTIA requested that the FCC issue new regulations to implement Section 230 by clarifying when a company can rely on the liability protections the law provides, and when content moderation decisions are "taken in good faith" under the law.
Section 230 was enacted in 1996 and is codified at 47 U.S.C. § 230. Congress stated that, through Section 230, it sought to "preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation." 47 U.S.C. § 230(b)(2). To help enable this policy goal, Section 230 provides civil liability protection to providers of online services for voluntary good faith actions taken to restrict access to material that the provider believes to be obscene, lewd, excessively violent, harassing, or otherwise objectionable, whether or not the material is constitutionally protected, and for enabling technologies that restrict access to such material. 47 U.S.C. § 230(c)(1)-(2).
Order 13925 and the NTIA's petition state that the content moderation policies of large social media platforms, such as flagging misleading information, deleting certain posts, or promoting some posts in social media feeds over others, are harming free speech, and that Section 230 should not extend its liability protections to platforms engaged in such moderation. The NTIA also requested that platform providers be required to provide increased transparency regarding their moderation practices. The FCC's comment period closed on September 3 for initial comments and September 18 for reply comments. Following a review of the comments, the FCC will decide whether to begin a formal rulemaking at a future public meeting.
Federal Trade Commission Seeks Comments on Proposed FCRA Rule Changes
On August 24, 2020, the Federal Trade Commission (FTC) announced that it is seeking public comment on proposed changes to rules promulgated under the Fair Credit Reporting Act (FCRA). The FTC will publish five separate Notices of Proposed Rulemaking (NPRMs) to align its implementing regulations under the FCRA with the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act). The Dodd-Frank Act was enacted in 2010 and transferred most FCRA rulemaking authority from the FTC to the Consumer Financial Protection Bureau (CFPB). The FTC retained rulemaking authority for rules promulgated under FCRA that apply to motor vehicle dealers, as defined under the Dodd-Frank Act.
The NPRMs contain minimal changes that are intended to align the rules with the FTC's narrowed authority under the Dodd-Frank Act. The FTC proposes to clarify that the FTC's FCRA regulations apply only to motor vehicle dealers. In addition, each NPRM contains several questions soliciting public comment.
The NPRMs propose the following clarifications to the FTC's FCRA rules:
- Address Discrepancy Rule. Under the proposed rule, motor vehicle dealers who receive a notice of an address discrepancy from a nationwide consumer reporting agency (CRA) must follow the obligations outlined under the rule.
- Affiliate Marketing Rule. Under the proposed rule, consumers may prevent a motor vehicle dealer from using information obtained from an affiliate to make solicitations to the consumer about its products and services.
- Furnisher Rule. Under the proposed rule, motor vehicle dealers who provide information to CRAs must establish and implement written policies and procedures regarding the accuracy and integrity of consumer information provided to a CRA.
- Pre-screen Opt-Out Notice Rule. Under the proposed rule, motor vehicle dealers who use consumer report information to make unsolicited credit or insurance offers to consumers must follow the content requirements outlined in the rule.
- Risk-Based Pricing Rule. Under the proposed rule, motor vehicle dealers who engage in risk-based pricing must provide notice to consumers when a consumer report is used to offer less favorable terms to a consumer.
Written comments may be submitted online or on paper and are due seventy-five days after the NPRMs are published in the Federal Register.
U.S. Department of Commerce Issues Statement with European Commission Regarding Talks About EU-U.S. Privacy Shield
On August 10, 2020, U.S. Secretary of Commerce Wilbur Ross and European Commissioner for Justice Didier Reynders announced that they have "initiated discussion to evaluate the potential for an enhanced EU–U.S. Privacy Shield framework." This announcement follows the Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems (Schrems II) decision by the Court of Justice of the European Union (CJEU) which invalidated the existing EU–U.S. Privacy Shield framework (Privacy Shield).
The Privacy Shield was developed by the U.S. Department of Commerce and the European Commission (EC) to provide an alternative mechanism for data transfers between the European Union and the U.S. The General Data Protection Regulation (GDPR) restricts the transfer of personal data to "third countries" unless such countries provide an adequate level of protection to that provided under the GDPR. The Privacy Shield was approved in 2016 by the EC as an adequate transfer mechanism but was invalidated by the CJEU on July 16, 2020. The July 16 decision marks the second time that the CJEU has invalidated a framework for transfers between the United States and the European Union. The CJEU invalidated the Privacy Shield's predecessor, the U.S.–EU Safe Harbor, in 2015.
Because EU–U.S. data transfers relate to transatlantic commerce, the U.S. Department of Commerce took the primary role in negotiations for the Privacy Shield and will be responsible for negotiating a new, "enhanced" framework to address the concerns set forth in Schrems II.
In the States
Massachusetts Attorney General Announces Data Privacy and Security Division
On August 13, 2020, Massachusetts Attorney General (AG) Maura Healey announced the creation of a new division of her office focused on data privacy and security. The Data Privacy and Security Division was formed "to protect consumers from the surge of threats to the privacy and security of their data," according to the AG's announcement. AG Healey said the division would build on her office's "commitment to empowering Massachusetts consumers in the digital economy, ensuring that companies are protecting personal data, and promoting equal and open access to the internet."
The AG's press release stated that Sara Cable has been named Division Chief of the new division. Ms. Cable has been the Director of Data Privacy and Security in the AG's Consumer Protection Division since 2016, where she has led the Office of the Attorney General's cybersecurity and data privacy investigations and enforcement actions, including multistate actions. As Division Chief, Ms. Cable is tasked with leading the new Data Privacy and Security Division and enforcing the Massachusetts Consumer Protection Act and the state's data breach notification law.
New York Department of Financial Services Pursues First Enforcement Action Under Its Cybersecurity Regulations
On July 21, 2020, the New York Department of Financial Services (NYDFS) brought its first enforcement action under 23 NYCRR Part 500 (the "Cybersecurity Regulation") against a title insurer. In its Statement of Charges and Notice of Hearing, NYDFS alleged that the company ignored a known vulnerability in its document management system, allowing anyone with a web browser to access hundreds of millions of documents containing sensitive personal information of tens of millions of consumers. Affected data included bank account information and statements, Social Security numbers, drivers' license images, and mortgage and tax information.
NYDFS also took note of the fact that the company allegedly failed to follow its own internal policies, neglecting to conduct required security reviews and risk assessments, and failing to investigate the vulnerability within the timeframe dictated by company policy. Indeed, NYDFS alleged that the company discovered the vulnerability during a penetration test in late 2018, but only classified it as "low" severity; the company failed to follow the recommendations of its internal cybersecurity team to investigate further and did not address the vulnerability until a prominent cybersecurity journalist published an article revealing the existence and scope of the issue. Potential exposure is significant, as NYDFS claims that each "instance of Nonpublic Information encompassed within the charges constitutes a separate violation carrying up to $1,000 in penalties."
The Cybersecurity Regulation became effective in March 2017 (although full compliance with certain portions of the Cybersecurity Regulation was not required until two years later) and requires covered entities to implement specific cybersecurity policies and safeguards to protect customers' "nonpublic information." Covered entities are required to, among other steps, implement, maintain, and update cybersecurity and incident response policies and programs that meet specific requirements; conduct periodic cyber risk and vulnerability assessments; appoint a Chief Information Security Officer; engage qualified cybersecurity personnel; provide cyber awareness training for all personnel; implement policies to address security requirements and diligence for vendors; and annually certify compliance with the Cybersecurity Regulation to NYDFS.
United Kingdom's Information Commissioner's Office Issues Statement on Age Appropriate Design Code
As of September 2, 2020, the UK Information Commissioner's Office's (ICO) Age Appropriate Design Code (Code) is in effect after being issued by the ICO on August 12, 2020. The Code applies to online services that are "likely to be accessed by children" in the UK—it is not restricted to services that are directed to children.
The Code establishes fifteen (15) standards of age appropriate design that online services will need to implement to ensure their services safeguard, and fairly process, children's personal data. According to the ICO, the standards are not technical standards but are instead a set of "technology-neutral design principles and practical privacy features." In addition, the ICO has noted that the Code will help organizations design services that conform with the General Data Protection Regulation (GDPR) in the context of children using digital services.
The standards are: (1) best interests of the child; (2) data protection impact assessments; (3) age appropriate application; (4) transparency; (5) detrimental use of data; (6) policies and community standards; (7) default settings; (8) data minimization; (9) data sharing; (10) geolocation; (11) parental controls; (12) profiling; (13) nudge techniques; (14) connected toys and devices; and (15) online tools.
Although the Code does not have the force of law, it is a "statutory code of practice" and organizations that are not compliant with the Code may be considered to be in violation of the GDPR. Specifically, the ICO has explained that organizations that do not conform with the Code are "likely to find it more difficult to demonstrate that your processing is fair and complies with the GDPR[.]" The Code includes a twelve (12) month transition period; therefore, organizations should conform with the Code by September 2, 2021.
Brazil Moves Forward With General Data Protection Law
After months of uncertainty, the Lei Geral de Proteção de Dados Pessoais, or LGPD, Brazil's General Data Protection Law, is expected to come into effect by September 16, 2020.
The LGPD, which passed on August 14, 2018, is comprehensive privacy and data security legislation that will be Brazil's first national data protection regulation, replacing Brazil's previous sectoral approach. The law was originally set to take effect on August 16, 2020. Due to the COVID-19 pandemic, however, on April 29, 2020, Brazil's President Jair Bolsonaro issued Provisional Measure 959 to delay the LGPD's substantive provisions from taking effect until May 3, 2021. After competing legislative proposals from the lower and upper houses of the National Congress of Brazil, the Federal Senate adopted a proposal for the substantive provisions of the LGPD to take immediate effect upon the President's approval. President Bolsonaro has 15 business days, or until September 16, 2020, to sanction or veto the proposal. If the President does not take any action, the LGPD will become effective at the end of the 15-business day period. Pursuant to Law No. 14,010 enacted on June 10, 2020, the sanction and enforcement provisions of the LGPD will become effective August 1, 2021.
The LGPD expressly applies to any private or public individual or company with personal data processing activities that are: (i) carried out in Brazil; (ii) where the purpose of the processing activity is the supply of goods or services to individuals located in Brazil; or (iii) if the processed personal data has been collected in Brazil. The LGPD does not apply to data processing performed by an individual for private and non-economic purposes, for journalistic and artistic purposes, certain academic purposes, or for purposes of public security, national defense, or criminal investigations.
Under the LGPD, personal data may only be processed under one of ten enumerated legal bases. These include the data subject's consent; compliance with a legal or regulatory obligation by the controller; and the processing and shared use of data by the public administration as necessary for the execution of public policies.
The LGPD also establishes nine fundamental rights for Brazilian data subjects, such as the right to access, correction, data portability, and the right to erasure as well as a right to review decisions made by automated means.
On August 26, 2020, President Bolsonaro issued Decree 10,474, which establishes the Brazilian National Authority for Protection of Data (ANPD). The ANPD will be composed of a board of directors, a national council, an inspection body, an ombudsman, its own legal advisory body, and administrative and specialized units for the enforcement of the LGPD. The ANPD will oversee and impose administrative sanctions for violations of the LGPD, as well as issue rules and regulations related to data protection and privacy.