November 20, 2020

The Download November 2020

Developments in Law and Policy from Venable's eCommerce, Privacy, and Cybersecurity Group

In this issue, we highlight the Federal Communications Commission's announcement of a rulemaking examining Section 230 of the Communications Decency Act (Section 230), the Federal Trade Commission's advertising law workshop, and the Consumer Financial Protection Bureau's financial data access rulemaking plan. In the courts, we examine U.S. Supreme Court Justice Clarence Thomas's statement on Section 230. Around the states, we explore the California Attorney General's third set of proposed modifications to the California Consumer Privacy Act regulations. In international developments, we highlight the United Kingdom Information Commissioner's blog on privacy during COVID-19, the European Data Protection Board's 39th and 40th plenary sessions, and China's draft personal data protection law. We hope you have a safe and happy Thanksgiving.

Around the Agencies and the Executive Branch

Federal Communications Commission Chairman Pai Announces Section 230 Rulemaking Examination

On October 15, 2020, Federal Communications Commission (FCC) Chairman Ajit Pai announced that the FCC would examine the liability shield in Section 230 of the Communications Decency Act (Section 230). Passed in 1996, Section 230 provides protections for websites that host user content. The Section 230 liability shield protects providers of interactive computer services from civil liability when a provider chooses, in good faith, to "restrict access to or availability of" content that it considers objectionable, "whether or not such material is constitutionally protected." 47 U.S.C. § 230(c)(1)-(2). President Donald Trump's May 2020 Executive Order 13925, Preventing Online Censorship, directed the FCC to investigate content moderation protections for online platforms.

Chairman Pai stated that the FCC's examination is intended to clarify any ambiguities in Section 230's meaning. He also highlighted concerns about the current interpretation of the liability shield from all branches of government, citing U.S. Supreme Court Justice Clarence Thomas's October 13, 2020 statement regarding the extent of Section 230 protections.

Chairman Pai emphasized that an "overly broad" interpretation of Section 230 protects social media companies beyond the text of the law. Following Chairman Pai's announcement, FCC General Counsel Thomas Johnson Jr. published an October 21, 2020 blog post analyzing the FCC's legal authority to interpret Section 230. In his analysis, Mr. Johnson stated that the FCC's authority to interpret Section 230 is "straightforward." Mr. Johnson cited Section 201(b) of the Communications Act of 1934, which he said grants the FCC the ability to "prescribe such rules and regulations as may be necessary" to carry out the act. 47 U.S.C. § 201(b). Chairman Pai noted that he intends to move forward with the rulemaking process for Section 230, consistent with Mr. Johnson's advice.

Federal Trade Commission Hosts Advertising Law Workshop

On October 29, 2020, the Federal Trade Commission (FTC) held a workshop, "Green Lights and Red Flags: FTC Rules of the Road for Business." The workshop was hosted virtually from Cleveland, Ohio, and addressed such topics as data breaches, children's privacy, cyber incident response, and privacy policies.

Presenters, moderators, and panelists included representatives of the FTC, the Federal Bureau of Investigation (FBI), the Office of the Ohio Attorney General, the Cuyahoga County (Ohio) Department of Consumer Affairs, and the Better Business Bureau of Cleveland, as well as marketing and information technology professionals.

Jon Miller Steiger, director of the FTC's East Central Region, gave opening remarks. Presentations and panel discussions covered the following topics, among others:

  • Protecting Small Businesses from Scams. The panel addressed such topics as business-to-business fraud prevention. Panelists discussed the success of using FTC warning letters to secure business compliance without resorting to enforcement actions, and a panelist from the FTC emphasized the importance of providing small businesses with guidance on how to comply with various laws.
  • The Truth about False Advertising. A presenter discussed truth-in-advertising law and addressed the importance of business practices, such as avoiding deceptive representations in privacy policies and data collection disclosures, among other topics.
  • Avoiding a Promotion Commotion. The panel addressed social media marketing, consumer reviews, and children's privacy, among other topics. The panel discussion covered the FTC's enforcement authority under the Children's Online Privacy Protection Act (COPPA) and best practices regarding data collected from children online.
  • The Secure Entrepreneur. The panel addressed data security and cyber incident response. Panelists expressed support for data minimization and proper data deletion mechanisms. Several panelists emphasized the importance of contacting authorities as soon as possible in the event of a cyberattack.

"Green Lights and Red Flags" is a business workshop series that the FTC has held over the years. The FTC coordinates with regional partners in cities across the country for these workshops.

CFPB Plans Rulemaking on Consumer Access to Financial Data

On October 22, 2020, the Consumer Financial Protection Bureau (CFPB) issued an advance notice of proposed rulemaking (ANPR) regarding implementation of consumer data access rights under Section 1033 of the Dodd-Frank Act. The Dodd-Frank Act allows consumers to request, subject to CFPB rules, information from entities that offer or provide consumer financial products or services. Comments on the ANPR are due on or before February 4, 2021.

Section 1033 of the Dodd-Frank Act requires entities that offer or provide consumer financial products or services ("regulated entities") to make available to the consumer in electronic form, upon request, information within a regulated entity's possession or control regarding the consumer financial product or service the consumer obtained from the regulated entity. 12 U.S.C. § 5533(a). This includes information relating to transactions and to the consumer's account (e.g., transaction lists, costs, charges, and usage data), but excludes confidential commercial information, information collected for fraud and money laundering prevention, information kept confidential by other laws, and information that the regulated entity "cannot retrieve in the ordinary course of its business with respect to that information." 12 U.S.C. § 5533(b).

However, this Section 1033 right of access is "[s]ubject to rules prescribed by the [CFPB]." 12 U.S.C. § 5533(a). Although the Dodd-Frank Act was enacted in 2010, to date the CFPB has not promulgated any regulations to implement Section 1033. This ANPR is the latest in a number of CFPB actions regarding the Section 1033 right of access, including a 2016 Request for Information, a 2017 Stakeholder Insights Report, and a 2020 symposium.

The ANPR seeks comment from stakeholders regarding the following topics:

  1. Costs and benefits of consumer data access;
  2. Competitive incentives;
  3. Standard-setting;
  4. Access scope;
  5. Consumer control and privacy;
  6. Other legal requirements (for example, the CFPB noted that the Gramm-Leach-Bliley Act (GLBA), the Fair Credit Reporting Act (FCRA), or the Electronic Fund Transfer Act (EFTA) might also implicate or apply to access requests under Section 1033);
  7. Data security;
  8. Data accuracy; and
  9. Other information. 85 FR 71009–11.

The CFPB has provided a number of more targeted and specific questions for each of these topics in the ANPR.

In the Courts

Supreme Court Justice Thomas Issues Statement on Section 230 Protections

In a statement accompanying the U.S. Supreme Court's decision not to hear a case involving Section 230 of the Communications Decency Act, Supreme Court Justice Clarence Thomas suggested that the Supreme Court should take an opportunity in the future to examine whether the text of Section 230 supports the interpretation often given to it by lower courts. Section 230 provides a liability shield for "provider[s] or user[s] of interactive computer services," including social media platforms, that engage in "good faith blocking or screening of offensive material."

Justice Thomas stated that courts have interpreted Section 230 broadly enough to grant "Internet companies" immunity for their own content—specifically, content solicited and edited by those companies prior to publication. He further suggested that courts have extended immunity so far that there are "no limits on an Internet company's discretion to take down material" and that "§230 now apparently protects companies who racially discriminate in removing content." Justice Thomas also noted that courts have granted immunity under Section 230 "to protect companies from a broad array of traditional product-defect claims."

Justice Thomas concluded by asserting that the Supreme Court should take advantage of a future case that provides the appropriate opportunity to examine lower courts' interpretations of Section 230.

In the States

California Attorney General Releases Third Set of Proposed Modifications to CCPA Regulations

On October 12, 2020, the California Office of the Attorney General (CA AG) released a third set of proposed modifications to the regulations implementing the California Consumer Privacy Act (CCPA). On the same day, the CA AG initiated a public comment period to obtain input on the proposed changes. Comments on the proposed modifications were due to the CA AG on October 28, 2020. The proposed modifications would revise the regulations finalized on August 14, 2020 and address:

  • Offline Notice of the Right to Opt Out of Personal Information Sales. The proposed modifications would require businesses that collect personal information from consumers offline to provide a notice of the right to opt out through an offline method. By way of example, the proposal notes that businesses collecting personal information in brick-and-mortar stores may provide such notice by printing the notice on paper forms where personal information is collected or by posting signs in the area where personal information is collected that direct consumers to the online notice. Additionally, businesses collecting personal information over the phone could provide a notice of the right to opt out orally during the call when the information is collected.
  • Methods and Process for Opt-Out Requests. The proposed modifications would require a business to provide consumers with methods of submitting opt-out requests that are easy to execute and require minimal steps. For example, a business's opt-out process may not require more steps than the process for opting into sales after previously opting out. Additionally, businesses may not use confusing language, require consumers to click through or listen to reasons as to why they should not submit a request to opt out, require consumers to provide more personal information than necessary to implement the request, or require consumers to search or scroll through a privacy policy or similar document in order to locate the mechanism for submitting an opt-out request.
  • Authorized Agents. The proposed modifications would enable a business to require an authorized agent submitting a request on behalf of a consumer to provide proof that the consumer gave the agent signed permission to submit the request.

The CA AG is now reviewing the comments received during the public comment period. That review is proceeding even though California voters approved Proposition 24, the California Privacy Rights Act of 2020 (CPRA) ballot initiative, in this month's general election. The CPRA will create an entirely new California agency to issue regulations implementing the new law and to enforce its terms. However, Californians' approval of the CPRA does not preclude the CA AG's review of the most recent proposed updates to the CCPA regulations.

International

United Kingdom's Information Commissioner Writes Blog on Privacy During the COVID-19 Pandemic

On October 13, 2020, United Kingdom (UK) Information Commissioner Elizabeth Denham published a blog post highlighting positive results of the Information Commissioner's Office's (ICO) engagement with the UK's devolved administrations (Scotland, Wales, and Northern Ireland) on the use of data to combat COVID-19.

The blog post explained that "people's privacy rights" are being considered by the devolved administrations in developing applications and services to address COVID-19. To address privacy rights, Information Commissioner Denham highlighted that the ICO has worked closely with the devolved administrations since the start of the pandemic to ensure that COVID-19-related projects adopt a "privacy by design approach." Specifically, the ICO has provided advice and guidance on contact tracing programs, the collection of customer details, and Data Protection Impact Assessments (DPIAs) for "proximity apps" in Northern Ireland and Scotland. In addition, the ICO provided feedback to the devolved administrations on areas such as automated decision making and informing individuals of their information rights.

Information Commissioner Denham indicated that putting privacy rights "at the heart" of applications and services combating COVID-19 is critical to their success, as privacy-protective measures give individuals confidence when providing data about themselves. For entities seeking guidance on collecting personal data for COVID-19 purposes, Information Commissioner Denham reiterated that the ICO remains available to assist stakeholders in ensuring that privacy continues to be protected during the pandemic.

European Data Protection Board Hosts 39th and 40th Plenary Sessions in October

On October 8, 2020, the European Data Protection Board (EDPB) met for its 39th plenary session. During the session, the EDPB adopted Guidelines 9/2020 on the concept of relevant and reasoned objection under Regulation 2016/679 (the Guidelines). The Guidelines relate to the cooperation and consistency provisions governing enforcement actions set out in Chapter VII of the General Data Protection Regulation (GDPR). Under the cooperation procedure in Article 60, the lead supervisory authority (LSA) and the other concerned supervisory authorities have a duty to exchange all relevant information with one another in an endeavor to reach consensus when coordinating cross-border investigations in the European Union. The LSA must submit a draft decision to the concerned supervisory authorities for their opinion and take due account of their views. The other concerned supervisory authorities may raise a relevant and reasoned objection to the draft decision within a period of four weeks. Upon review of a relevant and reasoned objection, the LSA may either follow the suggestions of the other concerned supervisory authorities and produce a revised draft decision, or disagree with the objections and submit the matter to the EDPB for consideration under the GDPR's consistency mechanism.

The Guidelines are intended to establish a common understanding of the notion "relevant and reasoned," including what should be taken into consideration when assessing whether an objection clearly demonstrates the significance of the risks posed by the draft decision. For an objection to be considered "relevant," the Guidelines provide that there "must be a direct connection between the objection and the draft decision at issue," and "the objection needs to concern either whether there is an infringement of the GDPR or whether the envisaged action in relation to the controller or processor complied with the GDPR." In order for an objection to be "reasoned," the Guidelines provide that it must include "clarifications and arguments as to why an amendment of the decision is proposed" and demonstrate "how the change would lead to a different conclusion as to whether there is an infringement of the GDPR or whether the envisaged action in relation to the controller or processor complies with the GDPR." The Guidelines also provide practical examples for determining whether an objection is relevant and reasoned. The Guidelines are open for public consultation until November 24, 2020.

During its 40th plenary session on October 20, 2020, the EDPB adopted a final version of the Guidelines on Data Protection by Design & Default (Guidelines on DPbDD), which focus on controllers' implementation of DPbDD based on the obligation in Article 25 of the GDPR. Article 25 requires controllers to build data protection into the processing of personal data, and to make it the default setting, throughout the processing life cycle. The Guidelines on DPbDD list key design and default elements, as well as practical cases for illustration. The EDPB notes that controllers in industry, processors, and producers should use DPbDD as a "means to achieve a competitive advantage when marketing their products towards controllers and data subject."

In addition to adoption of the final Guidelines on DPbDD, the EDPB decided to create a Coordinated Enforcement Framework (CEF). The CEF would provide a structure for coordinating recurring annual activities by EDPB Supervisory Authorities. The CEF is designed to facilitate joint actions in a flexible and coordinated manner, promote compliance, empower data subjects to exercise their rights, and raise awareness.

China Unveils a Draft of Its Personal Data Protection Law

On October 21, 2020, the Standing Committee of China's National People's Congress released a draft of its Personal Information Protection Law (PIPL). The draft PIPL contains 8 chapters and 70 articles, covering the following topics: (1) the processing of personal information; (2) the rights of data subjects; (3) rules for handling sensitive personal information; (4) cross-border transfers; (5) obligations of personal information handlers; and (6) legal liability. Public comment on the PIPL is open until November 19, 2020.

Key provisions of the draft PIPL are summarized below.

Key Definitions.

Personal Information Handler. The PIPL would apply to "personal information handlers," defined as organizations or individuals that independently determine the purposes and methods of processing personal information.

Personal Information. Personal information is defined as information recorded by electronic or other means related to an identified or identifiable natural person. Anonymized information is excluded from this definition.

Extraterritorial Effect. Under Article 3 of the PIPL, the law would apply to personal information handlers that process personal information both within China and abroad.

Lawful Basis for Processing. The PIPL would provide six lawful bases for the processing of personal information:

(1) Consent. If an entity processes data based on the data subject's consent, the consent must be informed, specific, freely given, and an indication of the wishes of the data subject;

(2) Performance of a contract to which the data subject is a party;

(3) Fulfillment of statutory duties or obligations;

(4) Responding to public health incidents or protecting the life, health, or property of the data subject or other individuals in emergency situations;

(5) Journalism or media supervision in the public interest; or

(6) Other circumstances as provided by Chinese laws and regulations.

Data Subject Rights. The PIPL would provide data subjects with the following rights: (1) the right to know and the right to decide relating to their personal information; (2) the right to limit or object to the processing of personal information by others; (3) the right to access and to copy personal information from information handlers; (4) the right to correct or complete; (5) the right to deletion, in certain circumstances; (6) the right to an explanation of the personal information handling rules; and (7) the right to withdraw consent. Personal information handlers must establish a mechanism for data subjects to exercise their rights.

Sensitive Personal Information. Sensitive personal information may only be processed for specific purposes and when "sufficiently necessary." In addition, the processing of sensitive personal information would require separate opt-in consent. The term "sensitive personal information" is defined as "personal information that, once leaked or illegally used, may cause discrimination against individuals or grave harm to personal or property security, including information on race, ethnicity, religious beliefs, individual biometric features, medical health, financial accounts, individual location tracking, etc."

Children. Parental consent would be required for the processing of personal information of minors below the age of 14 if the personal information handler knows or should know that it is processing the data of a child.

Disclosure to a Third Party. Personal information handlers that provide personal information to a third party would be required to inform the data subject of the identity and contact information of the third party, the purpose of the data processing, and the processing mode and type of personal information covered. The personal information handler must obtain separate consent from the data subject to permit the transfer or sharing of personal information with the third party.

Automated Decision Making. Personal information handlers that use automated decision making (ADM) would be required to guarantee the transparency of the decision making, as well as fairness in the result. If an individual believes that the use of ADM has created a "major influence on their rights and interests," the individual would have the right to require the personal information handler to provide an explanation and the right to refuse decisions made solely through ADM. In addition, if an entity uses ADM for commercial marketing or push messaging, individuals may choose not to receive marketing or push messages targeted to their personal characteristics.

Cross-Border Transfer. A personal information handler would be required to obtain the separate consent of data subjects prior to engaging in a cross-border data transfer. The PIPL would provide three mechanisms for cross-border data transfers. Should an entity wish to transfer personal information to foreign authorities, the PIPL would require it to receive prior approval from Chinese regulators.

Data Localization. A personal information handler that processes personal information of a certain volume in China would be subject to a data localization requirement. If the personal information handler wished to engage in a cross-border transfer of the data, the transfer would be subject to a security assessment by the Cyberspace Administration of China (CAC). The data threshold for this requirement will be provided by the CAC.

Obligations of a Personal Information Handler. A personal information handler would be required to implement various administrative policies and procedures and establish technical security measures to protect personal information. These measures would include regular compliance audits, risk assessments, records of certain processing activities, data breach incident response plans, employee training, and, in certain circumstances, the appointment of a data protection officer. In the event of a data security incident, the personal information handler would be required to adopt remedial measures and provide notice to the government.

Obligations of Foreign Personal Information Handlers. In addition, foreign personal information handlers would be required to establish a dedicated entity or appoint a representative within China to be responsible for matters related to the personal information they handle.

Penalties. Personal information handlers that commit "serious" violations of the PIPL could be fined up to 50 million yuan (7.4 million USD) or up to 5% of the prior year's revenue. The government may also order the suspension of business activities or the cancellation of business permits or professional licenses. A serious violation would include the illegal processing of personal information or the failure to adopt certain measures to protect personal information. Responsible personnel of a violating personal information handler may be subject to fines of up to 1 million yuan (151,254 USD).

Blacklist. Foreign personal information handlers that process personal information in a manner that harms the rights of Chinese citizens or endangers Chinese national security or the public interest may be placed on a public blacklist of parties restricted or prohibited from receiving personal information. The list would be compiled by the CAC.