July 14, 2020

The Download July 2020


Developments in Law and Policy from Venable's eCommerce, Privacy, and Cybersecurity Group

In this issue, we discuss House Speaker Nancy Pelosi's remarks on advertisers' impact on misinformation online and the Association of National Advertisers' response to Rep. Adam Schiff's letter. Around the agencies, we highlight a record-breaking robocall fine from the Federal Communications Commission, a Federal Trade Commission (FTC) settlement for an alleged Fair Credit Reporting Act violation, and an FTC settlement for an alleged Children's Online Privacy Protection Act violation. In California, we explore an Assembly Appropriations hearing, which discussed a facial recognition technology bill and a health privacy bill, and we discuss a hearing on the California Privacy Rights Act Ballot Initiative. Across the pond, we summarize the European Data Protection Board's 30th and 31st Plenary Sessions.

Heard on the Hill

House Speaker Pelosi Discusses Advertisers' Impact on Online Misinformation Mitigation

On June 16, 2020, U.S. House Speaker Nancy Pelosi provided opening remarks for the International Forum on COVID-19 Social Media Disinformation, a virtual forum hosted by George Washington University's Institute for Data, Democracy & Politics (IDDP). IDDP organizers explained that the forum was designed to highlight what they characterized as the false, misleading, and dangerous social media content regarding the COVID-19 pandemic that harms consumers in the United States and around the world. Speaker Pelosi's remarks focused on what she stated was the role of social media companies in accelerating the proliferation of disinformation and hate speech online. She asserted that social media companies lack an incentive to police this harmful content because disinformation often draws in users, which furthers social media companies' business models.

In her remarks, Speaker Pelosi explained that Democrats are "laser focused" on holding these social media platforms accountable, citing as one example the House Energy and Commerce Committee's June 24, 2020, hearing about the harmful effects of disinformation. Speaker Pelosi also noted that a provision of the HEROES Act, which the House passed in May, includes funds for an independent study of COVID-19 disinformation on social media. At the same time, she emphasized that congressional action is only one part of the solution to stopping the spread of disinformation. Advertisers, consumers, and others in the ecosystem also have a role to play. Speaker Pelosi in particular urged advertisers and consumers to "know their power" and use the real leverage they have to demand that social media companies regulate disinformation on their platforms, especially disinformation relating to COVID-19.

This online forum was the first in a series of events to be held by IDDP. Joining Speaker Pelosi in delivering opening remarks was Vera Jourova, vice president of the European Commission for Values and Transparency.

Rep. Adam Schiff Sends Letter to Association of National Advertisers (ANA) and Other Advertising Trade Associations; ANA Responds

On May 15, 2020, Representative Adam Schiff (D-CA) sent a letter to the chief executive officers (CEOs) of several major advertising trade associations, including the Association of National Advertisers (ANA), asking them to encourage their members to amend their policies related to advertisements appearing in pandemic-related online content. Representative Schiff expressed concern that online advertisers were engaging in keyword blocking of the terms "coronavirus" and "pandemic" to prevent advertisements from being displayed alongside coronavirus-related news stories and other content. Representative Schiff noted that ad revenue to news websites had fallen by over 50 percent and resulted in the loss of thousands of jobs.

On May 21, 2020, Bob Liodice, the CEO of ANA, responded to Representative Schiff's letter on behalf of the ANA. The letter stated that the advertising industry "has moved quickly and aggressively to ensure that ad-supported news media is not adversely impacted by overly broad keyword filtering of content related to the novel coronavirus." The letter highlighted an April 2020 cross-industry report (Report) by the American Association of Advertising Agencies (4A's) Advertiser Protection Bureau, which outlined recommendations for brand safety in news environments. The Report concluded that the use of mechanisms, such as blacklists, to help brands avoid association with specific topics may limit the ability of brands or agencies to engage with certain audiences. To address this issue, the Report outlined steps that agencies and brands can take to incorporate best practices when distributing content and to determine their risk tolerance on a regular basis. The ANA explained that its members had adjusted their policies and technologies in light of the Report to address the unique issues that arose from outdated keyword filtering models related to the pandemic.

The ANA requested Representative Schiff's assistance in addressing a key issue facing the ad-supported media industry: ad blocking by browsers and the opt-out signal requirements outlined in the California Consumer Privacy Act (CCPA) regulations. The ANA explained that the current challenge is not keyword blocking but ad blocking. Specifically, the ANA explained that the CCPA regulations would require news publishers "to accept default opt-out signals — not choice affirmatively selected by a consumer," which the ANA said would harm ad-supported news websites by significantly limiting advertising revenue. The ANA requested that Representative Schiff raise these concerns with the California Attorney General's office.

Around the Agencies and Executive Branch

Federal Communications Commission Announces Record-Breaking Fine for Robocall Violations

On June 9, 2020, the Federal Communications Commission (FCC) proposed a record $225 million fine against Texas-based telemarketers John C. Spiller and Jakob A. Mears, who allegedly used various business names including Rising Eagle and JSquared Telecom (collectively, "Rising Eagle") as part of a spoofed robocall campaign selling health insurance to American consumers. According to the FCC's Notice of Apparent Liability for Forfeiture (NAL), the formal action detailing the allegations and proposing the fine, Rising Eagle made approximately 1 billion spoofed robocalls (i.e., robocalls with false or misleading caller ID information) in the first four and a half months of 2019 in apparent violation of the Truth in Caller ID Act, which prohibits manipulating caller ID information with the intent to defraud, cause harm, or wrongfully obtain anything of value.

The FCC Enforcement Bureau's investigation found that Rising Eagle used approximately 170,000 unique caller IDs, none assigned to Rising Eagle, to make 1,047,677,198 calls falsely claiming to offer health insurance plans from health insurance companies. When consumers answered the phone and expressed interest in the health insurance plans, they were transferred to a call center unaffiliated with and not authorized by the named companies. The call center representatives would then try to persuade consumers, many of whom were on the Do Not Call Registry, to purchase short-term, limited-duration health insurance plans offered by Rising Eagle's clients. Mr. Spiller admitted to the USTelecom Industry Traceback Group, an industry group that traces and identifies the sources of illegal robocalls, that he knowingly called consumers on the Do Not Call list because he believed that targeting these consumers was more profitable. Mr. Spiller also admitted that he made millions of calls per day using spoofed caller ID information.

The NAL advises Rising Eagle of the ways in which it apparently violated the law and of the amount of the proposed penalty. Rising Eagle will be given an opportunity to file a response, which the FCC will consider before taking further action to resolve the matter.

Federal Trade Commission Announces FCRA Settlement

On June 10, 2020, the Federal Trade Commission (FTC) announced it had reached an agreement with Kohl's Department Stores, Inc. (Kohl's) to settle allegations under the Fair Credit Reporting Act (FCRA). Specifically, the FTC alleged that Kohl's violated the FCRA by refusing to provide information about identity thieves' transactions to identity theft victims upon request. This is the first enforcement action that the Commission has brought under section 609(e) of the FCRA, which gives a victim of identity theft the right to request information about fraudulent transactions made using the victim's means of identification.

These disclosure obligations apply to any "business entity that has provided credit to, provided for consideration products, goods, or services to, accepted payment from, or otherwise entered into a commercial transaction for consideration with, a person who has allegedly made unauthorized use of the means of identification of the victim[.]"[1] While the statute does not define "business entity," the FTC's complaint suggests that it considers any entity that engages in the practices listed above to fall within the scope of section 609(e).

According to the FTC's complaint, Kohl's refused to provide information directly to victims and provided the information only to law enforcement. The complaint acknowledged that Kohl's ultimately changed its policy and provided information directly to victims as contemplated by the statute, but the FTC alleged that this change was made only after the company received a Civil Investigative Demand from the Commission.

The settlement requires Kohl's to (1) pay a civil penalty of $220,000, (2) provide information to victims of identity theft as required by section 609(e) (or a law enforcement agency or officer identified or authorized by the victim), (3) provide information to victims who previously made requests that meet the requirements of the law, (4) provide notice to other victims who have made requests that they may be eligible to receive information about fraudulent transactions, and (5) provide notice on the Kohl's website regarding the process for making requests for such information.

Federal Trade Commission Announces COPPA Settlement with App Developer

On June 4, 2020, the Federal Trade Commission (FTC) announced a proposed settlement with a mobile application developer to resolve alleged violations of the Children's Online Privacy Protection Act (COPPA). The complaint alleged that HyperBeard, Inc. (HyperBeard) allowed third-party ad networks to collect personal information from users of child-directed apps, but HyperBeard did not provide parental notice or obtain verifiable parental consent for this collection, as required under COPPA. The complaint also named HyperBeard's CEO and managing director.

Per the terms of the proposed settlement, HyperBeard has agreed to delete the personal information it collected in violation of COPPA and to pay a $4 million civil penalty. Notably, the penalty will be suspended upon HyperBeard's payment of $150,000, due to the company's inability to pay the full amount.

The Commission voted 4-1 to issue the proposed administrative complaint and to accept the consent agreement. Chairman Joseph J. Simons and Commissioner Noah Joshua Phillips issued separate statements addressing the settlement. In a statement supporting the penalty, Chairman Simons said that "[c]ivil penalties will be an ongoing discussion here at the FTC as we attempt to do justice and achieve meaningful relief for consumers. I take our obligation to assess civil penalties seriously, just as I take seriously our responsibility to fairly administer and enforce all of the laws with which we are charged." Commissioner Phillips, who voted against the settlement, issued a dissenting statement stating that the fine imposed was "too much."

In the States

California Assembly Committee on Appropriations Holds Legislative Hearing to Consider Facial Recognition and Health Information Bills

On June 2, 2020, the California Assembly Committee on Appropriations (Committee) held a hearing to consider, among other legislation, AB 2004 and AB 2261. AB 2004 would establish a pilot program to expand the use of digital credentials to communicate COVID-19 test results, and AB 2261 would regulate the use of facial recognition technologies.

AB 2004 would require the Medical Board of California to establish a pilot program to expand the use of "verifiable health credentials" to communicate COVID-19 test results or other medical test results to individuals. "Verifiable health credentials" would be defined as "portable electronic patient records issued by an authorized health care provider to a patient … for which the authenticity of the record can be independently verified cryptographically." AB 2004 was passed out of the Committee by a vote of 15-0.

AB 2261 would regulate the use of facial recognition technology by state or local public entities or natural or legal persons, such as by requiring individual consent before enrolling an image or facial template of an individual in certain facial recognition services. When discussing AB 2261, Assembly Member Ed Chau (D), the bill's sponsor, stated that AB 2261 would help regulate public and private sector entities' use of facial recognition technology and would require that entities using such technology install transparency and bias mitigation controls. A number of civil liberties organizations voiced opposition to AB 2261 during the hearing. AB 2261 was held under submission by the Committee.

AB 2004 has since passed the Assembly and is now being considered by the Senate Committee on Rules. AB 2261 was held under submission pending review of the bill's financial implications; because the California state budget was signed into law on June 29, 2020, it is possible that the Committee will reconsider AB 2261 in the weeks ahead.

California Legislature Holds Hearing on CPRA Ballot Initiative

On June 12, 2020, the California State Assembly Committee on Privacy and Consumer Protection (Committee) held a hearing on the California Privacy Rights Act of 2020 ballot initiative (CPRA). Witnesses at the hearing included industry representatives, consumer advocates, and Alastair Mactaggart. Alastair Mactaggart is the Board chair and founder of Californians for Consumer Privacy, a nonprofit political committee that sponsored the California Consumer Privacy Act of 2018 (CCPA). Californians for Consumer Privacy is also the main proponent and drafter of the CPRA ballot initiative.

During opening statements, Committee Chair Ed Chau (D) stated that the CCPA was the "most comprehensive privacy law in the United States." He added that the CPRA would enhance privacy protections in California, add the concept of data minimization to the state's privacy law, and establish a new privacy protection agency to enforce Californians' privacy rights. Senator Bill Dodd (D) expressed support for Californians for Consumer Privacy and its efforts to protect the privacy rights of state residents. Senator Bob Hertzberg (D) voiced support for consumer-facing, public discussions to openly address concerns inherent in the CCPA. Senator Hertzberg stated that policy-making may be challenging in the current climate since the tenets of privacy legislation are constantly evolving and are informed by emerging and changing technologies.

During the hearing, Mr. Mactaggart stated that the CPRA would increase consumer awareness of privacy issues, protect the privacy of California consumers, and address complexities in the CCPA. Other witnesses addressed the timeline of the CPRA, the new consumer protections that the CPRA would provide, alleged "loopholes" in the CCPA, the CPRA's impact on consumer relations, the potential complexities that the CPRA would add to the CCPA, increased compliance costs for businesses, and the enforcement authority of the CPRA.

Committee members and witnesses focused during questioning on the timeline of the CPRA and the new consumer protections that the initiative would provide. Mr. Mactaggart stated that the CPRA would be enforceable by the state beginning in July 2023. He also stated that some industry members' suggested changes to the CPRA had been incorporated into the measure. Many witnesses discussed the increased compliance costs that businesses would incur from the CPRA's new provisions.

On June 25, 2020, California Secretary of State Alex Padilla announced that the CPRA amassed enough valid signatures to qualify for California's November 3, 2020 ballot. If the CPRA is approved by voters, it will amend the CCPA by adding consumer rights, a new state privacy enforcement agency, and additional contracting requirements, among various other changes.


Across the Pond

European Data Protection Board Holds 30th and 31st Plenary Sessions

On June 2, 2020, the European Data Protection Board (EDPB) held its 30th plenary session remotely. The session focused on the adoption of a letter responding to requests from various non-governmental organizations (NGOs), including the Hungarian Civil Liberties Union, concerning the Hungarian government's recent decree regarding data privacy during the COVID-19 pandemic. On June 9, 2020, the EDPB held its 31st plenary session, in which it addressed a variety of issues, including the establishment of a task force to address a video-sharing social media platform and responses to various letters of inquiry from member nations.

At the June 2, 2020 meeting, the EDPB adopted a letter that examined when and how a state could pass legislation to restrict data subject rights provided by the General Data Protection Regulation (GDPR) to protect public health. The letter explains that the core of the GDPR's protections are data subject rights to access, deletion, and correction. It also stated that any restrictions placed on data subject rights related to a public health emergency must be "foreseeable for persons subject to them," and that restrictions that are not "precisely limited in time" do not meet that criterion. The letter specifically noted that "the mere existence of a pandemic" is not enough on its own to allow for a suspension of data subject rights.

The meeting on June 9, 2020, addressed several topics. First, the EDPB established a task force to coordinate potential actions and investigations of a video-sharing social media platform's data processing practices, in particular the company's practices related to minors. The EDPB also addressed concerns with law enforcement's use of Clearview AI, a facial recognition tool, noting that the Law Enforcement Directive ((EU) 2016/680) allows the processing of biometric data to uniquely identify a person only in accordance with that Directive. The EDPB stated that it "has doubts," based on the information currently available to it, that Clearview AI's use is consistent with the EU data protection regime. Finally, the EDPB appointed a new representative to the EU Agency for Cybersecurity and responded to a letter regarding cooperation between Supervisory Authorities, indicating that the EDPB is working to help ensure consistency of procedures across member states.


[1] 15 U.S.C. § 1681g(e).