The Download October 2020

Developments in Law and Policy from Venable's eCommerce, Privacy, and Cybersecurity Group

In this issue, we highlight the Senate Commerce Committee’s introduction of Section 230 reform and data privacy legislation. We also cover a Senate Commerce Committee hearing on a potential federal privacy framework. In the House, we outline a bipartisan digital identity bill. We explore a Federal Trade Commission (FTC) workshop on data portability. In the states, we examine the Portland City Council’s facial recognition bans. Across the pond, we highlight a European Data Protection Supervisor (EDPS) blog post on artificial intelligence, the Swiss data protection authority’s finding that the Swiss-U.S. Privacy Shield is inadequate, and the United Kingdom Information Commissioner’s Office guidance on collecting customer data. In the ongoing saga of the Schrems II ruling, we explore a joint white paper authored by various U.S. agencies.

Heard on the Hill

Senate Commerce Committee Members Introduce Section 230 Legislation

On September 8, 2020, the Senate Committee on Commerce, Science, and Transportation (Committee) announced that Committee Chairman Roger Wicker (R-MS), along with Sens. Lindsey Graham (R-SC) and Marsha Blackburn (R-TN), introduced S. 4534, the Online Freedom and Viewpoint Diversity Act (bill).

The bill would amend Section 230 of the Communications Decency Act (Act), which provides liability protection for “provider[s] or user[s] of interactive computer services,” including social media platforms, that engage in “good faith blocking or screening of offensive material.” According to the press release announcing the bill’s introduction, the bill is intended to “increase accountability for content moderation practices.” Chairman Wicker’s statement alleged that “social media platforms have hidden behind Section 230 protections to censor content that deviates from their beliefs.” Currently, the Act shields providers or users of interactive computer services from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable[.]”

The bill would amend the Act to apply only when an interactive computer service “has an objectively reasonable belief” that material is offensive as described in the Act. In addition, the bill would replace the term “otherwise objectionable” with more limited terms: “promoting self-harm, promoting terrorism, or unlawful[.]” The bill would also update the definition of “information content provider” to include “any instance in which a person or entity editorializes or affirmatively and substantively modifies the content of another person or entity.”

No co-sponsors have been added since the bill was introduced.

Bipartisan Digital Identity Bill Introduced in House

On September 11, 2020, Reps. Bill Foster (D-IL), John Katko (R-NY), Jim Langevin (D-RI), and Barry Loudermilk (R-GA) introduced H.R. 8215, the Improving Digital Identity Act of 2020 (Act), in the U.S. House of Representatives. Citing growing incidents of identity theft, identity fraud, and data breaches, the Act’s “findings” section notes that government agencies, organizations, and businesses currently lack a reliable way to verify individuals’ identities in the digital space. The bill states that government entities are uniquely positioned to address deficiencies in digital identity infrastructure because they are the primary issuers of commonly used identity documents. The bill also highlights that the private sector drives innovation around digital identity and has a key role to play in delivering digital identity solutions.

The bill contains three central components that its main sponsor, Representative Foster, said would “ensur[e] the United States catches up with the developed world on digital identity.” Those main components are:

  • Creating the “Improving Digital Identity Task Force.” The Act would establish a task force in the Executive Office of the President to develop secure methods for all levels of government to validate identity attributes, protect individual privacy and security, and support interoperable and reliable digital identity verification in the public and private sectors. Members of the task force would come from federal, state, and local governments. Among other responsibilities, the bill would charge the task force with duties to: (i) assess restrictions on government agencies’ abilities to verify identity information; (ii) seek input from the private sector to the extent practicable; and (iii) evaluate risks related to potential criminal exploitation of digital identity verification services.
  • Instructing the National Institute of Standards and Technology (NIST) to establish identity standards. The Act would require NIST to establish a standards framework that considers privacy, security, and the needs of end users to guide the government in its provision of services related to digital identity verification. The Act states that within 240 days of its enactment, the director of NIST must publish an interim version of the standards framework, and a final version of the framework must be published within one year of the bill’s enactment.
  • Forming a grant program for states to upgrade identity systems. The Act instructs the Secretary of Homeland Security (Secretary) to award grants to states to update their systems for providing driver’s licenses and other identity credentials.

In addition to the three main components listed above, the bill would instruct the Secretary to require federal agencies to implement certain identity-related guidelines and submit reports, the latter of which would be summarized by the Secretary and delivered to Congress. The bill would also require the Comptroller General of the United States to issue a report on nongovernmental organizations’ use of Social Security numbers and provide recommendations surrounding such use. On September 11, 2020, the Act was referred to the House committees on Oversight and Reform; Science, Space, and Technology; and Ways and Means for those committees’ consideration of provisions of the bill that fall within their respective jurisdictions.

Senate Commerce Committee Republicans Introduce Data Privacy Legislation

On September 17, 2020, Senator Roger Wicker (R-MS), Chairman of the Senate Committee on Commerce, Science, and Transportation, introduced the Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act (SAFE DATA Act or the Bill) along with co-sponsors Sens. John Thune (R-SD), Deb Fischer (R-NE), and Marsha Blackburn (R-TN). The Bill is a modified version of the discussion draft Senator Wicker released in 2019 and includes elements of the Filter Bubble Transparency Act, which was introduced by Senator Thune, and the DETOUR Act, which was introduced by Senator Fischer with Senator Mark Warner (D-VA) and which focused on preventing deceptive user interfaces online.

The Bill would set a nationwide standard, preempting state laws that regulate data security and privacy. Among other requirements, the Bill would provide consumers with the right to request that covered entities allow access to, correct, delete, or port covered information the entity maintains about them. Additionally, the Bill would require affirmative express consent for covered entities to transfer or process “sensitive” information, which includes data types such as persistent identifiers (e.g., cookie identifiers and unique device identifiers) as well as geolocation information and health information. With regard to “filter bubbles,” the Bill would require additional notice and an opt-out choice when online platforms use algorithms to organize and determine what content to show a user. The requirements of the Bill would be enforceable by the Federal Trade Commission (FTC) and by state attorneys general. The FTC would also develop various new regulations to implement the requirements and would be able to obtain civil penalties for violations.

Given the upcoming election and legislative schedule, it is unclear whether the Bill will receive a hearing or a vote before 2021. The Bill does, however, indicate the direction that negotiations may take in the next Congress, as lawmakers on both sides of the aisle say they plan to continue working toward a national data privacy law.

Senate Commerce Committee Hosts Data Privacy Hearing

On September 23, 2020, the Senate Committee on Commerce, Science, and Transportation (Committee) held a hearing entitled “Revisiting the Need for Federal Data Privacy Legislation.” Witnesses included former commissioners and chairmen of the Federal Trade Commission (FTC) and the current California Attorney General. This hearing follows the Committee’s hearing on federal data privacy proposals from December 4, 2019.

Committee members and witnesses addressed potential federal data privacy frameworks, preemption, a private right of action, data ownership, and children’s privacy protections. Witnesses and Committee members agreed on the importance of passing a federal privacy law and discussed issues that may arise in forming such a privacy framework, including opt-in and opt-out models for data transfers, terms of service and privacy policies, and the FTC’s enforcement capabilities. Sen. Marsha Blackburn (R-TN) emphasized the importance of clarifying the question of data ownership to protect consumer privacy. Several Committee members and witnesses addressed preemption and private-right-of-action provisions, expressing a range of opinions on the inclusion of such provisions in a federal framework.

Several Committee members discussed Chairman Roger Wicker’s (R-MS) recent bill, S. 4626, the Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act (SAFE DATA Act). Ranking Member Maria Cantwell (D-WA) also discussed her bill, S. 2968, the Consumer Online Privacy Rights Act (COPRA).

Members and witnesses also discussed the California Consumer Privacy Act (CCPA) and the European Union’s General Data Protection Regulation (GDPR). A Committee member expressed concern regarding the cost of compliance for small businesses under the CCPA and the GDPR, though California’s Attorney General replied that compliance costs have not driven companies out of the state. Another witness stated that the GDPR has led small businesses to exit the market.

Chairman Wicker noted that he plans to convene a hearing focused on the recent Court of Justice of the European Union decision in Schrems II and the related EU-U.S. data transfers before the end of 2020.

Around the Agencies and Executive Branch

Federal Trade Commission Hosts Data Portability Workshop

On September 22, 2020, the Federal Trade Commission (FTC) held a workshop entitled “Data To Go: An FTC Workshop on Data Portability.” The workshop brought together a range of stakeholders, including FTC officials, industry members, representatives of consumer advocacy organizations, and economists, to examine the potential benefits and challenges that data portability raises for consumers and competition. Data portability can be defined as the ability of consumers to move personal information, such as emails, contacts, financial information, or health information, from one service to another or to obtain it themselves.

The FTC workshop included a series of panels that addressed data portability case studies and policy issues. The first panel reviewed data portability initiatives in the European Union (EU), California, and India. The panelists noted that both the EU General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) establish the right to data portability but that India does not yet have a broadly applicable privacy law that creates such a right. One panelist who spoke about GDPR observed that it remains “unclear” whether increased data portability has fostered innovation in European markets.

The second panel focused on financial and health portability regimes. In the financial industry, one panelist said, nearly all major banks have adopted “open banking,” a secure system that allows third parties access to financial account information to provide products and services — a development that signals the rise of data portability in the financial sector. Another panelist expressed concerns about the “patchwork” of privacy laws in the United States and claimed that the lack of a comprehensive national framework hurts both consumers and small businesses. Among the topics discussed in the health portability context was the importance of identity authentication in a data-sharing environment that involves patient health information.

The third and fourth panels focused on various policy issues surrounding data portability. During the third panel, on reconciling the benefits and risks of data portability, one panelist noted that the goals of data portability are to give consumers access to and control over data and to encourage competition. At the same time, the panelist remarked, it remains unclear whether portability encourages competition because consumers may be disinclined to transfer their data. Another panelist stated that a non-portability system could have negative effects in that it may present consumers or small businesses with “high switching costs” that would make it difficult to transfer their data from one system to another if they so desired.

The fourth and final panel discussed data portability’s potential to promote privacy. One panelist asserted that future technologies will require people to share more data, making a secure method of data portability necessary. Another panelist echoed the need to provide data portability.

In the States

Portland City Council Bans Private Sector Use of Facial Recognition Technology

On September 9, 2020, the City Council of Portland, Oregon (City) passed ordinances prohibiting the use of facial recognition technology by city bureaus and by private entities in places of “public accommodation,” defined as “any place or service offering to the public accommodations, advantages, facilities, or privileges whether in the nature of goods, services, lodgings, amusements, transportation or otherwise.” Both ordinances take effect on January 1, 2021. The ordinances define “face recognition technology” as automated or semi-automated processes that use face recognition to assist in identifying, verifying, detecting, or characterizing the facial features of an individual, or to capture information about an individual based on the individual’s face.

The ordinance prohibiting the acquisition and use of face recognition technologies by city bureaus requires each city bureau director to review and assess whether bureau staff are using face recognition technologies. The ordinance also prohibits bureaus from acquiring, evaluating, and using face recognition technologies, as well as acquiring, requesting, using, accessing, or retaining information derived from face recognition technologies, unless the technology is used for: (1) verification purposes for bureau staff to access personal or City-issued communication devices; (2) automatic face detection services in social media applications; or (3) detecting faces for the sole purpose of redacting a recording for release. The ordinance will remain in effect until the City adopts or revises a comprehensive data governance and privacy information protection framework that addresses face recognition technologies.

The ordinance prohibiting private entities’ use of facial recognition technology states that Portland residents should be able to use public spaces with a “reasonable assumption” of anonymity and personal privacy. The ordinance also states that face recognition technologies have been documented to have an “unacceptable” gender and racial bias. Under the ordinance, the Bureau of Planning and Sustainability and the Office of Equity and Human Rights will: (1) develop a plan to create public awareness of the impacts and uses of face recognition technologies; (2) promote “digital rights,” including privacy and information protection, regarding the collection of information by face recognition technologies; and (3) coordinate public participation, including in the development of a “comprehensive surveillance technologies” policy. A person injured by a violation of the ordinance may institute proceedings against the offending private entity after a 30-day cure period.

International

European Data Protection Supervisor Posts Blog on Artificial Intelligence

On September 7, 2020, Wojciech Wiewiórowski, the European Data Protection Supervisor (EDPS), published a blog post titled “Artificial Intelligence, data and our values – on the path to the EU’s digital future.” In his post, Mr. Wiewiórowski emphasized the need for a European Union (EU) regulatory framework that addresses the human and ethical implications of artificial intelligence (AI) and offered the EDPS’s observations on the European Commission’s two public consultations, “A European strategy for data” (the Data Strategy) and the “White Paper on Artificial Intelligence – A European approach to excellence and trust” (the White Paper).

On the European approach to AI, Mr. Wiewiórowski stressed the importance of a “coherent approach throughout the Union: any new regulatory framework for AI should be the same for both EU Member States and Institutions, offices, bodies and agencies,” and cautioned against the rapid adoption of AI without careful consideration. He noted that AI is “not an innocuous, magical tool” and that the “benefits, costs and risks should be considered by anyone adopting a technology, especially by public administrations who process great amounts of personal data.”

Mr. Wiewiórowski further noted that one of the objectives of the Data Strategy, which is intended as a roadmap for policy measures and investments for the European data economy, should be to prove the viability and sustainability of an alternative data economy model. He highlighted that under an equitable and sustainable approach, the European data space could provide a medium “through which individuals are empowered to share their data, while benefiting from a more transparent overview of their data’s multiple use[s].” In light of the unprecedented global crisis caused by the COVID-19 pandemic, Mr. Wiewiórowski emphasized the role that data and technology could play in combating the virus and other matters of public interest. However, in relation to the concept of “data altruism” in the Data Strategy, Mr. Wiewiórowski noted the “substantial persuasive power in the narratives nudging individuals to ‘volunteer’ their data to address highly moral goals” and invited the Commission to better define the scope of such “data altruism.”

Swiss Data Protection and Information Commissioner Rules Swiss-U.S. Privacy Shield Inadequate

On September 8, 2020, the Swiss Federal Data Protection and Information Commissioner (FDPIC) determined that the Swiss-U.S. Privacy Shield does not provide an adequate level of protection for data transfers from Switzerland to the United States (U.S.) under Switzerland’s Federal Act on Data Protection (FADP).

In its opinion, the FDPIC cited the Court of Justice of the European Union’s (CJEU) July 2020 decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems (Schrems II). In Schrems II, the CJEU invalidated the EU-U.S. Privacy Shield, determining that the U.S. did not provide an adequate level of protection for data transfers between the European Union (EU) and the U.S. On the day of the Schrems II decision, the FDPIC issued a statement explaining that it would review the CJEU’s judgment and comment on it shortly.

The FDPIC noted that while Switzerland is not bound by the CJEU, it reviewed the reasoning outlined in the Schrems II decision. Following that review, the FDPIC concluded that U.S. law does not provide Swiss citizens with meaningful rights of redress or remedy for U.S. government access to personal data. In addition, the FDPIC stated that there is a lack of transparency related to the decision-making and independence of the ombudsman mechanism, making it impossible for the FDPIC to evaluate its effectiveness. The Privacy Shield Ombudsman is responsible for handling requests from Swiss and EU citizens related to U.S. government access to data for national security purposes. As a result, the FDPIC found that the Swiss-U.S. Privacy Shield does not provide adequate protection for the transfer of personal data to the U.S. pursuant to the FADP.

In addition, the FDPIC determined that standard contractual clauses (SCCs) or binding corporate rules (BCRs) may not provide adequate protection for transfers to countries Switzerland does not recognize as adequate, including the U.S. In light of this, the FDPIC recommends that Swiss companies:

  • Conduct a case-by-case evaluation of data transfers that rely on SCCs or BCRs and review those contracts as needed;
  • Consider whether the recipient country can provide the “cooperation necessary” to enforce Swiss data protection principles; and
  • Evaluate technical measures to prevent government authorities in the recipient country from accessing transferred personal data, where the foreign recipient cannot provide such cooperation.

The FDPIC concluded by stating that he will provide further guidance on data transfer mechanisms as soon as additional information is available.

United Kingdom’s Information Commissioner’s Office Issues Data Protection Guidance for Collecting Customer Information

Across the United Kingdom (UK), businesses in certain sectors, such as the hospitality sector, the leisure and tourism sector, and “close contact” businesses like salons and barbers, have been required to collect contact information for COVID-19 contact tracing programs. Specifically, England, Northern Ireland, Scotland, and Wales have each released government guidance that requires certain businesses to collect customer, visitor, and staff contact information for contact tracing purposes.

According to the UK Information Commissioner’s Office (ICO), businesses subject to these requirements may have limited experience with collecting and retaining personal information. To support such businesses, the ICO issued data protection guidance (guidance) on September 18, 2020. The guidance outlines five steps businesses can take to ensure that they are “collecting customer information securely and complying with data protection law.” Per the ICO’s guidance, these steps include:

  1. Asking customers only for the specific information that has been set out in government guidance, such as name, contact details, and the time of the customer’s arrival;
  2. Being transparent with customers about what the business is doing with personal information about the customer;
  3. Carefully storing the data collected;
  4. Refraining from using the personal information collected for contact tracing efforts for other purposes, such as direct marketing, profiling, or data analytics; and
  5. Erasing or disposing of the personal information collected consistent with government guidance.

The newly released guidance is the latest in a series of guidance documents published by the ICO regarding the collection, storage, sharing, and deletion of personal information for contact tracing efforts. Businesses required to collect personal information for contact tracing purposes can visit the ICO’s “data protection and coronavirus information hub” for additional information.

U.S. Government Publishes White Paper on Data Transfers After Schrems II

On September 28, 2020, the United States (U.S.) Department of Commerce, the U.S. Department of Justice, and the Office of the Director of National Intelligence jointly published a white paper titled “Information on U.S. Privacy Safeguards Relevant to SCCs and Other EU Legal Bases for EU-U.S. Data Transfers After Schrems II” (White Paper). The White Paper addresses privacy safeguards related to U.S. intelligence agencies’ access to data.

The White Paper was published in response to the Court of Justice of the European Union’s (CJEU) July 16, 2020 decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems (Schrems II). In Schrems II, the CJEU invalidated the EU-U.S. Privacy Shield (Privacy Shield), citing concerns about the data surveillance practices of the U.S. government. The decision also upheld the validity of standard contractual clauses (SCCs) as a mechanism for the transfer of data outside of Europe if the law in the recipient country provides adequate protection of personal data. The CJEU instructed companies using SCCs to independently evaluate the surveillance practices of the government in a recipient country. As a result, companies that transfer data to the U.S. using SCCs must conduct an analysis of U.S. data practices, including intelligence agencies’ access to data, prior to transferring personal data to the U.S.

In a cover letter accompanying the White Paper, James M. Sullivan, Deputy Assistant Secretary for Services in the U.S. Department of Commerce, explained that the White Paper was intended to outline “the robust limits and safeguards in the U.S. pertaining to government access to data to assist organizations in assessing whether their transfers offer appropriate data protection in accordance with the ECJ’s ruling.” The letter emphasized the importance of the $7.1 trillion economic relationship between the European Union and the U.S.

The White Paper offered several key takeaways regarding U.S. law. First, it explained that most U.S. companies do not handle data that is of interest to U.S. intelligence agencies. In the few instances when an intelligence agency seeks to obtain information from a company, it must do so through a judicial process under Section 702 of the Foreign Intelligence Surveillance Act (FISA 702). According to the White Paper, most companies doing business in the European Union have not received such a request, making the privacy risks outlined in Schrems II, in the White Paper’s view, a mere “theoretical possibility.”

Next, the White Paper explained that those companies that have received orders to disclose information to U.S. intelligence agencies should consider whether the “public interest” derogation in Article 49 of the General Data Protection Regulation (GDPR) applies. According to the White Paper, U.S. intelligence agencies share intelligence information with European Union member states on a regular basis for public interest purposes – such as dealing with security threats – which “undoubtedly serves important EU public interests.”

Finally, the White Paper argued that, in striking down the Privacy Shield, the CJEU neglected to review developments in U.S. law since 2016. The White Paper detailed the protections provided by the two sources of U.S. intelligence law on which the CJEU relied, FISA 702 and Executive Order 12333. It concluded that the safeguards these provisions provide, in addition to other U.S. privacy laws, ensure that U.S. intelligence agencies’ access to data is appropriately limited and subject to rigorous oversight, and that remedies exist for violations of data subjects’ rights.