In this issue, we cover hearings held by the House Committee on Oversight and Reform and the House Financial Services Committee. We discuss the Federal Trade Commission's workshop on voice cloning technology. In the states, we address the California Attorney General's modifications to its proposed California Consumer Privacy Act regulations, the Washington State Legislature's hearing on the Washington Privacy Act, and New York's updates to telemarketing laws. Overseas, we explore draft guidance on cookies and other trackers released by the French Data Protection Authority and updates from the United Kingdom's Information Commissioner's Office.
Heard on the Hill
House Oversight Committee Holds Facial Recognition Technology Hearing
On January 15, 2020, the House Committee on Oversight and Reform (Committee) held the third hearing of its series on "Facial Recognition Technology." The hearing focused on "Ensuring Commercial Transparency & Accuracy." The hearing's witnesses included representatives from (1) industry; (2) federal agencies; and (3) academia. During the hearing, participants touched upon topics such as (1) data privacy, security, and ownership; (2) facial recognition technology accuracy and best practices; (3) potential legislation regarding facial recognition and other emerging technologies; and (4) the impact of facial recognition technology on different demographic groups.
During opening statements, Committee Chairwoman Carolyn Maloney (D-NY), Committee Ranking Member Jim Jordan (R-OH), Rep. Jimmy Gomez (D-CA), and Rep. Mark Meadows (R-NC) called for the development of federal facial recognition legislation. Ranking Member Jordan spoke of a "patchwork" of facial recognition laws. Chairwoman Maloney and Rep. Meadows emphasized that "the right to privacy" must be considered when examining potential legislation.
While discussing potential facial recognition legislation at the federal level, Rep. Eleanor Holmes Norton (D-DC) and Rep. Brenda Lawrence (D-MI) expressed interest in enabling consumers to see the data storage and collection processes of facial recognition companies. Rep. Holmes Norton and Rep. Stephen Lynch (D-MA) expressed concern with data security practices of facial recognition companies. Rep. Mark DeSaulnier (D-CA) stated that consumers should have data ownership rights. In response to a question posed by Rep. Virginia Foxx (R-NC) about data security best practices, an industry representative stated that many companies are complying with the European Union's (EU) General Data Protection Regulation (GDPR) data security and encryption requirements. Chairwoman Maloney, Ranking Member Jordan, Rep. Gomez, Rep. Robin Kelly (D-IL), Rep. Lawrence, Rep. Alexandria Ocasio-Cortez (D-NY), Rep. Rashida Tlaib (D-MI), and Rep. Ayanna Pressley (D-MA) expressed concern with potential biases in facial recognition technology.
At the hearing, industry representatives also expressed support for federal data privacy legislation. A witness outlined what they believed should be included in the legislation: (1) exclusion of a private right of action clause and mandatory opt-in consent; (2) preemption; and (3) consumer data privacy rights. An industry representative stated that federal data privacy legislation should be technology-neutral. A witness from industry added that if there is a separate federal facial recognition technology law, it should include affirmative opt-in consent. An academic expressed concern with the "mass collection" of biometric data with facial recognition technology, adding that it was not transparent.
House Financial Services Task Forces Hold Hearings on Mobile Payments and Algorithmic Bias
On January 30, 2020, the House Committee on Financial Services (Committee) Task Force on Financial Technology (Fintech Task Force) held a hearing entitled, "Is Cash Still King? Reviewing the Rise of Mobile Payments." On February 12, 2020, the Committee's Task Force on Artificial Intelligence (AI Task Force) convened a hearing on "Equitable Algorithms: Examining Ways to Reduce AI Bias in Financial Services."
The mobile payments hearing was the Fintech Task Force's first hearing of 2020 and featured discussion regarding the development and use of mobile payment technology. Specifically, hearing participants addressed financial data privacy, financial data security, and efforts to promote inclusivity as consumers and businesses increasingly adopt mobile payment methods. Speaking about financial entities' collection of data about consumers, Fintech Task Force Chairman Stephen Lynch (D-MA) cautioned that the banking industry "may go the way of the Internet" regarding privacy violations. In response to Chairman Lynch, a witness noted that a potential federal data privacy law should include limits on the collection and use of consumer financial data.
The hearing on algorithms was also the AI Task Force's first 2020 hearing. Among other topics, AI Task Force Members and witnesses touched on how concepts of fairness should be incorporated into artificial intelligence (AI), machine learning (ML), and other algorithmic processes; how advertisers use algorithms to serve ads; and alternative data use in credit reporting. Several AI Task Force Members and witnesses said stakeholders must establish an agreed-upon definition of what constitutes "fairness" regarding the use of AI and algorithms.
Around the Agencies and the Executive Branch
Federal Trade Commission Convenes Workshop on Voice Cloning Technology
On January 28, 2020, the Federal Trade Commission (FTC) held a workshop entitled "You Don't Say: An FTC Workshop on Voice Cloning Technologies." The workshop examined voice cloning technologies that can reproduce a real person's voice with nearly perfect accuracy. During the workshop, presenters and panelists examined privacy and data security issues surrounding voice cloning technologies, artificial intelligence (AI) related to voice cloning technology, and ethical considerations related to emerging voice cloning technologies, among other topics.
Presenters and panelists included individuals from federal agencies including the FTC and the Department of Justice (DOJ); individuals from industries including technology, entertainment, and healthcare; and individuals from academic institutions.
FTC Commissioner Rohit Chopra gave the opening remarks. He emphasized the potential need for regulations and oversight to prevent the abuse of technologies that may be used to falsify biometric data. Commissioner Chopra noted that the abuse of such technologies could have implications both for consumers and for national security.
Presentations and panel discussions covered the following topics, among others:
- The State of Voice Cloning Technology and its Positives and Negatives. A presenter from a university discussed the advancement of voice cloning technology and noted that increasingly sophisticated AI has made it easier to develop a convincing synthetic voice from fewer recorded samples. Panelists discussed applications of voice cloning technology in healthcare and discussed the importance of industry collaboration to prevent what the panelists described as potential misuse of such technology for purposes such as robocalls and phishing schemes. Several panelists expressed support for exploring "watermark" systems for easy identification of synthetic voices.
- Ethics of Voice Cloning. Panelists expressed support for incorporating societal considerations in the development of emerging technologies. Panelists stated that companies should be expected to abide by ethical standards in the development of systems that use AI, and two panelists also opined that developers should enable consumer controls after such technology has been deployed.
- Authentication, Detection, and Mitigation. Panelists noted that voice cloning technology will continue to improve, and that detection and authentication technology must keep up with the pace of development.
In closing remarks for the workshop, Laura DeMartino from the Bureau of Consumer Protection at the FTC observed that panelists had identified a number of beneficial uses and potential misuses of voice cloning technology and noted that it will be important for industry to design voice cloning systems ethically and with end-users in mind.
In the States
Washington State Senate Advances Sweeping Data Privacy Legislation
In Washington State, new privacy legislation has recently advanced through the state legislature. In January, state lawmakers in both houses introduced companion data privacy bills and held public hearings on the legislation. On February 7, 2020, House Bill (HB) 2742 passed the House Committee on Innovation, Technology, and Economic Development (ITED Committee) and was referred to the House Appropriations Committee, where the legislation remains. In the Washington State Senate, Senate Bill (SB) 6281 passed the Senate on February 14, 2020, and was referred to the House ITED Committee on February 17, 2020. The ITED Committee held a public hearing on the bill on February 21, 2020, and took up the bill again on February 26 in executive session.
As passed by the Washington State Senate, SB 6281, known as the Washington Privacy Act, applies to entities that conduct business in Washington or produce products or services that are targeted to Washington residents. To be covered by the law, entities would have to meet certain revenue or data thresholds. Nonprofits and institutions of higher education would be exempt from the law until July 31, 2024.
Notably, SB 6281 would give Washington consumers rights broader than those granted to California consumers by the California Consumer Privacy Act (CCPA). In addition to giving consumers the rights to access and delete their personal data and to opt out of sales of that data, SB 6281 would grant consumers the right to correct inaccurate data, the right to data portability, and the right to opt out of "profiling" and "targeted advertising" as well as sales. SB 6281 would also go beyond the CCPA to require opt-in consent for the collection of "sensitive information," including data revealing racial or ethnic origin, biometric data, and specific geolocation data. SB 6281 contains facial recognition restrictions that would require companies (with certain narrow exceptions) to obtain affirmative opt-in consent from consumers before they are enrolled in a facial recognition system. Companies subject to SB 6281 would have to conduct data protection assessments as well as establish a process that allows consumers to appeal a denied request. SB 6281, which would take effect on July 31, 2021, provides that the Washington Attorney General shall have exclusive authority to enforce the law.
As the House ITED Committee continues to debate SB 6281, certain amendments are before the Committee that—if enacted into law—would further expand consumers' rights over their data. For example, one amendment would create a private right of action with potential penalties. Another amendment would modify the definition of "sale" (which triggers the ability to opt out) to include any processing of personal data rather than limiting the definition to a disclosure of personal data in exchange for monetary or other valuable consideration.
California AG's Office Modifies Its Proposed CCPA Regulations
On February 7, 2020, and again on February 10, 2020, the Office of the Attorney General of California (OAG) released modified proposed regulations (Modified Regulations) under the California Consumer Privacy Act (CCPA). The Modified Regulations update the OAG's initial proposed CCPA regulations, released on October 11, 2019, and include changes made in response to public and industry comments on those initial proposed regulations.
Key changes in the Modified Regulations include:
- Guidance on Personal Information. The Modified Regulations attempt to clarify the definition of personal information. Specifically, whether information constitutes personal information (as defined in the CCPA) depends on whether the business maintains the information in a manner that permits the business to link, or reasonably link, the information to a consumer or household. Notably, the Modified Regulations provide specific guidance on IP addresses. In particular, the Modified Regulations state that if a business does not or cannot "reasonably link the IP Address with a particular consumer or household, then the IP address would not be 'personal information[.]'"
- Accessibility Standard. The Modified Regulations provide guidance on how businesses may make online notices and privacy policies available to consumers with disabilities. Such online notices and privacy policies must follow generally recognized industry standards, such as the Web Content Accessibility Guidelines, version 2.1 of June 5, 2018, from the World Wide Web Consortium.
- Service Provider Use of Personal Information. The Modified Regulations expand upon the purposes for which a service provider may retain, use, or disclose personal information in the course of providing services. This expansion includes permitting the service provider to use the personal information to employ other service providers on its behalf and to use the personal information internally, such as to build or improve the quality of its services, provided the information is not used to build or modify household or consumer profiles, or to clean or augment data acquired from another source.
- Notice at the Point of Collection. The Modified Regulations would permit businesses to provide oral notice to consumers if the business collects personal information from the consumer orally. Similarly, for personal information collected on a mobile device, businesses may provide notice on the mobile application download page and within the application's settings. However, if the mobile application collects personal information from the consumer that the consumer would not reasonably expect the business to collect, a business must provide a just-in-time notice that specifies the categories of personal information the business is collecting and link to the full notice.
- Complying with an Opt-Out Request. The Modified Regulations remove the requirement for businesses that sell consumer personal information to provide notice of a consumer's opt-out to all third parties to which the business sold the consumer's personal information within 90 days prior to the business' receipt of the opt-out. Now, a business would need to notify only those third parties to which it sold the consumer's personal information in the period between receiving the consumer's opt-out request and complying with it, and instruct those third parties not to sell that consumer's information.
The OAG sought written comments to the Modified Regulations. The deadline to comment on the Modified Regulations was Tuesday, February 25, 2020. If the OAG makes further changes, it must post the updated draft and provide at least 15 days for comment. Otherwise, the OAG must submit the rulemaking record to the Office of Administrative Law (OAL) for review, which may last up to 30 days. If approved by the OAL, the final regulation text will be filed with the Secretary of State. Regardless of when the regulation text is finalized, enforcement of the CCPA statute is expected to begin on July 1, 2020, as permitted under the CCPA.
Tougher New York Telemarketing Laws to Go Into Effect
New York recently enacted two amendments to Section 399-Z of the General Business Law (the Telemarketing Law) that will place new obligations on companies telemarketing to New York state residents.
The Nuisance Call Act, which goes into effect on March 1, 2020, amends the Telemarketing Law in two important ways. First, it requires telemarketers who place live-voice calls to inform customers that their number may be added to the company's "do-not-call" (DNC) list. This provision closes what Governor Cuomo deemed a "loophole" in the Telemarketing Law. As originally enacted, the Law provided that calls delivered by automated means (i.e., "robocalls") must include a message informing customers of their right to be added to the company's DNC list and a mechanism for customers to be automatically added to the list. The Law did not address live-voice calls.
Second, the Nuisance Call Act provides that telemarketers may not "transmit, share, or otherwise make available any customer's contact information" without the customer's express agreement. Notably, this provision is not limited to "sales" of contact information and does not exempt sharing information with service providers.
Under a separate act (A00117), and already in effect, the Telemarketing Law was also recently amended to prohibit telemarketers from making unsolicited sales calls to anyone in a location "under a declared state of emergency or disaster emergency."
French CNIL Issues Draft Guidance on Cookies and Other Trackers
On January 14, 2020, the French Data Protection Authority (CNIL) provided draft guidance (Draft Guidance) supplementing its July 2019 guidelines on cookies or similar tracking technologies (Cookie Guidelines). The Cookie Guidelines aligned the CNIL's cookie policies with the General Data Protection Regulation's (GDPR) requirements on consent. The new, supplementary Draft Guidance has the stated goal of supporting professionals implementing compliant solutions for collecting consent by providing practical recommendations. Specifically, the purpose of the Draft Guidance, as explained in the text, is to (1) describe the practical modalities for obtaining required consent; (2) propose concrete examples of the user interface; and (3) present best practices.
Notable provisions of the Draft Guidance include, but are not limited to:
- Exempt Trackers. The Draft Guidance clarifies that various "trackers" can be regarded as exempt from the consent requirements. For example, the following trackers can be regarded as exempt: trackers designed to keep track of content in a shopping cart on a merchant site, trackers designed for authentication, and trackers maintaining the choice expressed by the user regarding the use of trackers. However, when a tracker is used for another purpose that requires consent, such as if a shopping cart cookie is also used for advertising, the tracker is no longer exempt.
- Information on Purposes of Trackers. The purpose of trackers must be presented to users in a clear and intelligible manner prior to users consenting. The Draft Guidance provides example language that would address the purpose of various trackers in a compliant manner. The Draft Guidance also recommends providing more detailed information about the purposes in an easily accessible way.
- Identity of the Controller and Scope of Consent. As users must be able to discover the identity of all controllers prior to consenting, the Draft Guidance suggests that an exhaustive, regularly updated list of controllers be made available to users in a permanently and easily accessible manner.
- Requirements for Free Consent. The Draft Guidance suggests that various criteria must be satisfied for consent to be valid and freely given. For instance, users should be offered both the possibility of accepting or refusing cookies; it should be as easy to consent as to refuse consent; a user should not suffer prejudice by refusing to consent; refusal should be registered for as long as consent is registered; and website interfaces should not be misleading in a way that may lead users to think consent is required to continue browsing.
The CNIL sought public consultation on the Draft Guidance for six weeks following the release of the Draft Guidance. The period for consultation ended on February 25, 2020. A final version of the Draft Guidance will be presented to members of the CNIL meeting in plenary session for final adoption. As plenary sessions occur weekly, the final version of the Draft Guidance is likely to be adopted soon.
Updates from the United Kingdom's Information Commissioner's Office
Last month, the United Kingdom's (UK) Information Commissioner's Office (ICO) addressed several privacy issues related to Adtech, Brexit, and children's online privacy.
On January 21, 2020, the UK ICO published its "Age Appropriate Design Code" (Code), a set of 15 standards designed to guide online services in protecting children's privacy. The Code comes after nearly a year of consultation with trade bodies, industry representatives, and individual organizations. The Code applies to companies that design, develop, or offer online services that are likely to be used by children under the age of 18. The Code is wide-ranging in scope and includes:
- A requirement that companies primarily consider the "best interests of the child" when developing online services that are accessible to children;
- The use of a Data Protection Impact Assessment to "assess and mitigate" risks to the rights of children who are likely to access the services;
- The use of age-appropriate application, which requires companies to take into account the different needs of children at different stages of development when designing the service;
- A limitation on profiling, which includes the use of privacy settings and the use of appropriate measures to safeguard children; and
- A prohibition on the use of nudge techniques to encourage children to turn off privacy protections.
The Code will come into force approximately two months after being laid before the UK Parliament. Companies will be given a 12-month grace period to conform to the standards, and the ICO predicts that the grace period will end by autumn 2021.
On January 17, 2020, Simon McDougall, the Executive Director of Technology and Innovation for the ICO, published a blog post titled "Adtech – the Reform of Real Time Bidding Has Started and Will Continue." In his post, Mr. McDougall emphasized the need for greater transparency and collective reform by Adtech companies involved in real-time bidding. While Mr. McDougall praised some members in industry for taking affirmative steps to create guidelines for "security, data minimization, and data retention," he warned that the ICO could use "its wider powers" to ensure that non-compliant organizations take steps to reform.
Finally, on January 29, 2020, the ICO issued a statement titled "Statement on Data Protection and Brexit Implementation – What You Need to Do" to assist organizations as they assess their data protection programs in the wake of Brexit. The statement noted that the UK would leave the European Union effective January 31, 2020, and that the "Brexit transition period" would last until the end of December 2020. The ICO encourages businesses to follow the ICO's existing guidance and materials to manage their data protection obligations.