Data, Privacy + Cybersecurity Insights
August 18, 2022

The FTC’s Privacy Rulemaking: Risks and Opportunities


If businesses thought August would bring a lull from the breakneck pace of privacy policymaking — including a draft federal privacy law, five new state privacy laws and a dizzying array of global privacy regulations from the EU to China to Japan — the U.S. Federal Trade Commission had other plans. By announcing an ambitious Advance Notice of Proposed Rulemaking (ANPRM), the FTC has set in motion a process that privacy advocates have dreamt about for years: promulgating trade regulation rules on privacy and data security. While rulemaking is a laudable initiative, offering businesses more predictability and consumers stronger protections, it would be more effective if it were narrowly tailored to a manageable set of goals. Otherwise, the result might not only fail to enhance the agency’s stature but also risk setting it back.

The scope of the ANPRM is extraordinarily broad, encompassing the totality of privacy and data security issues, ranging from biometrics and artificial intelligence to targeted advertising and the protection of employees. The agency asks the public to comment on 95 questions, spanning practically every issue imaginable in this space, which cuts across industries, business models and technologies. FTC scholars know that for decades the agency’s construction of its privacy and data security authority was circumspect and incremental, proceeding with utmost caution on a case-by-case basis and seldom, if ever, announcing a departure from previous practice. Against this backdrop, the sweeping scope of the ANPRM is a jolt that seems out of character.

The initiative should be viewed in the context of recent setbacks to the FTC’s authority as an enforcement agency. Last year, in AMG Capital Management v. FTC, the Supreme Court invalidated the FTC’s use of Section 13(b) of the FTC Act to seek monetary relief, including disgorgement and restitution. Commissioner Rebecca Kelly Slaughter, the acting Chairwoman at the time, said that the Court had “deprived the FTC of the strongest tool we had to help consumers” thereby rewarding “scam artists and dishonest corporations.”

In 2018, in LabMD v. FTC, the 11th Circuit Court of Appeals held that the FTC’s order mandating a complete overhaul of the company’s data security program was unenforceable as it was overly broad and did not enjoin a specific act or practice.

To reconstitute its power to issue civil penalties, the FTC now seeks to use its authority under Section 18 of the FTC Act to set forth rules “which define with specificity acts or practices which are unfair or deceptive acts or practices in or affecting commerce.” With such rules in place, the agency will once again be authorized to impose civil penalties against first-time offenders. Such rules will also provide much-needed clarity to market participants, blunting the criticism repeatedly levied against the FTC by companies such as LabMD and Wyndham Hotels (in FTC v. Wyndham), which argued that the agency moves the goalposts in its enforcement actions, punishing companies for violating rules that have neither been clearly articulated nor publicly announced.

Looming over the rulemaking initiative is the specter of the Supreme Court’s recent decision in West Virginia v. Environmental Protection Agency. In that case, the Supreme Court struck down an EPA rule, noting that regulatory agencies could not issue rules on “major questions” affecting “a significant portion of the American economy” without “clear congressional authorization.” In his concurring opinion, Justice Neil Gorsuch wrote, “When Congress seems slow to solve problems, it may be only natural that those in the Executive Branch might seek to take matters into their own hands. But the Constitution does not authorize agencies to use … regulations as substitutes for laws.”

To be sure, the EPA decision concerned rulemaking under the Administrative Procedure Act, whereas the FTC acts under its express grant of authority in Section 18 of the FTC Act. But even if courts draw this distinction, the general judicial climate — in the D.C. Circuit and the U.S. Supreme Court — is unmistakably suspicious of expansive regulatory interpretations. And while the Congressional grant of authority in Section 18 is clear, the legislative hook for the rulemaking, i.e., the language in Section 5 of the FTC Act, is anything but. Even with the best of intentions, reading the ANPRM’s 95 questions into the phrase “unfair or deceptive acts or practices” seems like a stretch.

To succeed in its endeavor, the FTC should rein in the process and focus on results that are attainable. Here are a few ideas:

Codify the common law of FTC enforcement actions. To counter the LabMD and Wyndham type arguments, the FTC could codify its existing body of enforcement actions, which Dan Solove and Woody Hartzog called a “common law of privacy.” Unlike true common law, however, these FTC cases have seldom been litigated or ruled on by courts. Instead, to avoid costly litigation and uncertain outcomes, the agency and dozens of companies settled cases and agreed on remediation and penalties.

Sifting through these cases, Solove, Hartzog and later scholars uncovered a set of privacy and data security principles, such as avoiding deceitful data collection, unfair design or default settings, unfair data security practices and retroactive changes to privacy policies, as well as implementing comprehensive privacy and security programs. While indicative of the FTC’s intentions, these cases often left companies wondering where the line lies between practices the FTC punished as violations in specific cases and those it considers good enough.

In other words, companies could tell what not to do — that is, repeat the wrongdoings alleged in a complaint — but not what they are expected to do. With rulemaking, the FTC can clearly delineate that line, rebutting the LabMD/Wyndham critique.

Expand into areas where it stands on firm ground. Two areas where the FTC’s authority has seldom been questioned, and which are ripe for new rules, are dark patterns and data security.

Dark patterns. The “deception” prong of the FTC’s Section 5 authority has always been easier to implement than “unfairness.” Under Section 5, a representation, omission or practice is deceptive if it is likely to mislead consumers acting reasonably under the circumstances and is material to consumers — that is, it would likely affect the consumer’s conduct or decision with regard to a product or service.

Last year, the FTC held a public workshop to explore dark patterns, a range of manipulative user interface designs used on websites and mobile apps with the effect of obscuring, subverting, or impairing consumer autonomy, decision making, or choice. As the ANPRM notes, the FTC has already pursued enforcement actions targeting “several pernicious dark pattern practices, including burying privacy settings behind multiple layers of the user interface and making misleading representations to ‘trick or trap’ consumers into providing personal information.” This area is ripe for rulemaking to set the contours of deceptive trade practices in UX design.

Data security. In dozens of enforcement actions, statements and reports, the FTC has built a track record of data security best practices. But that record was built case by case and is sometimes difficult to capture as a whole. Several years ago, I wrote that “Like a group of blind men encountering an elephant … so do commentators, lawyers and industry players struggle to identify what ‘reasonable data security’ practices mean in the eyes of the FTC.” Through a rulemaking process, the FTC can articulate the ground rules for what it views as reasonable data security measures.

Do not venture into uncharted territory. Some of the questions in the ANPRM suggest the agency may explore areas that have seldom been discussed or agreed upon among policymakers. Consider question 39:

To what extent, if at all, should the Commission limit companies that provide any specifically enumerated services (e.g., finance, healthcare, search, or social media) from owning or operating a business that engages in any specific commercial surveillance practices like personalized or targeted advertising?

It is unclear what the basis for such a limitation would be, particularly given that “commercial surveillance practices” are defined so broadly in the ANPRM.

Or question 75:

To what extent does current law prohibit commercial surveillance practices, irrespective of whether consumers consent to them?

I am skeptical of laws that purport to overrule consent in the context of privacy, since lack of consent is so integral to a privacy harm. This isn’t an area where practices are illegal per se, such as selling human organs. If I want to share even my genetic data with researchers or the public, who’s to say I shouldn’t be allowed to? After all, people post the most private aspects of their lives on social media. Assuming consent is real, there is no privacy violation. Outlawing a data practice even with real consumer consent is better left to Congress.

The risk in exploring such novel domains is that with judicial review being part of the Section 18 process, the FTC could end up ceding not only such new territory but also territory it currently occupies safely as part of its “common law.” As discussed above, through its settlement practice, the agency has so far strategically shielded its common law from judicial review.

The FTC’s attempt at rulemaking is a welcome initiative that could result in stronger protections for consumers and better clarity for businesses. To optimize the process and set it up for success, the agency should calibrate its lens to focus on areas ripe for codification. Beyond that, Congress should have the last word.

This article first appeared August 17, 2022, on the IAPP Privacy Perspectives page.
