Tech companies, subscription apps and e-commerce sites have for years used subtle tricks to nudge people toward a decision or purchase they might not otherwise make. There’s even a name for the tactics: dark patterns.
Now, a crackdown may be coming.
Members of Congress, consumer protection agencies, nonprofit watchdog groups and academic researchers have all announced plans this year to increase the scrutiny they give to dark patterns, laying the groundwork for possible legal action and promising to bring more clarity and fairness to how people navigate the internet.
The tactics in question are likely familiar to anyone who’s used a web browser or a smartphone: the extra hoops to jump through to cancel a subscription, a poorly worded question that leads to an avalanche of unwanted emails or a sign-up process that requires extra work to turn on privacy features.
“They’re trying to use emotional blackmail or coercive techniques to try to get you to do something,” said Harry Brignull, a London-based expert in designing online user experiences who coined the term “dark patterns” in 2010.
There’s often money at stake, and the financial costs have been piling up as more online services embrace subscription-based models. Last year, the re-election campaign of then-President Donald Trump was accused of duping people into weekly recurring payments.
In other cases, heavy-handed design tactics exact intangible costs, eating into people's privacy, time or sanity.
Last month, the Federal Trade Commission issued a warning that it’s going to take a closer look, especially at those patterns that “trick or trap” consumers into subscription services. The commission and state agencies have brought many cases against subscription services or other businesses in the past, but the FTC said enforcement hasn’t gone far enough.
“The number of ongoing cases and high volume of complaints demonstrate there is prevalent, unabated consumer harm in the marketplace,” the FTC said in a 15-page statement. It said it receives thousands of complaints a year about tactics such as automatic renewals.
Alleged dark patterns already draw scrutiny from other quarters, including class-action lawsuits over disputed refunds and recurring subscriptions, where the phrase "dark patterns" has surfaced recently. In 2015, LinkedIn agreed to pay $13 million to settle a class-action lawsuit over the design of its sign-up process.
The FTC statement amounted to a warning shot for companies that may be engaging in similar activity now. The commission has been studying the issue for months and its new chair, Lina Khan, who was appointed by President Joe Biden, is a persistent critic of tech companies.
Brignull, who has a Twitter account to catalogue examples of dark patterns, said he thinks regulatory action is necessary because the problem won’t fix itself. Even if app or website designers pledge to be more ethical, they generally don’t have final say at a company, he said.
“Having a call to ethics or self-regulation doesn’t work,” he said. “Organizations need to know where the guide rails are. Otherwise, they’ll just look at other, successful companies and say, ‘That’s what we need to do.'”
One big question facing the FTC and other enforcement agencies: Where should they draw the line between a harmless steer and a manipulative practice?
Part of the answer may be money lost by consumers, as the FTC has focused on costly subscriptions and less on other forms of harm. But the commission also laid out rules that companies should generally follow if they want to stay on the right side of the law and regulations. Among them: Marketers must obtain consumers’ “express informed consent” for subscriptions, including those that start off as free and convert to paid plans.
The law firm Fenwick & West said in a note to clients after the FTC statement that businesses should review their existing websites and apps to ensure they’re compliant, and “re-engineer” where necessary.
Another federal agency, the Securities and Exchange Commission, is looking at a related issue in app design: the “gamification” of investment trading through design elements. In August, it asked for public comment on whether the design choices encourage investors to trade more often, invest in different products or change their investment strategy.
Rep. Sean Casten, D-Ill., who has been calling for further government study of stock trading gamification, had earlier called on the SEC to look at trading apps such as Robinhood.
“Online trading platforms like Robinhood make money by using the same psychological nudges that Silicon Valley originally developed to get us addicted to games like Candy Crush and Farmville,” he said in a statement.
Robinhood responded to the SEC with a 37-page public comment defending its practices.
“It’s important not to conflate gamification with simple, intuitive design,” Aparna Chennapragada, Robinhood’s chief product officer, said in a separate statement. “Our app has made investing approachable and more accessible, helping to break down barriers for a whole new generation of investors. We’ll continue ensuring our customers have a great user experience and look forward to engaging with the SEC on this issue.”
But the concern about potential dark patterns is broader than costly subscriptions or stock trading.
In California, government attorneys could use a sweeping state privacy law that took effect last year to go after apps or websites with a manipulative design, according to regulations finalized by the state attorney general’s office in March.
The regulations have to do with opting out of the sale of personal information and say that the process to opt out “shall require minimal steps.” A website also can’t use confusing language such as double negatives (“Don’t Not Sell My Personal Information”).
The California Attorney General’s Office said in a statement that it was actively monitoring compliance with the privacy act but that it had no public enforcement actions to share.
Lucy Bernholz, director of the Digital Civil Society Lab at Stanford University, said that part of the reason there have been limited solutions to dark patterns is there hasn’t been enough research. But that, too, is changing.
In May, a coalition of nonprofit organizations and researchers founded the Dark Patterns Tip Line to gather examples from frustrated consumers who felt cornered into making unwanted choices. The backers include Consumer Reports and the Electronic Frontier Foundation, and the tip line will be housed at Stanford.
It got 700 submissions after its initial marketing push, Bernholz said, and although the tips won’t go to law enforcement or necessarily lead to immediate fixes for consumers, the line will provide data and examples for research and teaching.
Among the questions researchers might look at, she said, is whether different populations experience dark patterns with more frequency — for example, if online services that cater to low-income people are more likely to employ them.
“Are vulnerable populations targeted more often? Are these discriminatory?” she asked.
Change might come slowly, Bernholz said, but she was optimistic that it would come.
“The more we can draw attention to the prevalence of dark patterns in everyday life and the harms that they cause, we could actually begin to see a change in both the experiences people have when they sign up for something and then a change that the large providers have to make,” she said.