Dark Patterns: What They Are & What Not to Do

Last updated on 23 August 2021 by William Blesch (Legal and data protection research writer at TermsFeed)

Many privacy laws now in effect worldwide require businesses to acquire explicit consent before collecting their customers' private, personal information. However, the compliance rates are abysmally low, according to research published by Cornell University.

In fact, only about 11 percent of all websites display consent notices that comply with even the minimum required by law. Because businesses are trying to find loopholes everywhere they can, legislators are focusing more on the effect that "dark patterns" in technical design have on commerce, privacy, data protection, and user choice.

Rohit Chopra, a Commissioner of the U.S. Federal Trade Commission, has defined these so-called dark patterns as "design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent."

For example, app and website design follows certain conventions with which most people are familiar. Visual cues like an "X" icon let users know that clicking it will close a program or document. A solid red circle around an "X" is often used to denote a warning, and so on.

However, what happens when a company purposefully manipulates its customers by making the buttons for the things it wants them to agree to larger and more prominent than the alternatives? What if the company purposely makes customers browse through pages and pages of content before allowing them to arrive at the information they sought in the first place?

What if the company claimed that some information was free, but to receive it, customers had to agree to let the business sell their personal data to third parties? And while the company technically complied with the law by disclosing its data collection practices in a Privacy Policy, the link to that document was buried in a sea of text and didn't stand out in any way.

This kind of deceptive, manipulative behavior in web and technical design is what legislators are talking about when addressing unethical "dark patterns." Increasingly, lawmakers are seeking to tackle the problem.

Three laws currently contain provisions that attempt to curtail dark patterns. They are:

  • Europe's General Data Protection Regulation (GDPR)
  • The California Consumer Privacy Act (CCPA)
  • The California Privacy Rights Act (CPRA)

What are Dark Patterns?

The term "dark patterns" entered the English language in 2010 when Harry Brignull, a user experience researcher in the UK, coined it to describe deceptive practices used to manipulate a company's customers.

The questionable practices Brignull noticed included user interfaces designed specifically to trick people into doing things, such as buying insurance with their purchase or "signing up for recurring bills."

Of course, none of those practices are exactly new. Many online businesses have employed them practically since the Internet began. Unethical, scammy business people have used them in other iterations almost since humans invented commerce.

"Dark patterns" is a broad term, and it encompasses far more than misleading and manipulative visual cues. Take the following, for example. It's a dark pattern tactic when an app or website:

  • Downplays or omits crucial information
  • Hides the full price of a service or product
  • Uses deceptive or confusing language

Prevalence of Dark Patterns

An international study published in 2019 found that out of 5,000 privacy notices displayed by a slew of companies across Europe, more than half used dark patterns. Moreover, only 4.2 percent gave customers a genuine choice regarding consent, rather than a simple confirmation that their data would be collected.

While there is still no broad legal consensus on the definition of "dark patterns," lawmakers are nonetheless trying to address them in privacy legislation. For example, the CPRA declares that a dark pattern is a user interface designed or manipulated with the substantial effect of impairing or subverting "user autonomy, decision-making, or choice."

Why are Dark Patterns Powerful?

Human psychology is what makes dark patterns so powerful. They play upon cognitive biases that most people aren't even aware they have. A cognitive bias is a "weakness" or "flaw" in thinking that can lead an individual to make irrational, poor decisions.

As CNIL, France's independent data protection authority, declared, we are influenced and trained to always share more, without always recognizing "that we are jeopardizing our rights and freedoms."

Examples of Deceptive Practices

As previously noted, some companies have no problem employing unethical and deceptive design practices to get their customers to buy more and give up more user data.

These kinds of things can happen when:

  • An app requires users to share contacts
  • An app requires social media account information before it will function
  • An app requires a user's physical location before functioning
  • A website user is forced to sign up for a newsletter before moving on to a service
  • A search engine employs default settings that track and monitor a user's input without letting the user know it does so

Dark Pattern Categories

As you have probably surmised by now, dark patterns can take many forms, and not all of them are easy to spot or recognize. However, researchers have attempted to categorize and list them.

Forbrukerr├ądet, a Norwegian Consumer Council, identified five categories of dark patterns and compiled them into a notable report called "Deceived by Design." These categories are as follows.

Default Settings

Most digital products come with privacy options built in. With that said, most users never even touch them. In fact, only about five percent of all people attempt to change the basic, manufacturer-installed settings at all.

The creepy thing about this fact is that the default privacy settings of a digital product, device, etc. can be unbelievably powerful.

For instance, a study about organ donation found the following:

In general, there are two kinds of policies regarding consent collection: explicit opt-in consent and presumed consent. The former requires that an individual actively check a box on a paper or digital form before becoming a donor. The latter assumes that an individual has agreed to organ donation by default.

In countries that use presumed consent, a lot more people are organ donors.

Returning to purely digital settings, many digital service providers take advantage of this tendency. They know the statistics on how rarely people change settings, so they purposefully create defaults that collect as much information as possible. The service provider then profits without the user being any the wiser.

Moreover, even if the service provider is transparent about its practices, it will often use large "accept" buttons and similar cues to get users to agree and move on without pausing to read what they're actually agreeing to. A privacy-protective alternative is sketched below.
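
To make this concrete, here is a minimal sketch in TypeScript of what privacy-protective defaults might look like, the opposite of the dark pattern described above. The interface and field names are hypothetical, invented purely for illustration:

    // Hypothetical consent settings, for illustration only.
    interface ConsentSettings {
      strictlyNecessary: boolean; // required for the service to function
      analytics: boolean;
      marketing: boolean;
      thirdPartySharing: boolean;
    }

    // Privacy-protective defaults: every optional category starts off,
    // so enabling it requires a clear, affirmative action by the user.
    const defaultConsent: ConsentSettings = {
      strictlyNecessary: true,
      analytics: false,
      marketing: false,
      thirdPartySharing: false,
    };

    // A dark-pattern default would flip the optional flags to true and
    // count on the roughly 95 percent of users who never open settings.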

Ease of Choice

Real choices can be difficult to come by. Some companies create the illusion that the customer has a choice, but it's not a real one.

For instance, companies can push or incentivize their customers toward specific actions. They do so by making the options they prefer more visually prominent, styling certain buttons or links to draw the eye, and giving users only one real choice when presenting Terms and Conditions agreements or Privacy Policies.

Negative vs. Positive Wording

If a company is able to determine a customer's intent, it can then present a choice in a way that manipulates that customer. For instance, the company can create wording that sounds great and aligns with the customer's intent. However, the company will purposefully leave out vital information.

Facebook is a prime example of a bad actor when it comes to this type of behavior. When the company launched its facial recognition feature, many people were worried about its serious potential to violate privacy. Facebook turned off the feature for citizens of the EU.

Later, the company turned it back on, but with GDPR popups. Facebook claimed that its purpose in implementing facial recognition software was to help the visually impaired, all while it was busy collecting users' biometric data on a massive scale.

Facebook tried to downplay the negatives by reframing the launch of their facial recognition features as something positive while leaving out crucial information. By glossing over the negatives, the company was essentially attempting to monetize user data without explicit permission.

The company's actions led to a class-action lawsuit filed in Illinois. In 2021, a judge approved a $650 million settlement against Facebook.

The Carrot and the Stick

Another manipulative practice is rewarding customers for making a decision the company wants them to make and penalizing them for refusing.

For example, rewards can take the form of discounts, free items or services, or additional features added to an app.

On the other hand, punishments for not doing what the company wants might be something like refusing to allow customers to browse the company's website until they accept the use of "all cookies," including those that are used to track the customer for marketing purposes.

Forced Action and Timing

Some companies attempt to get customers to take certain actions by limiting the time they have to complete them. It's an inherently coercive practice. Another way companies compel customers to make a certain choice is to force them through a series of actions in order to receive something specific.

That series of actions could be allowing the company to share private, personal data or it could be agreeing to the company's Terms and Conditions, etc.

Dark Patterns and the GDPR

The GDPR is considered the gold standard for privacy laws worldwide, and dark patterns directly oppose the law's basic concept of consent.

For example, the GDPR's definition of consent is:

"any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."

The law also makes clear that "Silence, pre-ticked boxes or inactivity should not, therefore, constitute consent."

Yet, as previously stated, studies have shown that a vast majority of companies essentially thumb their noses at the law by engaging in practices that don't give customers any real choices at all.

The most typical dark pattern practices that violate the GDPR's consent rules are the following (a sketch of a compliant consent flow appears after the list):

  • Requiring customers to go through more effort to reject data collection and processing than to agree to it
  • Pre-selecting choices on behalf of the customer
  • Providing customers with notices and consent forms that don't have an option to "reject all"
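
By way of contrast, the following is a minimal TypeScript sketch of a consent dialog that avoids all three practices. The type and function names are hypothetical, invented for illustration; a real implementation would also need to persist the consent record:

    // Purposes a user can consent to; nothing is pre-selected.
    type Purpose = "analytics" | "marketing" | "personalization";

    interface ConsentRecord {
      granted: Purpose[];
      timestamp: string;
    }

    // "Accept all" and "Reject all" each take exactly one click,
    // so rejecting requires no more effort than agreeing.
    function buildConsentDialog(
      onResult: (record: ConsentRecord) => void
    ): { acceptAll: () => void; rejectAll: () => void } {
      const record = (granted: Purpose[]) =>
        onResult({ granted, timestamp: new Date().toISOString() });

      return {
        acceptAll: () => record(["analytics", "marketing", "personalization"]),
        rejectAll: () => record([]), // an explicit "reject all" path
      };
    }

    // Usage: wire both handlers to equally prominent buttons.
    const dialog = buildConsentDialog((r) => console.log("Recorded:", r));
    dialog.rejectAll();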

Dark Patterns and the CCPA

California residents have the right to prohibit companies from selling their data. Moreover, companies that do business in the state must be transparent regarding what personal data is processed, why it's processed, and how it's processed.

However, in 2020, a group of researchers from Stanford University analyzed the CCPA Do Not Sell notices that companies were using. The researchers discovered many cases in which companies used dark patterns.

For example, customers who did not wish to have their data sold were led from a link on the company's home page to a Privacy Policy. The customer then had to go through the whole policy before arriving at the Do Not Sell form. Customers then had to submit the form by clicking a specific button.

Throughout the entire process, there were no instructions to let customers know exactly what they needed to do to exercise rights over their own data. Plus, many of these companies asked irrelevant, intrusive questions on the Do Not Sell form itself.

In other words, these companies were still attempting to collect user data even while the customer was trying to stop them. A more straightforward opt-out flow is sketched below.
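
For contrast, here is a minimal TypeScript sketch of a direct Do Not Sell opt-out: one link on the home page leads straight to a short form that asks only for what is needed to identify the consumer. The endpoint URL, names, and types are hypothetical, invented for illustration:

    // Only the data needed to identify the consumer is requested.
    interface OptOutRequest {
      email: string;
    }

    // One step: submit the request; no Privacy Policy detour and
    // no irrelevant or intrusive questions along the way.
    async function submitDoNotSell(req: OptOutRequest): Promise<boolean> {
      const response = await fetch("https://example.com/api/do-not-sell", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(req),
      });
      return response.ok; // true once the opt-out is recorded
    }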

Dark Patterns and the CPRA

While the CPRA mentions dark patterns, its definition ultimately makes the legislators who wrote it look more than a little confused. As written in the law, dark patterns are:

"a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation."

Ah. No helpful guidance there. Language was indeed added to the law in an attempt to clarify the definition, but it didn't succeed. Now, the CPRA simply states that "agreement obtained through the use of dark patterns does not constitute consent."

Thus, when it comes to the CPRA, the entire idea of dark patterns remains vague to the point of irrelevance.

Business owners would do better to pay attention to guidance from the GDPR and the CCPA and do their best to comply with those laws' requirements.

Summary

While the world attempts to implement more robust data privacy laws, many companies are trying to circumvent them.

Many companies employ wildly deceptive and unethical methods to manipulate their customers into taking specific actions. Many of these actions lead to capturing and processing an individual's private, personal information that the company then seeks to monetize.

With the above in mind, ethical business owners can seek to build trust with their customer base by adhering as closely as possible to data privacy laws such as the GDPR and the CCPA.

William Blesch

Legal and data protection research writer at TermsFeed

This article is not a substitute for professional legal advice. This article does not create an attorney-client relationship, nor is it a solicitation to offer legal advice.