Dark Patterns – did they just trick you with a “don’t not opt out”?

By Aprajita Tyagi

We have all heard about the TurboTax fiasco,1 where the company hid the U.S. government-mandated free tax-filing program from low-income users on its website in order to steer them toward its paid product. But what was the underlying issue? Why the backlash? Another commonly encountered situation is the ticking timer telling you that XYZ offer is available only for the next 15 minutes, even though the offer remains valid after those 15 minutes expire. Did you feel cheated when you were pushed to decide by a timer that never really expired? Why or why not? After all, these companies are running a business, and making money is kind of the end goal there. A recent example in this space is the multitude of privacy-related issues highlighted in the app Clubhouse (including accessing user contacts, and offering no option to permanently turn off notifications, only to pause them). What is going on here?

Dark Patterns

The above tricks and tactics are examples of Dark Patterns, a term coined by design researcher Harry Brignull in 2010 for designs that use behavioral psychology techniques to trick users into taking actions. Dark patterns are ethically problematic because they misdirect customers into making choices that are not in their best interest.2

Dark patterns take various forms, including confusing language such as double negatives, concealing the true price of a product or service, automatically adding items to your cart, and omitting or downplaying important information.3

How do Dark Patterns work?

As explained by darkpatterns.org, “when you use websites and apps, you don’t read every word on every page – you skim read and make assumptions. If a company wants to trick you into doing something, they can take advantage of this by making a page look like it is saying one thing when it is in fact saying another.” Dark patterns are problematic because of the huge power imbalances and information asymmetries that exist between the two sides of digital transactions, i.e., the service providers and their users. For example, the information asymmetry in many digital services becomes particularly large because most users cannot accurately ascertain the risks of exposing their privacy. If a user is asked to trade their personal data for a short-term financial benefit, such as a discount, the actual cost of the trade-off is difficult to grasp: the short-term gain (the discount) is tangible and immediate, while the potential loss (privacy) is long-term and abstract.4
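To make the wording trick concrete, here is a minimal, hypothetical TypeScript sketch (everything in it is invented for illustration): the same stored checkbox value can mean opposite things depending on how many negations the label stacks up, and a skim-reader only ever parses the naive reading.

```typescript
// A toy model of the wording trick (all names invented). The same stored
// checkbox value means opposite things depending on how many negations
// the label stacks up.

function optedIn(labelNegations: number, boxChecked: boolean): boolean {
  // Each negation in the label ("don't", "not", "stop") flips what a
  // checked box means; an odd count inverts the naive reading.
  return labelNegations % 2 === 0 ? boxChecked : !boxChecked;
}

// Honest label: "Send me offers" -- zero negations, unchecked by default.
console.log(optedIn(0, false)); // false: no consent given

// Dark pattern: "Don't not send me offers" -- the two negations cancel
// out, but a skim-reader parses it as a decline and leaves the box checked.
console.log(optedIn(2, true)); // true: consent manufactured
```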

What’s a ‘Nudge’?

“A nudge . . . is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives. To count as a mere nudge, the intervention must be easy and cheap to avoid. Nudges are not mandates. Putting fruit at eye level counts as a nudge. Banning junk food does not” (Thaler and Sunstein, 2009, p. 6). Digital nudging is a subtle form of using design, information, and interaction elements to guide user behavior in digital environments, without restricting the individual’s freedom of choice (Meske and Potthoff, 2017). To put it simply, rather than making decisions based on rationality, individuals tend to be influenced by a variety of cognitive biases, often without being aware of it.5 For example, individuals tend to choose smaller short-term rewards over larger long-term gains (hyperbolic discounting), and prefer choices and information that conform to their pre-existing beliefs (confirmation bias). Interface designers who are aware of these biases can use this knowledge to effectively nudge users into making particular choices.6 Deliberately misleading users through exploitative nudging is also called a “dark pattern”.7
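Hyperbolic discounting even has a standard quantitative form in behavioral economics (this model is general background, not drawn from the article’s sources): the perceived present value of a reward A delayed by D is roughly V = A / (1 + kD), where k is an individual discount rate. A toy sketch:

```typescript
// Hyperbolic discounting, V = A / (1 + k * D): a standard
// behavioral-economics model, shown here only to illustrate the bias.
// The discount rate k is illustrative, not empirical.

function perceivedValue(amount: number, delayDays: number, k = 0.05): number {
  return amount / (1 + k * delayDays);
}

// A small discount today can outweigh a much larger privacy cost that
// only materialises a year from now -- the gap dark patterns exploit.
console.log(perceivedValue(10, 0));   // 10: $10 now feels like $10
console.log(perceivedValue(50, 365)); // ~2.6: $50 in a year feels tiny
```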

Categories of dark patterns

  • Default settings (privacy by default): default settings are often sticky, so they should be chosen carefully and responsibly. Research has shown that most users will never look at, let alone change, the default settings.8 (A brief illustration of this category follows the list.)

  • Ease (making the privacy option more cumbersome): if the aim is to lead users in a certain direction, making the path toward the alternatives long and arduous can be an effective dark pattern.9

  • Framing (positive and negative wording): the way the different options are framed is an effective way to nudge users toward certain choices. Focusing on the positive aspects of one choice, while glossing over any potentially negative aspects, will incline many users to comply with the service provider’s wishes. This, too, is a dark pattern.10

  • Rewards and punishment: a common nudging strategy for enticing users to make certain choices is to use incentives to reward the “correct” choice and punish choices that the service provider deems undesirable.11 The reward could be extra functionality or a better service, while the punishment might be the opposite. This is particularly problematic when the reward and punishment are not directly related to the choice being presented.12

  • Forced action and timing: consumers often use digital services on their phones while on the go. Forcing users to choose between actions on the spot is therefore a particularly strong nudge.13
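To illustrate the first category, here is a minimal, hypothetical TypeScript sketch (all setting names invented) contrasting dark defaults with a privacy-by-default posture:

```typescript
// A minimal, hypothetical contrast (all names invented) between dark
// defaults and privacy by default. Whatever ships pre-selected is what
// most users keep.

interface PrivacySettings {
  shareLocation: boolean;
  personalizedAds: boolean;
  contactUpload: boolean;
}

// Dark pattern: everything on, relying on users never opening settings.
const darkDefaults: PrivacySettings = {
  shareLocation: true,
  personalizedAds: true,
  contactUpload: true,
};

// Privacy by default: collection stays off until the user turns it on.
const protectiveDefaults: PrivacySettings = {
  shareLocation: false,
  personalizedAds: false,
  contactUpload: false,
};

console.log(darkDefaults, protectiveDefaults);
```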

Types of dark patterns

Below are some examples of dark patterns cataloged by Harry Brignull:14

  • Bait and switch: you set out to do one thing, but a different, undesirable thing happens instead.

  • Confirmshaming: guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.

  • Disguised ads: adverts that are disguised as other kinds of content or navigation, in order to get you to click on them.

  • Forced continuity: when your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. In some cases, this is made even worse by making it difficult to cancel the membership.

  • Friend spam: the product asks for your email or social media permissions under the pretense it will be used for a desirable outcome (e.g., finding friends), but then spams all your contacts in a message that claims to be from you.

  • Hidden costs: you get to the last step of the checkout process, only to discover some unexpected charges have appeared, e.g., delivery charges, tax, etc.

  • Misdirection: the design purposefully focuses your attention on one thing in order to distract your attention from another.

  • Price comparison prevention: the retailer makes it hard for you to compare the price of an item with another item, so you cannot make an informed decision.

  • Privacy Zuckering: you are tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook CEO Mark Zuckerberg.

  • Roach motel: the design makes it very easy for you to get into a certain situation but then makes it hard for you to get out of it (e.g., a subscription).

  • Sneak into basket: you attempt to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page (see the sketch after this list).

  • Trick questions: you respond to a question that, when glanced at quickly, appears to ask one thing but, when read carefully, asks another thing entirely.
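To ground the “sneak into basket” pattern mentioned above, here is a minimal, hypothetical TypeScript sketch (product names and prices invented): a pre-checked box on an earlier page silently decides whether an extra item lands in the basket.

```typescript
// Hypothetical "sneak into basket" flow. The insurance checkbox sits on a
// prior page, pre-checked; most users never untick it.

interface LineItem {
  name: string;
  price: number;
}

function buildBasket(chosen: LineItem[], insuranceBoxChecked: boolean): LineItem[] {
  const basket = [...chosen];
  // The dark pattern: doing nothing (leaving the box checked) adds the item.
  if (insuranceBoxChecked) {
    basket.push({ name: "Premium insurance", price: 14.99 });
  }
  return basket;
}

// A skim-reading user leaves the pre-checked default alone:
const basket = buildBasket([{ name: "Flight ticket", price: 89.0 }], true);
console.log(basket.map((item) => item.name)); // ["Flight ticket", "Premium insurance"]
```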

Dark Patterns and the GDPR

Under the European General Data Protection Regulation (GDPR), processing personal data requires legal grounds and fulfilment of the data protection principles. One of these is the principle of purpose limitation, which entails that personal data should be collected for a clear purpose, should not be used for other incompatible purposes, and must be deleted when it is no longer necessary for those purposes. Another important principle is data minimisation, which states that organisations should collect the minimum amount of personal data necessary to perform a task. Furthermore, the principle of transparency means that individuals should receive an explanation, in a clear and understandable manner, of what personal data is collected and for what purposes.15 The GDPR also requires that services be developed according to the principles of data protection by design and data protection by default.16 Data protection by design means that services should be designed to safeguard data minimisation, purpose limitation, and transparency.17 In addition to limiting the data collected, appropriate and effective measures to ensure the integrity and confidentiality of the data should be implemented. Data protection by default requires that consumers receive a high level of data protection even if they do not actively opt out of the collection and processing of personal data.18

Personal data must be processed lawfully:19 for example, in order to fulfil a contract with the user, or when processing is necessary for the service provider’s legitimate interest, provided it does not prejudice the rights and interests of individuals.20 If personal data is processed for other purposes, consent from the data subject is necessary. Further, Article 4(11) of the GDPR defines ‘consent’ of the data subject as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.”
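As a thought experiment only (invented field names, not a legal tool), Article 4(11)’s cumulative conditions can be read as a record a service should be able to produce; consent harvested through a dark pattern typically fails at least one field:

```typescript
// Hypothetical sketch of the cumulative conditions in GDPR Article 4(11).
// Field names are invented for illustration; this is not legal advice.

interface ConsentRecord {
  purpose: string;            // "specific": one clearly stated purpose
  noticeShown: string;        // "informed": the plain-language notice the user saw
  affirmativeAction: boolean; // "unambiguous": pre-ticked boxes or inactivity never qualify
  freelyGiven: boolean;       // not bundled with unrelated access or penalties
  givenAt: Date;
}

function isValidConsent(c: ConsentRecord): boolean {
  return c.affirmativeAction && c.freelyGiven && c.purpose.length > 0;
}

// Consent harvested via a pre-checked box fails the affirmative-action test:
const darkPatternConsent: ConsentRecord = {
  purpose: "marketing emails",
  noticeShown: "privacy-policy-v3",
  affirmativeAction: false, // the user never actively ticked anything
  freelyGiven: true,
  givenAt: new Date(),
};
console.log(isValidConsent(darkPatternConsent)); // false
```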

In view of the above, any deployment of dark patterns to obtain blanket user consent, or to nudge users toward certain choices, is contrary to the GDPR and amounts to a technique for circumventing the free-consent requirement. The use of dark patterns to lead users toward less privacy-friendly options can also contravene the principles of data protection by default and by design.21

Dark Patterns, CCPA, and CPRA

The California Consumer Privacy Act (CCPA) requires businesses to reveal the what, why, and how of processing users’ personal information. It also allows California residents to forbid companies from selling their data. One of the key tenets of the CCPA is the need for transparency and meaningful consumer consent. On March 15, 2021, California Attorney General Xavier Becerra announced the approval of modified regulations under the CCPA, effective immediately, which, amongst other things, ban the following dark patterns:

  • using an opt-out request process that requires more steps than the process for a consumer to opt back into the sale of personal information after previously opting out;

  • using confusing language (like double negatives, e.g., “Don’t Not Sell My Personal Information”);

  • requiring consumers to click through or listen to unnecessary reasons why they should not submit a request to opt out before confirming their request;

  • requiring a consumer to provide personal information that is unnecessary to implement an opt-out request; and

  • upon clicking the “Do Not Sell My Personal Information” link, requiring a consumer to search or scroll through the text of a website or privacy policy to submit the opt-out request.22

In addition, the California Privacy Rights Act (CPRA) ballot initiative, which recently passed, specifically addresses the use of “dark patterns.” The CPRA defines the term as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” Section 1798.140(h) of the CPRA further provides that consent received through the use of dark patterns is not considered valid consent, while section 1798.185(a)(20) tasks the California attorney general with ensuring that a business’ opt-out link does not employ any dark patterns.23

Conclusion

Dark patterns have been around for decades, and the digital economy has made them a matter of mainstream awareness. The recent surge in legislative attention, and the public backlash faced by companies deploying dark patterns, should caution companies to use ethical practices when designing customer and user interfaces.

Aprajita Tyagi, Head of Legal at SpeedLegal (San Francisco, USA) | Berkeley Law (Dean’s List) | CIPP/US | Tech, Commercial, and Privacy Counsel

1 Elliott, J. and Waldron, L. Here’s how TurboTax just tricked you into paying to file your taxes. ProPublica (April 22, 2019); https://www.propublica.org/article/turbotax-just-tricked-you-into-paying-to-file-your-taxes.

2 “Dark Patterns and the Ethics of Design” (Nov 29, 2017); https://medium.com/adventures-in-uxdesign/dark-patterns-and-the-ethics-of-design-31853436176b

3 Karolina Matuszewska, When design goes awry – How dark patterns conflict with GDPR and CCPA (December 3, 2020); https://piwik.pro/blog/how-dark-patterns-conflict-with-gdpr-ccpa/

6 Supra note 4, at 7

7 The term “dark patterns” was coined by user experience researcher Harry Brignull, https://darkpatterns.org/

9 Supra note 4, at 19

10 Supra note 4, at 22

11 “Nudges for Privacy and Security: Understanding and Assisting Users’ Choices Online”, page 22; https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2859227

12 For example, it makes sense that users who do not consent to the use of location data will not be able to see their current location in a map service. If users are blocked from accessing the map at all, however, this is not proportional nor directly related to the choice.

13 Supra note 4, at 27

15 GDPR Article 5

16 GDPR Article 25

17 The principles of data protection by design and default are used to designate the obligations placed on organisations under the GDPR. Privacy by design and default are broader concepts, encompassing an ethical dimension consistent with the right to privacy enshrined in the EU Charter of Fundamental Rights. See EDPS, “Preliminary opinion on privacy by design”, page 1; https://edps.europa.eu/sites/edp/files/publication/18-05-31_preliminary_opinion_on_privacy_by_design_en_0.pdf

18 “EDPS calls for workable technology which serves the interests of society”; https://edps.europa.eu/press-publications/press-news/press-releases/2018/edpscalls-workable-technology-which-serves_en

19 GDPR Article 6

20 The reliance on legitimate interest, particularly for advertising and profiling purposes, is controversial. See “Why the GDPR ‘legitimate interest’ provision will not save you”; https://pagefair.com/blog/2017/gdpr-legitimate-interest/

21 Supra note 2, at 10

22 Vinson & Elkins LLP, Modifications to CCPA Regulations Prohibit “Dark Patterns” (Mar 19, 2021) https://www.jdsupra.com/legalnews/modifications-to-ccpa-regulations-3703389/

23 Sean Kellogg, How US, EU approach regulating ‘dark patterns’ (Dec 01, 2020) https://iapp.org/news/a/ongoing-dark-pattern-regulation/