An Overview of OECD’s Report Highlighting Concerns Over Dark Patterns

Author

Omer Imran Malik

Senior Data Privacy Consultant at Securiti

FIP, CIPT, CIPM, CIPP/US

In today’s digital world, there is growing concern that unethical business practices in online environments, such as the use of dark patterns, could cause significant harm to consumers. Dark patterns are typically employed in online user interfaces to lead, trick, pressure, or otherwise influence customers into making decisions that are frequently not in their best interests.

The Organisation for Economic Co-operation and Development’s (OECD) Committee on Consumer Policy (CCP) held a roundtable conference in November 2020 in response to the growing need to examine dark commercial patterns thoroughly.

Following the event, the OECD published a report which builds on the roundtable discussion, offers a working definition of dark commercial patterns, provides evidence of their prevalence and harms, and identifies potential regulatory and enforcement measures to help policymakers and authorities address them. The report also documents the possible approaches businesses could adopt to mitigate dark patterns.

What are Dark Commercial Patterns?

The phrase "dark (commercial) patterns" describes a wide range of methods frequently used in online user interfaces to influence customers to make decisions that may not be in their best interests, notably by exploiting their cognitive biases. According to the OECD, numerous e-commerce websites, apps, browsers, online games, and cookie consent messages exhibit dark patterns.

Even though the phrase ‘dark patterns’ is relatively new, many of the practices it refers to have long been used by corporations and marketers. Such practices have been the subject of scrutiny by behavioral scientists and legal experts, and have been made punishable by law in many jurisdictions.

Typically, dark patterns aim to manipulate customers into parting with more money, personal information, or attention than they intend. Dark patterns are therefore intrinsically linked to business models, even if user interface designers have no malicious intent. Dark patterns may also involve the use of artificial intelligence technologies to exploit consumers' biases in a business’s favor.

To ease discussions among regulators and decision-makers across jurisdictions, the OECD Committee on Consumer Policy offers a working definition of dark patterns, which is reproduced below.

“Dark commercial patterns are business practices employing elements of digital choice architecture, particularly in online user interfaces, that subvert or impair consumer autonomy, decision-making or choice. They often deceive, coerce or manipulate consumers and are likely to cause direct or indirect consumer detriment in various ways, though it may be difficult or impossible to measure such detriment in many instances.”

Dark patterns vary across a range of dimensions and come in different forms and designs. They may employ different design-based elements (e.g., use of single or multiple screens; pop-up dialogue boxes or embedded text; variations in coloring and prominence of options, etc.) and text-based elements (e.g., use of emotive or aggressive language).

Categories of Dark Patterns

Dark patterns tend to manipulate consumers’ behavior by relying on cognitive and behavioral biases and heuristics, which primarily include:

  • scarcity heuristic (tendency to place a higher value on scarce options);
  • social proof bias (tendency to make choices that conform with those of others);
  • default bias (tendency to remain with the status quo or default option);
  • sunk-cost fallacy (tendency to persist with a choice on account of the resources invested in it); and
  • framing effects (tendency to make different decisions on the basis of the same information based on how it is presented).

Typically, dark patterns fit into one of the following groups:

Forced Action

Forced-action dark patterns try to compel the user to take an action in order to access a particular functionality.

Example: use of cookie walls (that is, forcing users to accept cookies in order to access a digital platform), or extraction and use of information about the consumer’s contacts (possibly without their consent) in order to grant access to a service.

Interface Interference

Interface-interference dark patterns structure and present information in ways that privilege certain consumer actions favorable to the online business.

Example: visually obscuring important information; giving visual prominence to, or pre-selecting, options favorable to the business (e.g., giving more prominence to the ‘Accept’ button on a cookie consent banner or pre-selecting the ‘Accept’ option on the banner); using intentional ambiguity or trick questions (e.g., double negatives, or confusing labels such as “Ok”, “I understand”, “Dismiss” or “Close” on cookie consent banners); or steering the consumer toward a particular choice through emotive language or framing.
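Two of the interface-interference signals just mentioned, pre-selected options and unequal button prominence, can be expressed as a simple programmatic check. This is a minimal illustrative sketch; the banner field names are hypothetical and are not drawn from the OECD report or any real consent-management API.

```python
# Hypothetical sketch: flag two interface-interference signals on a
# cookie consent banner. Field names are assumptions for illustration.

def interference_signals(banner: dict) -> list[str]:
    """Return a list of potential interface-interference signals."""
    signals = []
    # Pre-selecting the business-favorable option exploits default bias.
    if banner.get("accept_preselected"):
        signals.append("pre-selected 'Accept' option")
    # Giving 'Accept' more visual prominence than 'Reject' steers choice.
    if banner.get("accept_font_size", 0) > banner.get("reject_font_size", 0):
        signals.append("'Accept' more prominent than 'Reject'")
    return signals

dark_banner = {"accept_preselected": True,
               "accept_font_size": 18, "reject_font_size": 10}
print(interference_signals(dark_banner))  # both signals flagged
```

A real audit would of course consider many more design dimensions (color, placement, wording), but the principle is the same: compare how each option is presented, not just whether both options exist.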

Nagging

Dark patterns in this category involve repeated requests to customers to take an action favorable to the business. Such patterns exploit the customer's limited time or self-control.

Example: repeated requests to consumers to turn on notifications or location-tracking.

Obstruction

To discourage an action, obstruction-related dark patterns try to make a task flow or interaction more challenging than it may inherently need to be.

Example: making it hard to cancel a service, withdraw consent, or switch to more privacy-friendly settings; making it hard or impossible to delete an account or consumer information; or creating click paths of different lengths to different options in order to steer consumers toward the “simple” path preferred by the business.
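The click-path asymmetry just described lends itself to a simple audit check: compare how many steps each option takes. The sketch below is illustrative only; the step counts and the 2x threshold are assumptions, not figures from the OECD report.

```python
# Hypothetical sketch: flag obstruction when the path the business
# disfavors (e.g., cancellation) takes markedly more clicks than the
# path it favors (e.g., sign-up). Threshold is an assumption.

def is_obstructive(favored_steps: int, disfavored_steps: int,
                   threshold: float = 2.0) -> bool:
    """True if the disfavored path is at least `threshold` times longer."""
    ratio = disfavored_steps / max(favored_steps, 1)
    return ratio >= threshold

# One click to subscribe, five clicks to cancel: a red flag.
print(is_obstructive(1, 5))  # True
```

Such a ratio is only a heuristic; a long cancellation flow may have legitimate reasons (e.g., identity verification), so flagged cases warrant human review rather than automatic judgment.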

Sneaking

Sneaky dark patterns aim to conceal, obfuscate, or postpone the disclosure of information relevant to the consumer's decision.

Example: adding non-optional charges to a transaction at its final stage, ‘sneaking an item’ into a consumer’s cart without consent, e.g., through a checkbox on a prior page, automatically renewing a purchase without the consumer’s explicit consent, or providing a consumer with unsolicited goods or services.

Social Proof

Social proof bias can be exploited through dark patterns that try to manipulate consumers into making a choice based on observations of other customers' behavior.

Example: notifications of other consumers’ purchasing activities.

Urgency

Dark patterns that involve urgency place a real or false time or quantity limit on a deal to pressure customers into making a purchase, exploiting the scarcity heuristic.

Example: countdown timer indicating the expiry of a deal or discount or time pressure designed to dupe users into consenting.

It’s important to note here that with the advent of new technologies and new kinds of user interfaces, new forms of dark patterns are constantly emerging. Thus, the foregoing is not an exhaustive list of categories of dark patterns.

Coverage of Dark Patterns in Existing or Proposed Legislation

The use of dark patterns ordinarily falls within the scope of consumer protection and data protection laws. Consumer protection laws prohibit the misleading, deceptive, or unfair practices associated with many dark patterns, whereas data protection laws require transparency and that consumers’ consent constitute an unambiguous indication of their wishes.

The following are a few significant existing or proposed laws that address the use of dark patterns by businesses.

California Privacy Rights Act (CPRA)

The CPRA defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation,” and specifies that consumer consent obtained through dark patterns is not legally valid.

EU General Data Protection Regulation (GDPR)

The GDPR does not explicitly deal with the usage of dark patterns. However, certain provisions of the GDPR can guide websites in developing user interfaces in compliance with the provisions of the GDPR.

In this respect, the European Data Protection Board states that data processing principles specified in Article 5 of the GDPR, particularly transparency, data minimization, purpose limitation, and accountability, serve as a starting point in assessing whether a design pattern actually constitutes a “dark pattern”.

Such assessment is also based on conditions of consent under Articles 4 (11) and 7 of the GDPR, that is, whether a digital platform allows consumers to give voluntary, specific, informed, and unambiguous consent and also facilitates consent withdrawal by making it as easy as giving consent. The usage of dark patterns by businesses to manipulate and deceive users into giving consent renders such consent invalid.
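The consent conditions just described (freely given, specific, informed, unambiguous, with withdrawal as easy as giving consent) can be sketched as a record check. The field names and the step-count comparison below are hypothetical simplifications for illustration; the GDPR does not define consent validity in this mechanical form, and real compliance requires legal review.

```python
# Illustrative sketch of the consent conditions discussed above.
# Field names are assumptions, not part of the GDPR or any real API.

REQUIRED_CONDITIONS = ("freely_given", "specific", "informed", "unambiguous")

def consent_valid(record: dict) -> bool:
    """Treat consent as valid only if all four Article 4(11) conditions
    hold and withdrawal takes no more steps than giving consent
    (a rough proxy for 'as easy as giving consent' under Article 7(3))."""
    if not all(record.get(condition) for condition in REQUIRED_CONDITIONS):
        return False
    return record.get("withdraw_steps", float("inf")) <= record.get("give_steps", 0)

# A pre-ticked checkbox yields consent that is not unambiguous:
preticked = {"freely_given": True, "specific": True,
             "informed": True, "unambiguous": False,
             "give_steps": 0, "withdraw_steps": 0}
print(consent_valid(preticked))  # False
```

The step-count comparison mirrors the point in the text: a platform that grants consent in one click but buries withdrawal behind several screens fails the "as easy as giving consent" test even if the initial consent was otherwise well-formed.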

The requirements relating to the transparent provision of information to data subjects under Article 12 of the GDPR, and other data subject rights, also constitute an important aspect of determining whether a user interface is legally compliant or tries to manipulate users into making certain decisions in favor of a business through dark patterns. Moreover, the requirements of data protection by design and default under Article 25 of the GDPR also obligate online platforms to design choice architecture in a manner that encourages users to make free and informed choices in relation to their personal data.

It is important to note that the French data protection authority (CNIL) has developed practical examples of user interfaces aimed at helping designers comply with the GDPR. Similarly, the EDPB has released guidelines that provide practical recommendations and best practices to designers and users of social media platforms on assessing and avoiding dark patterns in digital platforms that infringe GDPR requirements.

EU Digital Services Act (DSA)

The DSA is EU legislation adopted in 2022 that imposes additional requirements on online platforms. The DSA prohibits digital platforms from designing or operating online interfaces that purposefully or otherwise materially distort or impair the ability of the recipients of the service to make autonomous and informed choices. Further, the DSA provides that the European Commission may issue guidance on how this prohibition applies to specific dark patterns.

EU Digital Markets Act (DMA)

The DMA is EU legislation passed in 2022 that imposes new requirements on large online platforms designated as “gatekeepers”. Gatekeepers are prohibited from presenting choices to end users or business users in a non-neutral way, or from undermining their autonomy, decision-making, or freedom of choice through the structure, design, function, or mode of operation of a user interface or a portion thereof.

US Deceptive Experiences To Online Users Reduction (DETOUR) Act

The DETOUR Act, initially introduced in 2019 but not yet passed, is the first piece of proposed federal legislation in the US that aims to explicitly prohibit dark patterns on online platforms. Under the DETOUR Act, it would be unlawful for any large online platform “to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”

Tools to Help Businesses Avoid the Use of Dark Patterns

In addition to complying with legal and regulatory requirements, businesses may rely on a number of tools to enhance their understanding of dark commercial patterns and avoid their usage.

  • Many advertising and marketing self- or co-regulatory initiatives have issued principles and guidelines related to ethical online business practices, including fair marketing practices and the protection of the personal data of consumers.
  • Corporate Digital Responsibility initiatives around the globe also provide guidance on the ethical use of digital technology and recommend measures that may help businesses be transparent and respect the freedom of choice of consumers.
  • Businesses may also rely on codes of conduct or ethics developed by the user interface/user experience design community to avoid dark patterns and design ethical user interfaces.
  • Businesses should adopt self-regulatory mechanisms, such as internal audits, to review their business processes and design choices for user interfaces so that they may identify and mitigate dark patterns.
  • Businesses may also conduct regular reviews of their algorithms to identify and remove unnecessary friction in consumer decision-making processes.
  • Businesses are further recommended to develop standardized transparency impact assessment processes for interface designs and conduct A/B testing to ensure their compliance with legal and regulatory requirements and make their platforms more user-friendly.

How Securiti Can Help

It is essential for businesses to avoid the use of dark patterns in the face of increasing restrictions placed by consumer protection and data protection laws around the globe. In instances of non-compliance, businesses run the risk of incurring severe monetary penalties and other enforcement actions. In this regard, Securiti, with its multitude of data compliance and governance-related enterprise solutions, can help businesses remain legally compliant.

Specifically, Securiti’s consent management platform allows organizations to develop user-friendly interfaces which enable consumers to provide informed, specific, voluntary, and unambiguous consent regarding their choices in the digital sphere.

Request a demo today and learn more about how Securiti can help you comply with evolving consumer protection and data protection laws.
