An Overview of OECD’s Report Highlighting Concerns Over Dark Patterns

Published December 23, 2022
Author

Omer Imran Malik

Senior Data Privacy Consultant at Securiti

FIP, CIPT, CIPM, CIPP/US


In today’s digital world, there is a growing fear that unethical business practices in online environments, such as the use of dark patterns, could cause significant harm to consumers. Dark patterns are ordinarily employed in online user interfaces and lead, trick, pressure, or otherwise influence customers into making decisions that are frequently not in their best interests.

The Organization for Economic Co-operation and Development’s (OECD) Committee on Consumer Policy (CCP) first conducted a roundtable conference in November 2020 because of the growing necessity to examine dark commercial patterns thoroughly.

Following the event, the OECD published a report which builds on the roundtable discussion, offers a working definition of dark commercial patterns, provides evidence of their prevalence and harms, and identifies potential regulatory and enforcement measures to help policymakers and authorities address them. The report also documents the possible approaches businesses could adopt to mitigate dark patterns.

What are Dark Commercial Patterns?

The phrase "dark (commercial) patterns" describes a wide range of methods frequently used in online user interfaces to influence customers to make decisions that may not be in their best interests, notably by exploiting their cognitive biases. According to the OECD, numerous e-commerce websites, apps, browsers, online games, and cookie consent messages exhibit dark patterns.

Even though the phrase ‘dark patterns’ is relatively new, many of the practices it refers to have long been used by corporations and marketers. Such practices have been the subject of scrutiny by behavioral scientists and legal experts, and have been made punishable by law in many jurisdictions.

Typically, dark patterns aim to manipulate customers into parting with more money, personal information, or attention than they would otherwise choose to. Dark patterns are therefore intrinsically linked to business models, even where user interface designers have no malicious intent. They may also involve the use of artificial intelligence technologies to exploit consumers' biases in a business's favor.

To ease discussions among regulators and decision-makers across jurisdictions, the OECD Committee on Consumer Policy offers a working definition of dark patterns, which is reproduced below.

“Dark commercial patterns are business practices employing elements of digital choice architecture, particularly in online user interfaces, that subvert or impair consumer autonomy, decision-making or choice. They often deceive, coerce or manipulate consumers and are likely to cause direct or indirect consumer detriment in various ways, though it may be difficult or impossible to measure such detriment in many instances.”

Dark patterns vary across a range of dimensions and come in different forms and designs. They may employ different design-based elements (e.g., use of single or multiple screens; pop-up dialogue boxes or embedded text; variations in coloring and prominence of options, etc.) and text-based elements (e.g., use of emotive or aggressive language).

Categories of Dark Patterns

Dark patterns tend to manipulate consumers’ behavior by relying on cognitive and behavioral biases and heuristics, which primarily include:

  • scarcity heuristic (tendency to place a higher value on scarce options);
  • social proof bias (tendency to make choices that conform with those of others);
  • default bias (tendency to remain with the status quo or default option);
  • sunk-cost fallacy (tendency to persist with a choice on account of the resources invested in it); and
  • framing effects (tendency to make different decisions on the basis of the same information based on how it is presented).

Typically, dark patterns fit into one of the following groups:

Forced Action

Forced-action dark patterns try to compel the user to take an action in order to access a particular functionality.

Example: use of cookie walls (that is, forcing users to accept cookies in order to access a digital platform), or extraction and use of information about the consumer’s contacts (possibly without their consent) in order to grant access to a service.

Interface Interference

Interface-interference dark patterns structure or present information in ways that privilege certain consumer actions favorable to the online business.

Example: visually obscuring important information, giving visual prominence to or pre-selecting options favorable to the business (e.g., giving more prominence to the ‘Accept’ button on a cookie consent banner or pre-selecting the ‘Accept’ option on the banner), using trick questions or deliberately ambiguous language (e.g., double negatives, or confusing labels such as “Ok”, “I understand”, “Dismiss” or “Close” on cookie consent banners); or manipulating the consumer toward a particular choice through emotive language or framing.
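The pre-selection and prominence issues above can be checked mechanically. The sketch below is a minimal illustration in Python; the `ConsentOption` model and the `banner_is_neutral` function are hypothetical names invented for this example, not any real library's API.

```python
from dataclasses import dataclass

@dataclass
class ConsentOption:
    label: str          # button text shown to the user
    preselected: bool   # whether the option is checked by default
    emphasized: bool    # whether the option is visually highlighted

def banner_is_neutral(options: list[ConsentOption]) -> bool:
    """Flag common interface-interference patterns in a consent banner:
    pre-selected choices and unequal visual prominence between options."""
    if any(opt.preselected for opt in options):
        return False  # consent should be an affirmative act, not a default
    # All options should share the same level of emphasis (all plain or all
    # highlighted); a mixed set signals that one choice is being pushed.
    return len({opt.emphasized for opt in options}) == 1

# A banner that highlights only "Accept All" would be flagged as biased
biased = [
    ConsentOption("Accept All", preselected=False, emphasized=True),
    ConsentOption("Reject All", preselected=False, emphasized=False),
]
neutral = [
    ConsentOption("Accept All", preselected=False, emphasized=False),
    ConsentOption("Reject All", preselected=False, emphasized=False),
]
```

Such a check is necessarily approximate: real prominence also depends on color, size, and placement, which this toy model reduces to a single flag.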

Nagging

Dark patterns in this category involve repeated requests that customers take an action favorable to the business, exploiting the customer's limited time or self-control.

Example: repeated requests to consumers to turn on notifications or location-tracking.

Obstruction

To discourage an action, obstruction-related dark patterns try to make a task flow or interaction more challenging than it may inherently need to be.

Example: making it hard to cancel a service, withdraw consent, or opt for more privacy-friendly settings; making it hard or impossible to delete an account or consumer information; or creating click paths of different lengths to different options in order to steer consumers toward the “simple” path preferred by the business.
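The asymmetric click paths in the last example lend themselves to a simple audit. Below is an illustrative sketch, assuming a hypothetical mapping from each choice to the number of interactions it requires; the function name is an invention for this example.

```python
def find_obstructed_options(click_paths: dict[str, int]) -> list[str]:
    """Return the options whose click path is longer than the shortest
    one offered, a basic signal of obstruction-style steering toward
    the choice the business prefers."""
    shortest = min(click_paths.values())
    return [option for option, steps in click_paths.items() if steps > shortest]

# A consent flow where accepting takes one click but rejecting takes three
# screens and customizing takes four would flag both longer paths.
paths = {"accept_all": 1, "reject_all": 3, "customize": 4}
```

In practice an auditor might tolerate small differences and flag only large ratios, but equal-length paths are the cleanest baseline.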

Sneaking

Sneaky dark patterns aim to conceal, obfuscate, or postpone the disclosure of information relevant to the consumer's decision.

Example: adding non-optional charges to a transaction at its final stage, ‘sneaking an item’ into a consumer’s cart without consent, e.g., through a checkbox on a prior page, automatically renewing a purchase without the consumer’s explicit consent, or providing a consumer with unsolicited goods or services.

Social Proof

Social proof bias can be exploited through dark patterns that try to manipulate consumers into making a choice based on observations of other customers' behavior.

Example: notifications of other consumers’ purchasing activities.

Urgency

Dark patterns that involve urgency place a real or false time or quantity limit on a deal to pressure customers into making a purchase, exploiting the scarcity heuristic.

Example: countdown timer indicating the expiry of a deal or discount or time pressure designed to dupe users into consenting.

It’s important to note here that with the advent of new technologies and new kinds of user interfaces, new forms of dark patterns are constantly emerging. Thus, the foregoing is not an exhaustive list of categories of dark patterns.

Coverage of Dark Patterns in Existing or Proposed Legislation

The usage of dark patterns ordinarily falls within the scope of consumer protection and data protection laws. Consumer protection laws prohibit the misleading, deceptive, or unfair practices associated with many dark patterns, whereas data protection laws require data processing to be transparent and consumers’ consent to be an unambiguous indication of their wishes.

The following are a few significant existing or proposed laws that address the use of dark patterns by businesses.

California Privacy Rights Act (CPRA)

The CPRA defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation,” and specifies that consumer consent obtained through dark patterns is not legally valid.

EU General Data Protection Regulation (GDPR)

The GDPR does not explicitly deal with the usage of dark patterns. However, certain of its provisions can guide websites in developing compliant user interfaces.

In this respect, the European Data Protection Board states that data processing principles specified in Article 5 of the GDPR, particularly transparency, data minimization, purpose limitation, and accountability, serve as a starting point in assessing whether a design pattern actually constitutes a “dark pattern”.

Such assessment is also based on conditions of consent under Articles 4 (11) and 7 of the GDPR, that is, whether a digital platform allows consumers to give voluntary, specific, informed, and unambiguous consent and also facilitates consent withdrawal by making it as easy as giving consent. The usage of dark patterns by businesses to manipulate and deceive users into giving consent renders such consent invalid.
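The consent conditions described above can be summarized in a short sketch. The `ConsentRecord` model and `consent_is_valid` function below are assumptions made for illustration only; they approximate, rather than implement, the legal test under Articles 4(11) and 7 of the GDPR.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    purpose: str               # a specific processing purpose, e.g. "analytics"
    affirmative_action: bool   # the user actively opted in (no pre-ticked box)
    informed: bool             # the purpose was explained before the choice
    steps_to_give: int         # interactions needed to grant consent
    steps_to_withdraw: int     # interactions needed to withdraw it

def consent_is_valid(record: ConsentRecord) -> bool:
    """Approximate the conditions of consent: an informed, unambiguous
    affirmative act for a specific purpose, with withdrawal made as easy
    as giving consent in the first place."""
    return (
        record.affirmative_action
        and record.informed
        and record.steps_to_withdraw <= record.steps_to_give
    )

# Withdrawal as easy as consent passes; a five-step withdrawal flow fails.
valid = ConsentRecord("analytics", True, True, steps_to_give=2, steps_to_withdraw=2)
invalid = ConsentRecord("analytics", True, True, steps_to_give=1, steps_to_withdraw=5)
```

A real consent management implementation would also need to record timestamps and evidence of the disclosure shown, which this sketch omits.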

The requirements relating to the transparent provision of information to data subjects under Article 12 of the GDPR, and other data subject rights, also constitute an important aspect of determining whether a user interface is legally compliant or tries to manipulate users into making certain decisions in favor of a business through dark patterns. Moreover, the requirements of data protection by design and default under Article 25 of the GDPR also obligate online platforms to design choice architecture in a manner that encourages users to make free and informed choices in relation to their personal data.

It is important to note that the French data protection authority (CNIL) has developed practical examples of user interfaces aimed at helping designers comply with the GDPR. Similarly, the EDPB has released guidelines that provide practical recommendations and best practices to designers and users of social media platforms on assessing and avoiding dark patterns in digital platforms that infringe GDPR requirements.

EU Digital Services Act (DSA)

The DSA is an EU legislation adopted in 2022 that imposes additional requirements on online platforms. The DSA prohibits digital platforms from designing or operating online interfaces in a way that deceives or manipulates the recipients of the service, or that otherwise materially distorts or impairs their ability to make free and informed decisions. Further, the DSA provides that the European Commission may issue guidance on how this prohibition applies to specific dark patterns.

EU Digital Markets Act (DMA)

The DMA is an EU legislation passed in 2022 that imposes new requirements on large online platforms (designated as "gatekeepers"). Gatekeepers are prohibited from presenting choices to end users or business users in a non-neutral way, or from undermining their autonomy, decision-making, or freedom of choice through the structure, design, function, or mode of operation of a user interface or any part of it.

US Deceptive Experiences To Online Users Reduction (DETOUR) Act

The DETOUR Act, initially introduced in 2019 but not yet passed, is the first piece of proposed federal legislation in the US that explicitly aims to prohibit dark patterns on online platforms. Under the DETOUR Act, it would be unlawful for any large online platform “to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”

Tools to Help Businesses Avoid the Use of Dark Patterns

In addition to complying with legal and regulatory requirements, businesses may rely on a number of tools to enhance their understanding of dark commercial patterns and avoid their usage.

  • Many advertising and marketing self- or co-regulatory initiatives have issued principles and guidelines related to ethical online business practices, including fair marketing practices and the protection of the personal data of consumers.
  • Corporate Digital Responsibility initiatives around the globe also provide guidance on the ethical use of digital technology and recommend measures that may help businesses be transparent and respect the freedom of choice of consumers.
  • Businesses may also rely on codes of conduct or ethics developed by the user interface/user experience design community to avoid dark patterns and design ethical user interfaces.
  • Businesses should adopt self-regulatory mechanisms, such as internal audits, to review their business processes and design choices for user interfaces so that they may identify and mitigate dark patterns.
  • Businesses may also conduct regular reviews of unnecessary friction in consumer decision-making processes to evaluate their algorithms.
  • Businesses are further recommended to develop standardized transparency impact assessment processes for interface designs and conduct A/B testing to ensure their compliance with legal and regulatory requirements and make their platforms more user-friendly.

How Securiti Can Help

It is essential for businesses to avoid the use of dark patterns in the face of increasing restrictions placed by consumer protection and data protection laws around the globe. In instances of non-compliance, businesses run the risk of incurring severe monetary penalties and other enforcement actions. In this regard, Securiti, with its multitude of data compliance and governance-related enterprise solutions, can help businesses remain legally compliant.

Specifically, Securiti’s consent management platform allows organizations to develop user-friendly interfaces which enable consumers to provide informed, specific, voluntary, and unambiguous consent regarding their choices in the digital sphere.

Request a demo today and learn more about how Securiti can help you comply with evolving consumer protection and data protection laws.
