Privacy at an Inflection Point: From Consent to Control
Across jurisdictions, privacy regulation is shifting from notice-and-consent models to enforcement-led accountability. Regulators are no longer satisfied with formal compliance—they are scrutinizing outcomes, user friction, and real-world impact.
Three clear trends are emerging. First, sensitive data categories (health, biometrics) and vulnerable groups (children) are becoming enforcement priorities, with stricter thresholds and, in some cases, outright prohibitions. Second, design choices are now regulatory risk: dark patterns, default settings, and product architecture are being treated as compliance failures, not UX decisions. Third, operational readiness is under pressure, with tighter timelines for breach detection, DPIAs, and governance obligations.
Looking ahead, organizations should expect less flexibility, faster enforcement, and deeper scrutiny of data ecosystems, including vendors and adtech. Privacy is no longer a legal checkbox; it is becoming a core product and risk function.
North America Jurisdiction
1. FTC Takes Action Against Match Group and Subsidiary OkCupid
March 31, 2026 United States
On March 31, 2026, the Federal Trade Commission (FTC) took enforcement action against Match Group and its subsidiary OkCupid for allegedly misleading users about its data practices.
The FTC alleges that OkCupid shared highly sensitive user data, including information relating to sexual orientation, political views, and drug use, with third-party analytics and advertising partners, despite representing that such data would remain private. The complaint also highlights failures to implement reasonable data security measures and allegations that account deletion processes were unnecessarily burdensome.
This action reinforces the FTC’s focus on misrepresentation of privacy practices, particularly involving sensitive user attributes, and signals continued scrutiny of third-party data sharing and user control mechanisms in consumer platforms.
2. Court Approves Google RTB Settlement Introducing User Privacy Control
A U.S. federal court has approved a class-action settlement requiring Google to introduce a user control limiting data shared in real-time bidding (RTB) auctions. When enabled, the control removes identifiers such as user IDs, device advertising IDs, and IP addresses from bid requests, restricting tracking and targeted advertising.
The case alleged that Google’s RTB system shared personal data with numerous third parties in ways that conflicted with its privacy representations. While approving the settlement, the court noted its limited impact, emphasizing that the control requires affirmative activation and may see low user uptake.
This development highlights increasing scrutiny of adtech data-sharing practices and raises ongoing questions about the real-world effectiveness of user-driven privacy controls.
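To make the mechanism concrete, the settlement's control can be pictured as a pre-send redaction step over an OpenRTB-style bid request. The field paths below follow common OpenRTB conventions (`user.id`, `device.ifa`, `device.ip`), but the surrounding logic is an illustrative sketch, not Google's implementation:

```python
import copy

# Identifier fields to strip when the user has enabled the control.
# Paths are (top-level object, field) pairs in an OpenRTB-style request.
REDACTED_PATHS = [("user", "id"), ("device", "ifa"), ("device", "ip")]

def redact_bid_request(bid_request: dict) -> dict:
    """Return a copy of the bid request with tracking identifiers removed."""
    redacted = copy.deepcopy(bid_request)
    for obj_key, field in REDACTED_PATHS:
        obj = redacted.get(obj_key)
        if isinstance(obj, dict):
            obj.pop(field, None)
    return redacted

request = {
    "id": "req-123",
    "user": {"id": "u-456"},
    "device": {"ifa": "ad-789", "ip": "203.0.113.7", "os": "android"},
}
clean = redact_bid_request(request)
# Contextual fields survive; user-level identifiers are gone.
assert clean["device"] == {"os": "android"}
assert "id" not in clean["user"]
```

Note that because the redaction only runs when the user has affirmatively enabled the control, the court's concern about low uptake applies: untouched requests still carry all identifiers.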
3. South Dakota Enacts Genetic Data Privacy Law
March 23, 2026 South Dakota, United States
On March 23, 2026, South Dakota enacted a new genetic data privacy law aimed at safeguarding consumer genetic information. The Act applies to direct-to-consumer genetic testing companies and will take effect on July 1, 2026.
The law requires companies to provide clear privacy notices and obtain express, granular consent for the collection, use, and sharing of genetic data, including separate consent for third-party disclosures, secondary uses, sample retention, and marketing. Consumers must be able to withdraw consent, with companies required to destroy biological samples within 30 days.
The Act also grants rights to access and delete genetic data and mandates reasonable security safeguards. Certain entities, including HIPAA-covered organizations and law enforcement labs, are exempt. Violations may result in penalties of up to $5,000 per violation.
4. Oklahoma Enacts Comprehensive Consumer Data Privacy Law
March 20, 2026 Oklahoma, United States
Oklahoma has enacted a comprehensive consumer privacy law, joining the growing list of U.S. states adopting GDPR-inspired frameworks. The law applies to businesses meeting certain thresholds, including processing data of 100,000 consumers or deriving significant revenue from data sales.
It grants standard consumer rights, including access, correction, deletion, portability, and the ability to opt out of data sales, targeted advertising, and certain profiling. Notably, “sale” is narrowly defined as monetary exchange only, and the law does not require recognition of universal opt-out signals.
Controllers must implement data minimization, security safeguards, and conduct DPIAs for high-risk processing, including targeted advertising and sensitive data use.
The law takes effect January 1, 2027, with enforcement by the Oklahoma Attorney General and penalties of up to $7,500 per violation.
5. New Mexico Jury Orders Meta to Pay $375M Over Platform Safety Failures
A New Mexico jury has ordered Meta to pay $375 million in civil penalties after finding the company misled consumers about platform safety and enabled harm, including child sexual exploitation.
The case, brought by the state under consumer protection laws, alleged that Meta failed to adequately address known risks on its platforms and misrepresented its safety practices. The jury imposed the maximum statutory penalty of $5,000 per violation.
The ruling is notable as one of the first jury decisions holding a platform liable for harms linked to its product design, rather than user-generated content protections under Section 230. Meta has stated it will appeal.
The decision signals increasing regulatory and litigation risk for platforms around child safety, product design, and transparency of risk disclosures.
6. HHS OCR Settles HIPAA Case Involving 15M-Record Breach
March 5, 2026 United States
The HHS Office for Civil Rights (OCR) has reached a settlement with MMG Fusion, LLC, following a breach affecting approximately 15 million individuals.
OCR found potential violations of HIPAA’s Privacy, Security, and Breach Notification Rules, including failure to conduct a risk analysis, delayed breach notification, and impermissible disclosure of protected health information (PHI). The incident involved unauthorized access to sensitive data, later exposed on the dark web.
Under the settlement, MMG will implement a three-year corrective action plan and pay $10,000, with OCR noting the company’s financial condition.
This marks another enforcement action under OCR’s Risk Analysis Initiative, reinforcing that risk assessments and timely breach notification remain core compliance obligations for business associates.
7. California Regulator Fines Ford Over Opt-Out Friction
March 5, 2026 California, United States
The California Privacy Protection Agency has ordered Ford Motor Company to pay a $375,703 fine and revise its practices for violating the California Consumer Privacy Act (CCPA).
The regulator found that Ford introduced unnecessary friction in the opt-out process by requiring users to verify their email before exercising their right to opt out of the sale or sharing of personal data. As a result, requests were not processed unless this additional step was completed.
Under the settlement, Ford must simplify its opt-out mechanisms, audit its tracking technologies, and ensure compliance with opt-out preference signals such as the Global Privacy Control.
This action reinforces that opt-out rights must be easy to exercise, and that adding procedural barriers may constitute a violation of CCPA requirements.
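The opt-out preference signal the settlement references, Global Privacy Control (GPC), is transmitted as the `Sec-GPC: 1` request header under the GPC specification. The sketch below shows the frictionless handling the CPPA's order points toward; the handler and opt-out function names are hypothetical:

```python
# Minimal sketch of honoring a Global Privacy Control signal. Per the
# GPC specification, the signal arrives as the "Sec-GPC: 1" header.
def gpc_opt_out_requested(headers: dict) -> bool:
    """True when the request carries a valid GPC opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def handle_request(headers: dict, user_id: str, opt_out) -> None:
    if gpc_opt_out_requested(headers):
        # Process the opt-out immediately -- no email verification,
        # login wall, or other added friction gating the right.
        opt_out(user_id)

opted_out = []
handle_request({"Sec-GPC": "1"}, "user-1", opted_out.append)
handle_request({}, "user-2", opted_out.append)
assert opted_out == ["user-1"]
```

The design point mirrors the Ford order: the opt-out fires on receipt of the signal itself, with no extra verification step interposed between the signal and the processing of the request.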
8. Virginia Moves to Ban Sale of Precise Location Data
March 2, 2026 Virginia, United States
Virginia is set to enact legislation prohibiting the sale of precise geolocation data, marking a significant shift from its current privacy framework. The bill, SB 338, is expected to be signed into law by the Governor.
The measure departs from the existing Virginia Consumer Data Protection Act (VCDPA), which allows the processing and sale of precise location data with user consent. Instead, the new law introduces a stricter approach by banning such sales outright, regardless of consent.
The development signals increasing regulatory focus on highly sensitive data categories, particularly precise location data, and reflects a move away from consent-based models toward outright prohibitions in certain contexts.
9. PlayOn Sports Fined $1.1M for Student Data Privacy Violations
March 2, 2026 California, United States
The California Privacy Protection Agency has fined PlayOn Sports $1.1 million for violations of the California Consumer Privacy Act (CCPA), marking its first enforcement action involving student data.
The regulator found that PlayOn used tracking technologies for targeted advertising while requiring users, including students, to accept tracking to access tickets and services, without providing a valid opt-out mechanism. The company also failed to recognize opt-out preference signals and relied on third-party tools instead of offering its own opt-out process. Under the settlement, PlayOn must implement proper opt-out mechanisms, enhance transparency, conduct risk assessments, and obtain opt-in consent for data sharing involving minors aged 13-16.
This action highlights heightened scrutiny on children’s data, dark patterns, and adtech practices in educational contexts.
10. DSA Guidance Clarifies Age Assurance and Child Protection Measures
March 24, 2026 Spain
European regulators, including the Spanish Data Protection Agency and the National Commission for Markets and Competition, have issued guidance on implementing Article 28 of the Digital Services Act (DSA), focusing on protecting minors online.
The guidance emphasizes privacy-preserving age assurance, encouraging solutions such as anonymized tokens and the EU Digital Identity Wallet over intrusive methods like ID collection or facial analysis. Platforms must adopt proportionate, risk-based approaches, ensuring high-risk services (e.g., adult content) apply stronger age verification, while avoiding unnecessary data collection.
Importantly, the DSA does not require platforms to identify users to verify age, reinforcing alignment with GDPR principles such as data minimization.
The guidance underscores that child protection must be built into service design, balancing safety with fundamental rights through privacy-by-design approaches.
11. ICO Publishes Guidance On Recognized Legitimate Interest
March 23, 2026 United Kingdom
The Information Commissioner's Office (ICO) has issued updated guidance on “recognized legitimate interest” under the UK GDPR.
This lawful basis applies to a limited set of pre-approved public interest purposes, including crime prevention, national security, safeguarding, emergencies, and certain information-sharing with public authorities. Unlike standard legitimate interests, organizations do not need to conduct a balancing test, as this has already been determined by law. However, organizations must still demonstrate necessity, ensure transparency, and comply with all other data protection obligations. The ICO also clarified that this basis is not available to public authorities when performing their core public tasks.
The guidance provides greater clarity on when organizations can rely on this streamlined lawful basis, particularly in high-risk or public interest scenarios.
12. CNIL Clarifies Limits on Audio Recording in Video Surveillance
March 20, 2026 France
The CNIL has clarified that audio recording as part of video surveillance systems is generally prohibited due to risks to privacy and freedom of expression.
However, standalone audio recording devices may be permitted in limited cases, such as responding to safety incidents, provided they are not continuously recording or linked to video systems, and are activated manually by staff.
Any use must be exceptional, necessary, and proportionate, with strict safeguards including limited retention (only in case of incidents), clear user and employee notice, and restricted access. Continuous monitoring of employees is expressly discouraged.
The guidance reinforces that even security-driven measures must comply with GDPR principles, particularly data minimization and proportionality.
13. CJEU Clarifies When DSARs Can Be Refused as “Excessive”
March 20, 2026
The Court of Justice of the European Union (CJEU) has clarified when data subject access requests (DSARs) may be refused as “excessive” under GDPR.
In Brillen Rottler v TC (Case C-526/24), the Court held that even a first DSAR can be deemed excessive if made with abusive intent, such as to manufacture grounds for compensation rather than to verify data processing. Controllers may assess factors including timing, conduct, and patterns of repeated claims, but bear the burden of proof. The Court also clarified that compensation under Article 82 GDPR requires actual damage and a causal link, which may be broken if the data subject’s own conduct caused the alleged harm.
The ruling reinforces that refusals remain exceptional and must be strictly justified.
14. CNIL Issues Guidance on Web Filtering Proxy Servers
March 13, 2026 France
The CNIL has issued recommendations on the use of web filtering proxy servers, emphasizing GDPR-compliant deployment of cybersecurity tools.
While such tools help organizations meet security obligations under GDPR (e.g., Article 32), CNIL stresses that they involve personal data processing and must comply with principles such as lawful basis, data minimization, transparency, and limited retention. The guidance highlights risks associated with advanced features like HTTPS decryption and behavioral analysis, cautioning against excessive monitoring of employees. It also encourages a privacy-by-design approach, particularly for solution providers.
The recommendations apply to employers deploying such tools for professional internet access, reinforcing the need to balance cybersecurity objectives with employee privacy rights.
15. Intesa Sanpaolo Fined €17.6M Over Unlawful Customer Profiling
March 12, 2026 Italy
The Italian Data Protection Authority has fined Intesa Sanpaolo €17.6 million for unlawfully processing the data of approximately 2.4 million customers during a migration to its digital subsidiary, Isybank.
The regulator found that the bank conducted profiling without a valid legal basis to select customers for transfer, based on criteria such as age, digital usage, and financial status. The migration resulted in significant changes, including new IBANs and a shift to app-only banking. The Authority also highlighted insufficient transparency, noting that customers were not adequately informed of the transaction or its implications.
The decision underscores heightened scrutiny on profiling, transparency, and customer impact in large-scale data-driven business transformations.
16. Luxembourg Court Annuls €746M Amazon GDPR Fine
The Luxembourg court has annulled a €746 million GDPR fine imposed on Amazon, but referred the case back to the CNPD for reassessment.
While overturning the penalty, the court upheld the underlying GDPR violations, finding procedural shortcomings in the regulator’s decision—specifically, failure to assess whether the infringement was intentional or negligent, and whether a less severe sanction was appropriate.
The case, linked to online behavioral advertising practices, will now be reconsidered by the CNPD in line with updated legal standards. Both parties indicated that the issues have since been addressed.
The ruling highlights increased judicial scrutiny of GDPR enforcement procedures, particularly around penalty justification and proportionality.
17. CNIL Releases FAQs on EU‑Recognized Data Altruism Organizations (OADs)
March 6, 2026 France
The CNIL has released FAQs on EU-recognized data altruism organizations (OADs) under the Data Governance Act (DGA).
OADs enable voluntary data sharing for public interest purposes (e.g., research or climate initiatives) within a regulated framework. To qualify, entities must be non-profit, legally independent from commercial actors, and ensure clear separation of altruistic activities.
The guidance clarifies that while OADs may use processors, they must comply with GDPR requirements, including Article 28, and prevent any reuse of data for commercial purposes. Financially, OADs may charge proportionate fees and compensate data contributors, but cannot distribute profits, which must be reinvested.
The framework aims to support a trusted and transparent data-sharing ecosystem across the EU.
18. Denmark’s Datatilsynet Outlines GDPR Rules For Political Parties During Election Campaigns
March 6, 2026 Denmark
Denmark’s Datatilsynet has issued guidance on how political parties may process personal data during election campaigns.
The authority emphasized that sensitive data, including political opinions, can only be processed where individuals have clearly made it public or where a specific legal exemption applies. Notably, an open social media profile does not qualify as public disclosure.
Parties are also prohibited from inferring political views through online behavior for targeted advertising without explicit, GDPR-compliant consent. Any consent must be freely given, specific, informed, and easily withdrawable. The guidance further reinforces obligations around transparency, data minimization, and accountability, including oversight of volunteers and third-party processors.
This highlights heightened scrutiny on political profiling and targeted campaigning practices under GDPR.
19. Spain’s AEPD Fines FC Barcelona For Missing DPIA on Biometric Member Verification System
March 4, 2026 Spain
Spain’s AEPD has fined FC Barcelona for failing to conduct a Data Protection Impact Assessment (DPIA) before deploying a biometric verification system.
The system used facial and voice recognition to verify members and enable services such as ticketing. The regulator classified it as high-risk processing and automated decision-making, triggering mandatory DPIA requirements under GDPR. The AEPD rejected the club’s post-implementation “risk analyses,” noting they were conducted after deployment and did not assess less intrusive alternatives or key risks.
The decision reinforces that DPIAs must be prospective, substantive, and proportionate, particularly for biometric systems, regardless of whether alternative non-biometric options are offered.
20. Court Awards Damages Against Meta for Unlawful Tracking
March 2, 2026 Germany
A German court has ordered Meta to pay €3,000 in damages to a user for unlawful tracking practices.
The court found that Meta’s tracking tools embedded on third-party websites enabled monitoring of users’ activity, even without login or valid consent, violating core GDPR principles. The ruling highlighted risks where such tracking could reveal sensitive data, including health-related information inferred from browsing behavior. In addition to damages, Meta was ordered to provide access to all personal data collected and delete it.
The decision reinforces growing judicial scrutiny of cross-site tracking, inferred sensitive data, and consent validity under GDPR. The ruling is not yet final and may be appealed.
21. UK ICO Launches Interactive Tool To Assess International Data Transfers
March 1, 2026 United Kingdom
The Information Commissioner's Office (ICO) has launched an interactive tool to help organizations assess whether their data transfers qualify as “restricted transfers” under the UK GDPR.
Based on a short questionnaire, the tool guides users through a three-step test to determine whether transfer rules apply when personal data is sent or made accessible outside the UK. It also directs users to relevant guidance depending on their scenario. The ICO notes that the tool is not legally binding but aims to support compliance by clarifying when safeguards are required to protect personal data transferred internationally.
This initiative reflects continued regulatory focus on cross-border data flows and practical compliance support for organizations.
22. South Korea Introduces Risk-Based Overhaul of Pseudonymized Data Processing Guidelines
March 31, 2026 South Korea
South Korea’s Personal Information Protection Commission (PIPC) has overhauled its guidelines on pseudonymized data to reduce regulatory barriers and support AI development.
The revised framework introduces standardized, three-tier risk classifications based on data use and processing environments, replacing previously inconsistent, case-by-case assessments. It also significantly reduces compliance burden by streamlining documentation and simplifying procedures for lower-risk use cases.
Notably, the guidelines allow reuse of pseudonymized data for similar purposes, introduce flexible retention standards, and permit sample-based inspections for large-scale unstructured datasets such as images and text.
The reform signals a shift toward enabling data-driven innovation while maintaining risk-based safeguards, reflecting growing global efforts to balance privacy with AI advancement.
23. Vietnam Advances Draft Decree on Cybersecurity & Personal Data Protection
March 18, 2026 Vietnam
Following the entry into force of the Personal Data Protection Law (PDPL) and Decree 356/2025/ND-CP (both effective January 1, 2026), Vietnam’s Ministry of Public Security (MPS) has advanced a comprehensive Draft Decree on Administrative Sanctions in cybersecurity and personal data protection.
The draft framework is expected to strengthen enforcement by defining penalties for non-compliance across data processing, security obligations, and regulatory requirements.
This development signals Vietnam’s continued move toward a more structured and enforceable data protection and cybersecurity regime, with increased emphasis on accountability and regulatory oversight.
24. Australia’s OAIC Issues Guidance On Age Assurance Technologies
March 17, 2026 Australia
Australia’s Office of the Australian Information Commissioner has issued guidance on age assurance technologies following new rules on age-restricted content and proposed social media age limits.
The guidance requires organizations to ensure age-checking methods comply with the Privacy Act 1988, emphasizing a privacy-by-design approach. This includes conducting Privacy Impact Assessments, limiting data collection (e.g., using binary age confirmation), and implementing strong data retention and deletion controls. The regulator also cautions against intrusive verification methods where less data-intensive alternatives are available.
This reflects growing global focus on balancing child safety measures with data minimization and proportionality principles.
25. New Cybersecurity Law Implementation Framework Announced
March 16, 2026 Vietnam
Vietnam has advanced implementation of its 2025 Cybersecurity Law, with the government issuing a roadmap to operationalize the framework. The Ministry of Public Security has been designated as the lead authority, tasked with drafting key decrees and overseeing enforcement.
The implementation plan includes a comprehensive legal review and the development of regulations covering areas such as administrative sanctions and cybersecurity controls. The law also introduces provisions addressing emerging risks, including the misuse of AI technologies (e.g., deepfakes), and establishes a multi-tier cybersecurity force to strengthen national digital security capabilities.
This development reflects Vietnam’s continued expansion of centralized cybersecurity governance and regulatory enforcement.
26. Australia’s OAIC Responds to Tribunal Ruling on Bunnings Facial Recognition
The Office of the Australian Information Commissioner (OAIC) has issued a formal response to the Administrative Review Tribunal’s (ART) landmark decision regarding Bunnings Group Limited’s use of facial recognition technology (FRT).
The OAIC acknowledged that while Bunnings may use FRT to combat significant retail crime, the company failed to notify customers and lacked appropriate governance policies. The OAIC also emphasized that Privacy Act safeguards apply to biometric data even when it is held for mere milliseconds.
The ruling confirms that privacy protections attach at the moment of data capture, regardless of storage duration, and that retail security needs do not exempt a business from transparent notification and strict governance obligations.
27. South Korea Proposes PIPA Amendments to Tighten Breach Notification Rules
March 2, 2026 South Korea
South Korea has introduced proposed amendments to its Personal Information Protection Act (PIPA), significantly strengthening breach notification obligations.
The proposal replaces the current requirement to notify “without delay” with a strict 24-hour deadline for notifying both regulators and affected individuals. It also introduces a “reasonable care” standard, meaning organizations may be deemed aware of a breach not only when confirmed, but when it should have been identified through adequate monitoring and diligence.
The changes aim to reduce ambiguity and shift organizations toward proactive detection and rapid response frameworks. If adopted, the amendment would mark a stricter enforcement approach, increasing accountability for delayed breach detection and reporting.
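The interaction of the 24-hour deadline with the "reasonable care" standard can be sketched as a deadline computation: the clock runs from the earlier of when the breach was confirmed and when it should have been detectable through adequate monitoring. This is an illustrative model of the proposal, not statutory text:

```python
from datetime import datetime, timedelta, timezone

# Proposed notification window under the PIPA amendments.
NOTIFICATION_WINDOW = timedelta(hours=24)

def notification_deadline(confirmed_at: datetime,
                          detectable_at: datetime) -> datetime:
    """Deadline for notifying regulators and affected individuals.

    Under the "reasonable care" standard, the trigger is the earlier of
    formal confirmation and the point at which adequate monitoring
    should have identified the breach.
    """
    trigger = min(confirmed_at, detectable_at)
    return trigger + NOTIFICATION_WINDOW

# A breach detectable on March 2 but only confirmed on March 3 is
# measured from the detectable moment, not the confirmation.
detectable = datetime(2026, 3, 2, 9, 0, tzinfo=timezone.utc)
confirmed = datetime(2026, 3, 3, 15, 0, tzinfo=timezone.utc)
assert notification_deadline(confirmed, detectable) == \
    datetime(2026, 3, 3, 9, 0, tzinfo=timezone.utc)
```

The practical consequence for organizations is that a late confirmation no longer resets the clock; detection capability itself becomes part of the compliance posture.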
28. Singapore Digital Economy Agreement with EFTA Enters Into Force
March 1, 2026 Singapore
The EFTA-Singapore Digital Economy Agreement (ESDEA) has entered into force between Singapore and Norway, introducing binding rules to facilitate cross-border digital trade.
The agreement promotes free and secure data flows, limits unjustified data localization requirements, and supports interoperability of digital systems, particularly in sectors such as financial services. Building on the existing free trade framework, the ESDEA aims to reduce regulatory fragmentation and enable smoother digital operations between participating jurisdictions. The agreement remains subject to ratification by Iceland, Liechtenstein, and Switzerland.
This development reflects the continued push toward international alignment on digital trade and cross-border data governance.
WHAT'S NEXT: Key Privacy Developments to Watch For
EU DMA-GDPR Alignment: The European Commission and European Data Protection Board are expected to finalize joint DMA-GDPR guidelines in Q4 2026, following strong stakeholder support for regulatory coherence.
Germany Product Liability Reform: Germany is advancing reforms to classify software and AI as products, introduce post-market liability, and ease evidence access for claimants.
EU Children’s Online Safety Agenda: A Commission-led panel is shaping future rules on age restrictions, safety-by-design, and addictive design, with recommendations due later in 2026.
EU Digital Sovereignty Push: A coalition led by Telefónica is building a federated EU cloud and AI infrastructure to reduce reliance on foreign providers.
Cyber Resilience Act Guidance: The European Commission has opened consultation on draft guidelines (deadline: April 13, 2026).
Adtech & UX Scrutiny (France): The CNIL is consulting on session replay tools, reinforcing consent and data minimization requirements.
Thailand Breach Rules: Thailand’s Personal Data Protection Committee will finalize breach guidelines in April 2026.
U.S. Age Verification & CPPA Rulemaking: States like Kansas and Utah are advancing app store age controls, while the California Privacy Protection Agency is consulting on reducing friction in privacy rights (deadline: April 6, 2026).