Introduction
The Australian government is taking a significant step to enhance online safety for children through the Online Safety Amendment (Social Media Minimum Age) Act 2024 (Act). This law introduces new obligations for social media platforms to enforce age restrictions, prevent children under 16 from creating accounts, and protect the privacy of personal data collected during age verification.
The Office of the Australian Information Commissioner (OAIC) has also released new Privacy Guidance on the Social Media Minimum Age (SMMA), which imposes additional privacy and accountability obligations on platforms and third-party age assurance providers. The guidance emphasizes transparency, proportionality in data collection, and enhanced privacy safeguards.
This article provides an overview of the impact that the Act and the related guidance may have on social media providers, users, and regulators once the new obligations take effect.
Which Platforms Does the Act Apply To?
The Act applies to platforms designed for online social interaction between users, particularly those enabling:
- Sharing material for social purposes (excluding business use).
- Interactions between users, such as linking accounts or posting content.
If social media platforms meet the above criteria, they are considered “age-restricted.” A platform may also be explicitly designated as “age-restricted” under legislative rules. In practice, the Act captures popular services such as Instagram and TikTok, as both enable users to share content for social purposes, interact with others, and post material.
However, platforms that do not allow access to Australian users, such as region-specific social media services like VK (VKontakte), could be exempt from these requirements. Additionally, services excluded by legislative rules or designed for business purposes (such as LinkedIn) are not considered age-restricted under the Act.
1. Take Reasonable Steps to Prevent Under-16 Accounts
Platforms classified as age-restricted must take reasonable steps to prevent children under the age of 16 from creating accounts.
To explain what this means in practice, the eSafety Commissioner released the Social Media Minimum Age Regulatory Guidance in September 2025. The guidance states that "reasonable steps" must be informed by principles that include being:
- Reliable, accurate, robust, and effective.
- Privacy-preserving and data-minimising.
- Transparent.
- Proportionate.
Platforms are expected to implement layered and risk-based age assurance measures, such as age verification, estimation, or inference, while avoiding over-reliance on self-declaration. They are also required to remove existing accounts held by users under 16, prevent circumvention, and establish fair review mechanisms. Importantly, the guidance emphasizes that compliance extends beyond written policies to include demonstrable implementation, continuous monitoring, and improvement, supported by proper record-keeping and ongoing engagement with eSafety.
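To make the layering concrete, the sketch below shows one way a platform might combine signals of increasing strength and escalate only when weaker signals are inconclusive. It is a minimal illustration: the signal names, thresholds, and escalation order are assumptions made for this example, not requirements drawn from the Act or the eSafety guidance.

```python
# Hypothetical sketch of a layered, risk-based age assurance check.
# Signal names and thresholds are illustrative only.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Decision(Enum):
    ALLOW = "allow"        # confident the user is 16 or older
    BLOCK = "block"        # confident the user is under 16
    ESCALATE = "escalate"  # uncertain: request a stronger signal


@dataclass
class AgeSignals:
    self_declared_age: Optional[int] = None      # weakest signal; never sufficient alone
    estimated_age: Optional[float] = None        # e.g., a facial age estimation result
    estimation_confidence: float = 0.0           # 0.0 - 1.0
    document_verified_age: Optional[int] = None  # strongest signal, collected last


def assess_age(signals: AgeSignals, min_age: int = 16) -> Decision:
    """Apply signals from strongest to weakest, escalating only when needed."""
    # Layer 1: a verified document, if already provided, is decisive.
    if signals.document_verified_age is not None:
        return Decision.ALLOW if signals.document_verified_age >= min_age else Decision.BLOCK

    # Layer 2: age estimation, with a confidence buffer around the threshold.
    if signals.estimated_age is not None and signals.estimation_confidence >= 0.9:
        if signals.estimated_age >= min_age + 2:
            return Decision.ALLOW
        if signals.estimated_age < min_age - 2:
            return Decision.BLOCK

    # Layer 3: self-declaration alone is never enough to allow an account.
    return Decision.ESCALATE


print(assess_age(AgeSignals(self_declared_age=21)))                              # Decision.ESCALATE
print(assess_age(AgeSignals(estimated_age=25.0, estimation_confidence=0.95)))    # Decision.ALLOW
```

In this sketch, self-declaration alone can never open an account, stronger checks are requested only when cheaper signals are inconclusive (proportionality), and a verified document is decisive, mirroring the layered, risk-based approach the guidance describes.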
How Securiti Can Help
Securiti helps align age assurance practices with eSafety's principles by supporting data minimization and proportionality through automated data classification and destruction. Securiti’s Consent Management provides the transparency and notice capabilities the OAIC guidance expects, along with the auditability needed to demonstrate that measures are reliable and robust.
2. Comply with Age Verification Measures
To comply with the Act, platforms must:
- Implement age assurance measures to verify users' ages, such as:
  - Age verification through official documents (e.g., submitting a government-issued ID).
  - Third-party age verification services that cross-check user data against trusted databases.
- Protect the personal data collected for age verification, ensuring it is used solely for this purpose and not further processed for any other reason.
- Be transparent, using just-in-time notices to explain what is collected, why, how long it is retained, and what choices the user has.
Platforms holding personal information for age assurance must:
- Only use or disclose the data for verifying age or as otherwise permitted under the Australian Privacy Principles (APPs); a minimal sketch of this purpose limitation appears below.
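As one illustration of purpose limitation in practice, the hypothetical sketch below gates every access to stored age assurance data on a declared purpose and records each request for auditing. The store, the purpose names, and the log format are assumptions made for the example, not a prescribed design.

```python
# Illustrative sketch of purpose-limited access to age assurance data.
from datetime import datetime, timezone

ALLOWED_PURPOSES = {"age_verification"}  # any APP-permitted uses would be added explicitly


class PurposeLimitedStore:
    def __init__(self) -> None:
        self._records: dict[str, dict] = {}
        self.audit_log: list[dict] = []

    def put(self, user_id: str, record: dict) -> None:
        self._records[user_id] = record

    def get(self, user_id: str, purpose: str) -> dict:
        # Every access is checked against the allowed purposes and logged.
        allowed = purpose in ALLOWED_PURPOSES
        self.audit_log.append({
            "user_id": user_id,
            "purpose": purpose,
            "allowed": allowed,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        if not allowed:
            raise PermissionError(f"'{purpose}' is not a permitted use of age assurance data")
        return self._records[user_id]


store = PurposeLimitedStore()
store.put("user-123", {"document_type": "passport", "verified_age": 17})
store.get("user-123", "age_verification")   # permitted
# store.get("user-123", "ad_targeting")     # would raise PermissionError and be logged
```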
How Securiti Can Help
Securiti’s Sensitive Data Intelligence module detects hidden data in cloud platforms, classifies sensitive information, builds a searchable catalog, links personal data to owners for reporting, and highlights data risk areas.
3. Comply with Consent Obligation
To comply with the Act, platforms must:
- Obtain voluntary, informed, specific, and withdrawable consent before collecting age verification data; that data can only be used for age verification unless additional consent is obtained (see the sketch after this list).
- Write the consent request so users can understand what they are agreeing to.
- Ensure consent is unambiguous; it cannot be sought through pre-selected settings or opt-outs.
- Keep secondary uses and disclosures of the data strictly optional and easy to withdraw.
- Apply the same consent requirements to other personal data, such as a user's address, obtaining consent for the specific purpose for which the data is collected, in line with the Privacy Act 1988.
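The hypothetical sketch below shows what a minimal consent record for age verification data might capture so that consent is affirmative, purpose-specific, and withdrawable. The field and method names are illustrative only, not drawn from the Act or the OAIC guidance.

```python
# Hypothetical consent record for age verification data; field names are illustrative.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    user_id: str
    purpose: str            # e.g. "age_verification"; one record per specific purpose
    notice_text: str        # what the user was actually shown when consenting
    given_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None

    def grant(self, affirmative_action: bool) -> None:
        # Consent must come from an unambiguous, affirmative act,
        # never from a pre-selected setting or an opt-out default.
        if not affirmative_action:
            raise ValueError("consent requires an explicit affirmative action")
        self.given_at = datetime.now(timezone.utc)
        self.withdrawn_at = None

    def withdraw(self) -> None:
        # Withdrawal should be as easy as giving consent.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def is_active(self) -> bool:
        return self.given_at is not None and self.withdrawn_at is None


consent = ConsentRecord(
    user_id="user-123",
    purpose="age_verification",
    notice_text="We use your ID only to confirm you are 16 or older; it is then destroyed.",
)
consent.grant(affirmative_action=True)
assert consent.is_active
consent.withdraw()
assert not consent.is_active
```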
How Securiti Can Help
Securiti’s Consent Management automates the capture and tracking of first-party and third-party consent, enabling organizations to obtain, record, track, and manage individuals' explicit consent.
4. Comply with Data Retention and Destruction Obligation
To comply with the Act, platforms must:
- Destroy personal information collected for age verification once it has served its intended use; de-identification is not an acceptable alternative, the information must be destroyed.
- Destroy age assurance inputs (e.g., document images, selfies, biometric information) immediately once the purpose of age assurance has been met (a minimal sketch follows this list).
- Note that failure to destroy such information is deemed an interference with privacy under the Privacy Act 1988, making it subject to complaints.
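A minimal sketch of what immediate destruction could look like appears below, assuming for illustration that age assurance inputs are held as files in a per-user directory; the storage layout, function name, and audit format are hypothetical.

```python
# Minimal sketch: destroy age assurance inputs once verification completes,
# keeping only an audit record of the destruction (not the inputs themselves).
from datetime import datetime, timezone
from pathlib import Path
import tempfile


def destroy_age_assurance_inputs(user_dir: Path, audit_log: list[dict]) -> None:
    """Delete document images, selfies, and similar inputs, then record the destruction."""
    destroyed = []
    for artifact in list(user_dir.iterdir()):
        artifact.unlink()              # irreversibly remove the file
        destroyed.append(artifact.name)
    user_dir.rmdir()                   # remove the now-empty per-user directory
    audit_log.append({
        "event": "age_assurance_inputs_destroyed",
        "files": destroyed,
        "at": datetime.now(timezone.utc).isoformat(),
    })


# Example: called immediately after the age check completes, so only the
# yes/no outcome is retained rather than the inputs used to reach it.
audit: list[dict] = []
with tempfile.TemporaryDirectory() as tmp:
    user_dir = Path(tmp) / "user-123"
    user_dir.mkdir()
    (user_dir / "id_front.jpg").write_bytes(b"\x00")  # stand-in for an uploaded document image
    destroy_age_assurance_inputs(user_dir, audit)
print(audit)
```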
How Securiti Can Help
Securiti’s Redundant, Obsolete and Trivial (ROT) Data Minimization solution uses AI to identify and remove unnecessary data, reducing storage costs and ensuring compliance with retention policies. It enables organizations to leverage granular insights and assess the security posture of data assets across on-premises, IaaS, SaaS, and data cloud environments.
Enforcement and Penalties
The Act grants significant powers to the eSafety Commissioner, Australia’s independent regulator for online safety, to monitor compliance, enforce the rules, and penalize breaches:
The Commissioner can request information from:
- Age-restricted social media platforms to assess compliance.
- Providers of electronic services, to determine whether their platforms fall within the scope of the rules.
The maximum penalty is 30,000 penalty units, or $9.9 million; for corporations, it rises to 150,000 penalty units, or $49.5 million. If a platform is found non-compliant under the Act, the eSafety Commissioner may:
- Issue a statement of non-compliance to the provider.
- Publicly name and shame non-compliant platforms on the Commissioner’s website.
Implementation Timeline and Review
The Act provides a 12-month implementation period for platforms within its scope to comply with the new age restriction obligations. These rules apply to both new and existing accounts.
To ensure the rules' effectiveness, a mandatory independent review will be conducted within two years of implementation. The review will assess the rules' impact and recommend any necessary adjustments.
Going Forward
The Act raises the bar for safeguarding children online. To avoid substantial penalties, social media platforms must prioritize age assurance systems, data protection, and compliance.
By implementing protective systems and aligning them with regulations, platforms can foster safer online environments for users while adhering to Australia’s strict privacy and safety standards.