Introduction
The Australian government is taking a significant step to enhance online safety for children through the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Bill). The Bill introduces new obligations for social media platforms to enforce age restrictions, prevent children under 16 from creating accounts, and protect the privacy of personal data collected during age verification. This article provides an overview of the impact the Bill may have on social media providers, users, and regulators once it is signed into law.
The Bill applies to platforms designed for online social interaction between users, particularly those enabling:
- Sharing material for social purposes (excluding business use).
- Interactions between users, such as linking accounts or posting content.
Platforms that meet these criteria are considered “age-restricted.” A social media platform may also be explicitly designated as age-restricted under legislative rules. In practice, the Bill captures popular services such as Instagram and TikTok, as both enable users to share content for social purposes, interact with other users, and post material.
However, platforms that do not allow access to Australian users, such as region-specific social media services like VK (VKontakte), could be exempt from these requirements. Additionally, services excluded by legislative rules or designed for business purposes (such as LinkedIn) are not considered age-restricted under the Bill.
Platforms classified as age-restricted must take reasonable steps to prevent children under the age of 16 from creating accounts.
1. Protect Data Collected For Age Verification
To comply with the Bill, platforms must:
- Implement age assurance measures to verify users' ages, for example:
  - Age verification through official documents (e.g., submitting a government-issued ID).
  - Third-party age verification services that cross-check user data against trusted databases (see the sketch after this list).
- Protect the personal data collected for age verification, ensuring it is used solely for this purpose and not further processed for any other reason.
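A minimal sketch of the third-party verification pattern, in TypeScript. The verifier endpoint, request fields, and response shape are assumptions for illustration; the Bill does not prescribe any particular service or API.

```typescript
// Hypothetical request/response shapes for a third-party age check.
// None of these names come from the Bill; they are illustrative only.
interface AgeCheckRequest {
  documentType: "passport" | "drivers_licence";
  documentNumber: string;
  dateOfBirth: string; // ISO 8601 date, e.g. "2010-04-02"
}

interface AgeCheckResult {
  verified: boolean;  // the document checked out against a trusted database
  ageAtCheck: number; // age as computed by the verifier
}

const MINIMUM_AGE = 16;

// Submits the user's details to a hypothetical verifier and returns only
// a boolean, so the signup flow never retains the underlying documents.
async function meetsMinimumAge(request: AgeCheckRequest): Promise<boolean> {
  const response = await fetch("https://verifier.example.com/v1/age-check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });
  if (!response.ok) {
    throw new Error(`Age check failed with status ${response.status}`);
  }
  const result = (await response.json()) as AgeCheckResult;
  return result.verified && result.ageAtCheck >= MINIMUM_AGE;
}
```

Returning only the boolean outcome keeps the signup path from accumulating copies of identity documents, which simplifies the purpose-limitation and destruction obligations discussed below.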
Platforms holding personal information for age assurance must:
- Only use or disclose the data for verifying age, or as otherwise permitted under the Australian Privacy Principles (APPs); one way to enforce this purpose limitation is sketched below.
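One way to make that restriction concrete is to force every access to declare a purpose and reject anything outside the permitted set. A minimal sketch, assuming an in-memory store; the purpose names are illustrative, and what counts as an APP-permitted exception is a legal question this code does not model.

```typescript
// Purposes a caller might declare; only the first two are permitted.
type Purpose =
  | "age_verification"
  | "app_permitted_exception"
  | "analytics"
  | "marketing";

const PERMITTED_PURPOSES: ReadonlySet<Purpose> = new Set([
  "age_verification",
  "app_permitted_exception",
]);

interface VerificationRecord {
  userId: string;
  collectedAt: Date;
  documentRef: string; // pointer to the collected ID material
}

// Every read must declare why it is happening; anything outside the
// permitted set is rejected before the record is touched.
function readVerificationRecord(
  store: Map<string, VerificationRecord>,
  userId: string,
  purpose: Purpose,
): VerificationRecord | undefined {
  if (!PERMITTED_PURPOSES.has(purpose)) {
    throw new Error(`Use or disclosure for "${purpose}" is not permitted`);
  }
  return store.get(userId);
}
```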
How Securiti Can Help:
Securiti’s Sensitive Data Intelligence module detects hidden data in cloud platforms, classifies sensitive information, builds a searchable catalog, links personal data to owners for reporting, and highlights data risk areas.
2. Comply with Consent Obligation
To comply with the Bill, platforms must:
- Obtain voluntary, informed, specific, and withdrawable consent before collecting age verification data, and use that data only for age verification unless additional consent is obtained (a consent-ledger sketch follows this list).
- Apply the same consent requirements to other personal data, such as a user's address, obtaining consent for the specific purpose for which the data is collected, in line with the Australian Privacy Act 1988.
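A consent ledger that records purpose-bound grants and supports withdrawal is one way to meet these requirements. A minimal sketch; the class and field names are assumptions, not terms from the Bill.

```typescript
// One consent grant for one purpose; withdrawal is recorded rather than
// deleted, so the history remains auditable.
interface ConsentRecord {
  userId: string;
  purpose: string;    // e.g. "age_verification"
  grantedAt: Date;
  withdrawnAt?: Date; // set when the user withdraws consent
}

class ConsentLedger {
  private records: ConsentRecord[] = [];

  grant(userId: string, purpose: string): void {
    this.records.push({ userId, purpose, grantedAt: new Date() });
  }

  withdraw(userId: string, purpose: string): void {
    for (const record of this.records) {
      if (record.userId === userId && record.purpose === purpose && !record.withdrawnAt) {
        record.withdrawnAt = new Date();
      }
    }
  }

  // Data may be processed for a purpose only while an unwithdrawn grant
  // exists; a new purpose requires a new, specific grant.
  hasConsent(userId: string, purpose: string): boolean {
    return this.records.some(
      (record) => record.userId === userId && record.purpose === purpose && !record.withdrawnAt,
    );
  }
}
```

For example, a grant for "age_verification" would not satisfy a `hasConsent` check for "marketing"; that use would need its own grant.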
How Securiti Can Help:
Securiti’s Consent Management automates the capture and handling of first-party and third-party consent, enabling organizations to obtain, record, track, and manage individuals' explicit consent.
3. Comply with Data Retention and Destruction Obligation
To comply with the Bill, platforms must:
- Destroy personal information collected for age verification once it has served its purpose. Failure to destroy such information is deemed an interference with privacy under the Privacy Act 1988, making it subject to complaints (see the sketch after this list).
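A minimal destroy-after-use sketch, assuming verification material is held only while the age check is in flight; the store and function names are illustrative, not prescribed by the Bill.

```typescript
// Verification material held only until the check's outcome is recorded.
interface PendingVerification {
  userId: string;
  documentRef: string; // pointer to the collected ID material
}

const pendingVerifications = new Map<string, PendingVerification>();

// Records the outcome of the check, then immediately destroys the
// underlying verification data, since retaining it past its intended
// use is treated as an interference with privacy under the Privacy Act.
function completeAgeCheck(userId: string, passed: boolean): void {
  console.log(`User ${userId} age check ${passed ? "passed" : "failed"}`);
  pendingVerifications.delete(userId);
}
```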
How Securiti Can Help:
Securiti’s Redundant, Obsolete and Trivial (ROT) Data Minimization solution uses AI to identify and remove unnecessary data, reducing storage costs and ensuring compliance with retention policies. It gives organizations granular insight into the security posture of data assets across on-premises, IaaS, SaaS, and data cloud environments.
Enforcement and Penalties
The Bill grants significant powers to the eSafety Commissioner, Australia’s independent regulator for online safety, to monitor compliance, enforce the rules, and penalize breaches:
The Commissioner can request information from:
- Age-restricted social media platforms to assess compliance.
- Providers of electronic services, to determine whether their platforms fall within the scope of the rules.
The maximum penalty for non-compliance is 30,000 penalty units, rising to 150,000 penalty units for corporations; at the current penalty unit value of AUD 330, these equate to approximately $9.9 million and $49.5 million respectively. If a platform is found non-compliant under the new Bill, the eSafety Commissioner may:
- Issue a statement of non-compliance to the provider.
- Publicly name non-compliant platforms on the Commissioner’s website.
Implementation Timeline and Review
The Bill provides a 12-month implementation period for platforms within its scope to comply with the new age restriction obligations. These rules will apply to both new and existing accounts.
To ensure their effectiveness, a mandatory independent review will be conducted within two years of implementation, assessing the rules' impact and recommending any necessary adjustments.
Going forward
The Bill raises the bar for safeguarding children online. To avoid substantial penalties, social media platforms must prioritize age assurance systems, data protection, and compliance.
By implementing protective systems and aligning them with regulations, platforms can foster safer online environments for users while adhering to Australia’s strict privacy and safety standards.