California Age-Appropriate Design Code Act (CAADCA): Explained

Published August 28, 2023

1. Introduction

The California Legislature passed the California Age-Appropriate Design Code Act (A.B. 2273) on August 30, 2022. The legislation will compel online platforms to proactively assess the privacy and protection of children in the design of any digital product or service they offer.

The bill, which aims to regulate the collection, processing, storage, and transfer of children's data, is modeled on the United Kingdom's Age Appropriate Design Code (AADC). The California Legislature considers the bill necessary because young people increasingly use digital services for entertainment, education, communication, and other purposes, and are subject to targeted online advertising.

The bill imposes extensive new rules on companies that offer online services, products, or features that are “likely to be accessed” by children, defined under the bill as anyone under 18 years of age. After passing both the California Senate and Assembly, the legislation was signed by the Governor of California in September 2022 and takes effect on July 1, 2024.

2. Who Needs to Comply with the Law

2.1 Material Scope

The law applies to “businesses” as defined by the California Consumer Privacy Act (CCPA). The CCPA defines a business as any for-profit company operating in California that satisfies at least one of the following three thresholds (a rough applicability check is sketched after the list):

  1. Has annual gross revenue of more than $25 million;
  2. Buys, receives for commercial purposes, sells, or otherwise makes available for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices; or
  3. Derives at least 50% of its annual revenue from the sale of personal information.
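
The three thresholds lend themselves to a simple applicability check. The sketch below is a minimal, illustrative Python example, not legal advice: the function name and inputs (annual revenue, the number of consumers, households, or devices whose personal information is handled, and the share of revenue derived from selling personal information) are hypothetical, and the figures simply mirror those quoted above.

  def is_ccpa_business(annual_revenue_usd: float,
                       consumers_households_devices: int,
                       revenue_share_from_selling_pi: float) -> bool:
      """Rough check against the CCPA 'business' thresholds quoted above.

      Inputs are hypothetical; a real determination requires legal review.
      """
      meets_revenue_threshold = annual_revenue_usd > 25_000_000
      meets_volume_threshold = consumers_households_devices >= 50_000
      meets_sale_threshold = revenue_share_from_selling_pi >= 0.50
      # Only one of the three thresholds needs to be met.
      return meets_revenue_threshold or meets_volume_threshold or meets_sale_threshold


  # Example: $10M revenue, personal information on 80,000 households
  print(is_ccpa_business(10_000_000, 80_000, 0.10))  # True (volume threshold met)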

Under the California Age-Appropriate Design Code Act, any “company that provides an online service, product, or feature likely to be accessed by children” must adhere to its requirements. Children are defined as “a consumer or consumers who are under the age of 18.”

The protections under the Act extend to all “children,” defined as consumers under the age of 18, and cover online products and services that are either specifically directed at children or otherwise “likely to be accessed” by children.

The application to users below the age of 18 is significant since the federal Children’s Online Privacy Protection Act of 1998 only applies to users below the age of 13 (and is generally focused on online services directed at children).

The California Consumer Privacy Act of 2018 and the California Privacy Rights Act of 2020 both find and declare that children are particularly vulnerable from a negotiating perspective with respect to their privacy rights, and both stress that it is in the public interest to ensure robust privacy protections for children by design.

Where a conflict arises between a company's commercial interests and the best interests of children, the company must prioritize children's privacy, safety, and well-being.

2.2 Exemptions

The California Age-Appropriate Design Code Act provides a few exemptions. An “online service, product, or feature” does not mean any of the following:

  1. A broadband internet access service, as defined in Section 3100.
  2. A telecommunications service, as defined in Section 153 of Title 47 of the United States Code.
  3. The delivery or use of a physical product.
  4. Health care providers, medical information covered by what are commonly known as the HIPAA rules, and clinical trial information.

3. Definitions of Key Terms

3.1 Child or Children

Unless otherwise specified, a “child” or “children” means a consumer or consumers under 18 years of age.

3.2 Data Protection Impact Assessment

A systematic survey to assess and mitigate risks to children who are reasonably likely to access the online service, product, or feature at issue, where those risks arise from the business's data management practices in providing that online service, product, or feature.

3.3 Default

A preselected option adopted by the business for the online service, product, or feature.

3.4 Likely to be Accessed by Children

“Likely to be accessed by children” means it is reasonable to expect, based on the following indicators, that children would access the online service, product, or feature:

  1. The online service, product, or feature is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.).
  2. The online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children.
  3. An online service, product, or feature with advertisements marketed to children.
  4. An online service, product, or feature that is substantially similar to, or the same as, an online service, product, or feature described in item 2 above (i.e., one routinely accessed by a significant number of children).
  5. An online service, product, or feature with design elements known to be of interest to children, including, but not limited to, games, cartoons, music, and celebrities who appeal to children.
  6. A significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.

3.5 Profiling

Any form of automated processing of personal information that uses personal information to evaluate certain aspects relating to a natural person, including analyzing or predicting aspects concerning a natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.

4. Obligations for Organizations Under the California Age-Appropriate Design Code Act

4.1 Data Protection Impact Assessment

Organizations must conduct a Data Protection Impact Assessment for every online service, product, or feature that is likely to be accessed by children before offering it to the public, and must maintain documentation of the assessment for as long as the online service, product, or feature is likely to be accessed by children.

The Data Protection Impact Assessment should identify the objective of the online service, product, or feature, how it uses children's personal information, and the risks of material harm to children that result from the business's data management practices.

To the extent applicable, the Data Protection Impact Assessment must address all of the following (a sketch of how these factors might be recorded appears at the end of this subsection):

  • Whether the design of the online product, service, or feature could harm children, including by exposing them to harmful or potentially harmful content.
  • Whether the design of the online product, service, or feature could lead to children experiencing or being targeted by harmful or potentially harmful contacts.
  • Whether the design of the online product, service, or feature could permit children to witness, participate in, or be subject to harmful or potentially harmful conduct.
  • Whether the design of the online product, service, or feature could allow children to be party to, or exploited by, a harmful or potentially harmful contact.
  • Whether the online product, service, or feature uses any algorithms that could harm children.
  • Whether the online product, service, or feature's targeted advertising systems could harm children.
  • Whether and how the online product, service, or feature uses system design features, such as automatic media playback, rewards for time spent, and notifications, to increase, sustain, or extend its use by children.
  • Whether, how, and for what purpose the online product, service, or feature collects or processes children's sensitive personal information.

A business must make a Data Protection Impact Assessment available to the Attorney General within five business days of receiving a written request. A Data Protection Impact Assessment disclosed in this way remains confidential and is exempt from public disclosure, notwithstanding any other law, including the California Public Records Act.
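
To make the factors above easier to track, the following is a minimal, hypothetical sketch of how a business might record a Data Protection Impact Assessment internally; the structure and field names are illustrative only and are not prescribed by the Act.

  from dataclasses import dataclass
  from datetime import date

  @dataclass
  class DPIARecord:
      """Illustrative structure mirroring the DPIA factors listed above."""
      service_name: str
      purpose_of_service: str
      assessed_on: date
      # One free-text finding per factor: content, contacts, conduct,
      # exploitation by a contact, algorithms, targeted advertising,
      # engagement-extending design features, sensitive personal information.
      content_harm_risk: str = ""
      harmful_contact_risk: str = ""
      harmful_conduct_risk: str = ""
      exploitation_by_contact_risk: str = ""
      algorithmic_harm_risk: str = ""
      targeted_advertising_risk: str = ""
      engagement_extending_features: str = ""
      sensitive_data_processing: str = ""
      mitigation_plan: str = ""  # timed plan to mitigate or eliminate risks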

4.2 Special Requirements

When developing online features, services, and products, businesses should consider the distinct needs of different age groups, including the following developmental stages (a simple mapping is sketched after the list):

  • 0 to 5 years of age or “preliterate and early literacy”;
  • 6 to 9 years of age or “core primary school years”;
  • 10 to 12 years of age or “transition years”;
  • 13 to 15 years of age or “early teens”; and
  • 16 to 17 years of age or “approaching adulthood.”
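
As a simple illustration, the sketch below maps an estimated age to the developmental stage listed above; the band labels come from the Act, while the function itself is purely hypothetical.

  def developmental_stage(age: int) -> str:
      """Map an estimated age to the developmental stages listed above.

      Purely illustrative; the Act describes the bands but not this function.
      """
      if age <= 5:
          return "preliterate and early literacy"
      if age <= 9:
          return "core primary school years"
      if age <= 12:
          return "transition years"
      if age <= 15:
          return "early teens"
      if age <= 17:
          return "approaching adulthood"
      return "adult"  # outside the Act's definition of a child


  print(developmental_stage(8))   # core primary school years
  print(developmental_stage(16))  # approaching adulthood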

The bill mandates strong privacy protections by design and by default for online services, products, or features that are likely to be accessed by children. This includes disabling, by default, features that profile a child using their past behavior, browsing history, or assumptions about their similarity to other children and then present harmful content.

4.3 Data Privacy Restrictions

An organization that offers an online service, product, or feature that children are likely to access must not do any of the following:

  • Use a child's personal information in any way that the business knows, or has reason to know, is materially detrimental to the child's physical health, mental health, or well-being.
  • Collect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged, unless the business can demonstrate that doing so is in the best interests of children likely to access that online service, product, or feature.
  • Collect a child's precise geolocation information unless strictly necessary to provide the requested service, product, or feature, and then only for the limited time that collecting it is necessary to do so.
  • Use dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected for the online service, product, or feature to work, to forgo privacy protections, or to take any action the business knows, or has reason to know, is materially detrimental to the child's well-being.
  • Use personal information collected to estimate age or an age range for any other purpose, or retain that personal information longer than necessary to estimate age.

4.4 Documentation Requirement

Businesses must document any risk of material harm to children identified in the Data Protection Impact Assessment and create a timed plan to mitigate or eliminate that risk before the online service, product, or feature is accessed by children.

4.5 Age Assurance

The Act requires businesses to estimate the age of child users with a “reasonable” level of certainty appropriate to the risks that arise from their data management practices or to apply privacy and data protections afforded to children to all of their consumers. However, businesses are prohibited from using the personal information collected to estimate age for any other purpose or to retain such information longer than necessary.
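
The either/or nature of this obligation can be summarized in a short sketch. The function below is hypothetical and assumes an age-estimation signal is either available with reasonable certainty or not available at all; it is not an endorsement of any particular age assurance method.

  from typing import Optional

  def apply_child_protections(estimated_age: Optional[int]) -> bool:
      """Should the Act's child protections apply to this user?

      `estimated_age` is None when age cannot be estimated with a
      reasonable level of certainty; in that case the business must
      extend the child protections to all of its consumers.
      """
      if estimated_age is None:
          return True  # fall back to protecting everyone
      return estimated_age < 18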

4.6 Clear, Age-Appropriate Privacy Information

Any privacy information, terms of service, policies, or community standards must be concise, prominently displayed, and written in clear language suited to the age of children likely to access the online service, product, or feature. The business must also enforce any terms, policies, and standards it publishes.

4.7 Data Minimization Requirement

The CAADCA establishes strong data minimization standards, prohibiting the collection, sale, sharing, or retention of personal information that is not required to deliver the product or service.

Children's data may be used only for the purpose for which it was collected. The CAADCA allows personal information to be collected for the sole purpose of estimating a user's age, but prohibits using that information for any other purpose.

4.8 Default Privacy Settings Requirement

When a child accesses digital services, the CAADCA requires covered businesses to set “all default privacy settings offered by the online service, product, or feature to the settings that offer a high level of privacy protection offered by the business.”

Unless the company can prove convincingly that a different option is in children's best interests, all default privacy settings offered to children by the online service, product, or feature should be configured to settings that offer a high level of privacy.
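
As a rough illustration of what “high privacy by default” could look like in practice, the snippet below sketches a hypothetical set of default settings for accounts that may belong to children; the setting names are invented for illustration and are not enumerated by the Act.

  # Hypothetical defaults applied when a user may be a child; the Act requires
  # a "high level of privacy" by default but does not list specific toggles.
  HIGH_PRIVACY_DEFAULTS = {
      "profile_visibility": "private",
      "precise_geolocation": False,       # off unless strictly necessary
      "personalized_advertising": False,  # no profiling-based targeting
      "behavioral_profiling": False,
      "autoplay_media": False,            # engagement-extending feature off
      "push_notifications": False,
      "monitoring_signal": True,          # obvious signal if monitoring occurs
  }

  def apply_child_defaults(account_settings: dict) -> dict:
      """Overlay the high-privacy defaults onto an account's settings."""
      return {**account_settings, **HIGH_PRIVACY_DEFAULTS}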

4.9 Transparency Requirement

Businesses must provide any privacy information, terms of service, policies, and community standards in clear, concise language suited to children of the age group most likely to access the online service, product, or feature.

If the online service, product, or feature allows a parent, legal guardian, or any other consumer to monitor a child's online activity or track their location, the business must give the child an obvious signal when they are being monitored or tracked.

5. Data Subject Rights

The California Age-Appropriate Design Code Act states that children, or where appropriate their parents or guardians, should be given accessible and responsive tools to help them exercise their privacy rights and report concerns.

6. Regulatory Authority

The California Age-Appropriate Design Code Act would task the California Privacy Protection Agency (CPPA) with creating the California Children's Data Protection Working Group and communicating privacy guidelines, standards, and information.

The Working Group would be responsible for adopting regulations by April 1, 2024, and providing compliance guidance. Additionally, the Working Group would be tasked with, among other things, determining:

  • How to better define and assess which online services, products, or features are likely to be accessed by children,
  • Which “age assurance methods” are appropriate for mitigating children's online risks,
  • What language to use in privacy policies and other policies so that it is appropriate for children, and
  • How to assess and minimize children's online risks.

The Act mandates that the Working Group be in place by April 1, 2023, with members selected by the CPPA. Its members would be “Californians with expertise in privacy, physical health, mental health, well-being, technology, and children's rights.” Companies would have three months to comply.

7. Penalties for Non-compliance

The Attorney General would be permitted to bring a civil action against any company that violates the Act's provisions. Violators would be subject to civil penalties of up to $2,500 per affected child for each negligent violation and up to $7,500 per affected child for each intentional violation.
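
To illustrate the potential scale of exposure, here is a minimal worked example using the per-child figures above; the number of affected children is hypothetical.

  NEGLIGENT_PENALTY_PER_CHILD = 2_500    # USD, maximum per affected child
  INTENTIONAL_PENALTY_PER_CHILD = 7_500  # USD, maximum per affected child

  def max_penalty(affected_children: int, intentional: bool) -> int:
      """Upper bound on civil penalties for a single violation."""
      rate = INTENTIONAL_PENALTY_PER_CHILD if intentional else NEGLIGENT_PENALTY_PER_CHILD
      return affected_children * rate

  # Hypothetical example: a negligent violation affecting 10,000 children
  print(max_penalty(10_000, intentional=False))  # 25,000,000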

8. How an Organization Can Operationalize the Law

Organizations that process children's personal data or target children with advertisements must ensure that they comply with the California Age-Appropriate Design Code Act by:

  • Conducting a Data Protection Impact Assessment of how they use children's data before offering the service to the public and before any harm occurs,
  • Keeping data collection to the minimum required for the purpose by addressing data storage risks and complying with data retention policies,
  • Informing children when they are being monitored or tracked, and of the rights they have, through effective and dynamic privacy notices and policies,
  • Ensuring data security through appropriate security measures,
  • Setting all default settings to the most private options,
  • Making it easy for children to report privacy concerns,
  • Living up to published policies and terms and conditions, and
  • Providing all privacy notices in clear language that children can understand.

9. How Securiti Can Help

As countries undergo a profound transition in the digital landscape, automating privacy and security processes for quick action is essential. Organizations must become even more privacy-conscious in their operations and diligent custodians of their customers' data.

Securiti uses the PrivacyOps architecture to provide end-to-end automation for businesses, combining reliability, intelligence, and simplicity. Securiti can assist you in complying with the California Age-Appropriate Design Code Act and other privacy and security regulations worldwide. See how it works.

Request a demo right now.


Frequently Asked Questions (FAQs)

What is the California Kids Code?

The California Kids Code refers to legislation that enhances online privacy and protection for children, particularly regarding their personal information.

What is the California Age Appropriate Design Code Act?

The California Age Appropriate Design Code Act (CAADCA) is a legal initiative that compels online platforms to proactively assess the privacy and protection of children in the design of any digital product or service they offer.

Who must comply with the California Age-Appropriate Design Code Act?

Under the California Age-Appropriate Design Code Act, any “company that provides an online service, product, or feature likely to be accessed by children” must adhere to specific rules and regulations. Children are referred to as “a consumer or consumers who are under the age of 18.”

What is the California Children's Code?

The California Children's Code likely refers to legal provisions or regulations that specifically address the rights, protection, and privacy of children within the state.

Does the Act apply to companies based outside California?

The California Age Appropriate Design Code Act might matter to a social media company based in New York if it offers services to California residents, as it could require compliance with specific design standards for children's privacy protection.
