
Leonard Navarro

Vice President of Business Development, Nametag

California Passes Internet Age-Appropriate Design Code Act (ADCA)

The concern over the online safety of minors continues to grow. With the increased use of technology during the pandemic, driven by distance learning among other factors, children face a growing risk of exposure to inappropriate content, cyberbullying, and online predators. The federal government enacted the Children’s Online Privacy Protection Act (COPPA) to help protect kids under 13 by requiring parental consent before their personal information is collected, but it’s clear that California lawmakers don’t believe it’s enough anymore.

According to industry expert Leonard Navarro, Vice President at Nametag, “The biggest news in the identity and authentication space this week seems to be the passing of California’s Age-Appropriate Design Code Act. What this means is that organizations, particularly internet sites and applications, will have to install some form of guardrails for users under 18.”

Navarro clearly understands the significance of this: Nametag is devoted to giving people control over how their personal information is shared, which helps them build more trusted relationships with the internet sites they use.

New Requirements and Restrictions Effective July 1, 2024

Modeled after the Children’s Code in the United Kingdom, the ADCA (also known as the California Design Code) aims to make internet sites more accountable for protecting minors from internet dangers. The law is specific and comprehensive about which businesses are affected, what they are required to do, and what they are prohibited from doing (JD Supra).

Business Definition:

A covered business is one that meets at least one of the following criteria:

  1. Generates $25M or more in annual revenue,
  2. Buys, sells, or shares the personal information of 50K+ consumers for commercial purposes, or
  3. Receives 50% or more of its revenue from selling personal information.

Products and Services Covered:

  1. A website or online service entirely or partially aimed at children,
  2. A website or online service routinely accessed by children,
  3. A product or service similar to one regularly accessed by children, or
  4. Advertisements marketed toward children.

The comprehensive law spells out what is and is not allowed for the companies described above. In summary, it aims to protect minors’ personal data and to keep them from being preyed upon by the internet sites they visit.

Some of the requirements include websites having to default to the highest privacy settings and notifying minors when they are being monitored. “It also would prohibit the use of so-called dark patterns — essentially design tricks made to steer users toward a specific choice — that would encourage minors to give away personal information that would not be necessary to provide the service” (CNBC).

What Will the Internet Age-Appropriate Design Code Act (ADCA) Accomplish?

“These laws take away some of the bite these companies are able to complete by basically tracking and storing information on their users who are minors,” explained Navarro. “Moreover, for users over 18, it’s likely that people are going to have to take additional steps to verify their age. This is being put in place while [users are suffering] lots of abuse, particularly from video game companies who are preying upon the data provided by minors.”

Additionally, the rules could curb online features designed to keep children glued to screens, supporting better mental health and reducing screen addiction. Increased privacy controls could keep children from being contacted by strangers on messaging platforms. These measures may seem to come out of nowhere, but the foundation for change was laid in 2021.

In 2021, both Instagram and Facebook revealed information illustrating the harm social media practices cause children. Instagram’s algorithms surfaced graphic self-harm images to teenage girls and promoted eating disorder content.

Frances Haugen, former Facebook product manager, testified before a Senate panel that the company “puts its own profits over users’ health and safety, which is largely a result of its algorithms’ design that steers users toward high-engagement posts that in some cases can be more harmful.” She gave the example of algorithms sending “young users from something relatively innocuous such as healthy recipes to content promoting anorexia in a short period of time.” Haugen noted during her testimony that she believed lawmakers needed to step in to keep young users safe.

California heard the cry and is setting the tone for what may be the future – a safer internet experience for children. It’s a proactive approach to protecting children rather than the current landscape of being reactive to negative outcomes.

“The success of the California statute should determine if other states also follow suit. But as we’ve seen more and more throughout the media, big social media sites and gaming sites are stumbling and are already being fined for what they’ve already done and collected on minors,” noted Navarro.
