A state takes the lead in protecting U.S. kids online

A new California code requires social media platforms to proactively mitigate risks to young people when designing solutions, rather than leaving users to figure it out on their own.


You know your endless Twitter or Instagram scrolling isn't your fault, right? At least not entirely.

There are forces at work that make scrolling difficult to resist, put in place by the many behavioral scientists who work at Meta, Twitter, Snapchat, LinkedIn and other social media companies.

Thankfully, you're an adult with lots of experience resisting temptation, so you can tell yourself to shut the app down. But imagine if your brain were still developing, as most students' brains are.

Who's keeping platforms in check?

During my time with Cybersecurity for Democracy, I listened to co-director Laura Edelson testify on how to regulate social media platforms so we can better avoid harms like disinformation and extremist content. The hearings were held by the UK and EU parliaments – both ahead of the U.S. in developing standards – and by U.S. Congressional committees, where I saw unusual bipartisan support for change, especially on behalf of kids.

Guess who actually managed to push social media regulation through a U.S. legislative body, though?

Two members of the California state assembly.

The California Age-Appropriate Design Code

CAADC website, https://californiaaadc.com/

This week the California Age-Appropriate Design Code passed the state Senate by unanimous vote and is headed to the governor's desk. Authored by Buffy Wicks (D) and Jordan Cunningham (R), CAADC says that tech firms must proactively mitigate risks to young people when designing their solutions, rather than leaving youth, parents, schools and vendors to figure out how to do so on their own.

A bipartisan team. A unanimous vote. Who knew that was still possible in the U.S.?

CAADC sets standards for websites and apps that are "likely to be accessed" by kids. It's aimed at social media, though it will apparently affect other kinds of sites as well.

It requires that platforms take steps like:

  • Making high-privacy settings the default, to limit the collection and use of data about users under 18.
  • Turning off geolocation as default, to avoid sites targeting young people with content or ads based on where they live or the places they frequent.
  • Prohibiting the use of "nudge techniques" – behavioral science-informed features that get us to open the app, tap a post or keep scrolling – for users under 18.

Finally, some help.

How will it work?

Social media platforms and their advertisers know your home state, but the content itself knows no borders. So how will it work if one state imposes standards and others don't?

That's the million-dollar question.

The inspiration

As home to the big tech firms, California seems like an appropriate place to keep them in check. But CAADC was inspired by, and named after, legislation from the UK – where the code's grace period expired a year ago and enforcement is now underway.

The UK Age-Appropriate Design Code is credited with pushing tech firms to make privacy changes like YouTube turning off auto-play for users under 18 – a move recommended by Guillaume Chaslot, the former YouTube engineer who runs AlgoTransparency.org, to deter online radicalization. Read more about the UK code's influence in this Axios piece. ⬇️

Spooked by new laws, tech doubles down on kids’ privacy
A new U.K. regulation is prompting big online platforms to tighten their rules protecting young users.

The opposition

Industry association TechNet aptly questioned how to determine whether a site is "likely to be accessed" by children; another opponent is the CA Chamber of Commerce. Others worry that websites of all kinds will race to set up systems to verify users' ages, wreaking havoc and causing headaches for consumers, who are already tired of being constantly asked to accept cookies. ⬇️

Sweeping Children’s Online Safety Bill Is Passed in California
The new rules, which would require many online services to increase protections for children, could change how popular social media and game platforms treat minors.

Still others say a state-by-state approach to tech regulation doesn't work – something I've pointed out myself given what's happening with abortion access. Continuing to fracture our culture across state lines just isn't sustainable.

What do young people think?

Of the teens in my household, one will defend social media to the hilt and the other acknowledges pros and cons, like I do. I was pleased to see that young people played a key role in the CAADC effort, as active members of the coalition advocating for the bill. Justin Hendrix gave them the mic in his Tech Policy Press podcast and newsletter; check it out and file their names for future reference. I bet we'll see #DesignItForUs again.

What's next?

Some possibilities:

  • Other states may follow suit. After California passed its Consumer Privacy Act in 2018, states like Colorado, Connecticut, Virginia and Nevada did the same.
  • Federal action could ensue. Senator Klobuchar's proposal from last February would launch a study by the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine on content-agnostic approaches – that is, methods other than content moderation that help users avoid manipulative posts. The proposal has been endorsed by Cybersecurity for Democracy's Edelson and is one to watch.
  • Platforms may be pressed to put additional protections in place. Advocates have already published a long list of reforms they say were prompted by the CAADC debate.

What can you do?

Tracking this legislation isn't just for wonks; it affects all social media users. While we wait for provisions to take hold, there are simple things every parent or individual user can and should do:

  • Turn off auto-play where you can. Here's a platform-by-platform guide – note that Instagram, where I find auto-play most annoying, doesn't let you turn it off.
  • Disable push notifications and emails from social media platforms. On iPhone, go to Settings > Notifications and turn notifications off for each platform. In the Facebook app, go to Settings & Privacy > Settings > Notification settings; you can mute ALL push notifications with the toggle at the top. Other platforms are similar. If you have kids, urge them to do the same.
  • Sign up for a Truth in Common workshop or info session to learn more about how to avoid false and harmful content. For more info, visit truthincommon.org.

Let me know what you think!