Don't Let Fearmongering Derail a New Law That Has Real Teeth to Protect Kids’ Privacy

Berkeley Professor Hany Farid writes that the arguments in favor of California's new children's privacy legislation are "clear and uncontroversial."

[Photo: A child plays with an Apple iPad at Apple's headquarters. Amy Osborne (Getty Images)]

For two decades, we have watched as the tech sector rose from quirky start-ups – many of which originated here in the state of California – to global behemoths with the power to entertain, inform, manipulate, addict, and destabilize at an individual, local, and global scale.

Some of those hurt most by tech's worst effects are also the most vulnerable among us. There is no longer any question about the harm to children around the globe, including heightened body-image issues for one in three teenage girls on Instagram, death and injury inspired by TikTok challenges, and the sexualization of children on YouTube.

Leaders have rightly taken notice of the growing mental health crisis among young people. Surgeon General Vivek Murthy has called out social media’s role in the crisis, and, earlier this year, President Biden addressed these concerns in his State of the Union address.

With Congress at an impasse on how to rein in the titans of tech, state legislators have rightfully stepped in. The California legislature passed the Age-Appropriate Design Code (AADC) last week, requiring companies to prioritize the safety and wellbeing of children in the design of online products and services. Legislators refer to it as the "Kids' Code." Gov. Gavin Newsom has yet to sign the bill into law.

The bill passed both the State Assembly and Senate with overwhelming bipartisan support. It's no wonder: nine in ten California voters say they support the measure, and nearly all Americans are clamoring for better protection of their data and their children online.

After decades of measurable and incontrovertible harm caused by the technology in our pockets and homes, it is disheartening to see modest regulation being met with inaccurate fear-mongering.

In a piece published by Capitol Weekly on August 18, for example, Eric Goldman incorrectly claims that the AADC will require mandatory age verification on the internet. The following week, Mike Masnick made the bizarre and unsubstantiated claim in TechDirt that facial scans will be required to navigate to any website.

Goldman and others have made inaccurate claims about the nature of the AADC and its implications. So let's set the record straight.

The bill requires age estimation for sites likely to be accessed by children, and that age estimation must be proportionate to the risk of harm to children. For instance, a dating app will need to be confident that users on its site are adults, because the risk to children is much higher than the risk posed by accessing, for example, this news outlet's site.

Age estimation can be done in a multitude of ways that are not invasive. In fact, businesses have been using age estimation for years – not to keep children safe, but for targeted marketing. The AADC will ensure that age-estimation practices are the least invasive possible, will require that any personal information collected for the purposes of age estimation is not used for any other purpose, and, contrary to Goldman's claim that age-authentication processes are generally privacy invasive, will require that any collected information is deleted after its intended use.
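To make that data-minimization principle concrete, here is a minimal, purely illustrative sketch in Python; it is not language from the bill, nor how any particular company implements age assurance. The names (RiskLevel, AgeSignal, estimate_age_band) and the specific signals are hypothetical. The point is only that the strength of the age check scales with the risk of the service, and that whatever is collected is used for nothing else and deleted once the estimate is made.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class RiskLevel(Enum):
    LOW = 1   # e.g., a general-interest news site
    HIGH = 2  # e.g., a dating app, where access by minors carries real risk


@dataclass
class AgeSignal:
    """Ephemeral data collected solely to estimate an age band."""
    payload: dict = field(default_factory=dict)

    def wipe(self) -> None:
        # Delete the collected information once it has served its purpose.
        self.payload.clear()


def estimate_age_band(signal: AgeSignal, risk: RiskLevel) -> str:
    """Return a coarse age band; higher-risk services demand higher confidence."""
    try:
        if risk is RiskLevel.LOW:
            # Low-risk service: a self-declared birth year can be enough.
            year = int(signal.payload.get("self_declared_birth_year", 0))
            return "adult" if year and (date.today().year - year) >= 18 else "minor"
        # High-risk service: demand a stronger (but still minimal) signal.
        verified = bool(signal.payload.get("verified_adult_credential", False))
        return "adult" if verified else "minor"
    finally:
        # Whatever was collected is used for nothing else and deleted after use.
        signal.wipe()


# Example: a high-risk service cannot treat a bare self-declared birth year as sufficient.
print(estimate_age_band(AgeSignal({"self_declared_birth_year": "1990"}), RiskLevel.HIGH))  # "minor"
```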

Goldman also claims – without any substantiation – that these regulations will force online businesses to close their doors to children altogether. This argument is, at best, disingenuous and, at worst, fear-mongering. The bill comes after negotiations with diverse stakeholders to ensure it is practically feasible and effective. None of the hundreds of California businesses engaged in those negotiations has said it fears having to close its doors. Where companies are not engaging in risky practices, the risks are minimal. The bill also includes a "right to cure" for businesses in substantial compliance with its provisions, thereby limiting liability for those seeking in good faith to protect children on their services.

We should celebrate the fact that California is home to the giants of the technology sector. This success, however, also comes with the responsibility to ensure that California-based companies act as responsible global citizens. The arguments in favor of AADC are clear and uncontroversial: we have a responsibility to keep our youngest citizens safe. Hyperbolic and alarmist claims to the contrary are simply unfounded and unhelpful.

Hany Farid is a professor at the University of California, Berkeley, specializing in digital image analysis. He was part of the Microsoft-led team that pioneered PhotoDNA, a robust-hashing technology used globally to stop the spread of child sexual abuse imagery. An adviser to the Cyber Civil Rights Initiative and the Counter Extremism Project, Farid is also a member of the TikTok Content Advisory Council.
