President Biden began the year calling on Congress to pass stronger digital privacy protections for minors. Now, with just two weeks left in 2022, there is talk that the Senate is looking to fulfill that request by adding two major children’s online privacy and safety bills to the potential omnibus spending package.
Protecting young people online is a difficult task: both digital privacy and content moderation have proven challenging in legislative, constitutional, and technical contexts. Folding these bills into the omnibus spending package would not give them the deliberation and scrutiny they deserve. A better course for Congress would be to pass universal privacy protections first and then augment those protections with youth privacy and online safety rules.
Though the bills have different sponsors and come from different committees, together they address the two most important areas of online safety for children: privacy and child-appropriate content. The first, the Children and Teens’ Online Privacy Protection Act, would protect the online privacy of anyone under 16 years old. The second, the Kids Online Safety Act (KOSA), is a youth content moderation bill that seeks to keep young people from seeing harmful content on the websites they access.
The Children and Teens’ Online Privacy Protection Act is an update to the current children’s privacy law, the 1998 Children’s Online Privacy Protection Act (COPPA). The current rule is straightforward: it applies to companies that knowingly handle the data of children under 13 years old. To comply with COPPA, firms must obtain consent from parents before collecting children’s data and “implement reasonable procedures to protect the security” of that data. The Children and Teens’ Online Privacy Protection Act updates these rules to account for how the digital landscape has changed over the past 25 years.
First, the new bill would raise the age threshold for parental consent from 13 to 16. It also limits the use of children’s personal data and adds privacy and security requirements for handling that data. The bill makes it illegal to show targeted ads to minors, adds a digital marketing bill of rights to limit data collection from minors, and requires platforms to create privacy dashboards that let parents see how sites use youth data.
As PPI has previously written, COPPA has always been difficult to enforce because there is no reliable way to verify whether a website user is under 13 years old. The new legislation, while providing pragmatic and relevant updates to the original text, risks not only reproducing that challenge but magnifying it, because the Children and Teens’ Online Privacy Protection Act applies to more young people. Instead, we advocate for universal privacy protections for all Americans that can be updated, as needed, for children.
The other bill under consideration is the Kids Online Safety Act. It follows California and the United Kingdom, both of which have enacted youth content moderation protections. The bill applies to any online service with child users and can be broken down into three areas. First, it imposes a blanket duty of care on platforms to act in the best interest of minors using the site, in part by demoting content that could harm them. The bill defines harmful content as content that promotes eating disorders, self-harm, bullying, the sexual exploitation of youth, or alcohol or addiction. Second, it requires safeguards, or “parental controls,” on youth accounts; these controls would provide additional privacy, data, and content-ranking settings. Finally, it requires companies to conduct audits of their online safety risks and practices, gives the Federal Trade Commission enforcement powers, and establishes a Kids Online Safety Council to advise on the Act’s implementation.
This bill gets a few things right. First, it defines a specific set of harmful content, making it easier for companies to know what is and is not covered. The requirement that platforms add parental-control settings to individual accounts is also valuable, creating an easy opt-in tool for parents to better manage their children’s accounts.
The challenge lies in the duty of care mandate. In theory, as long as platforms can demonstrate in their audits that they are taking appropriate steps to keep minors from harmful content, they will be safe from lawsuits. In practice, however, content moderation is hard. The First Amendment protects freedom of speech with minimal exceptions, giving private companies broad latitude to decide what content appears on their platforms. And many companies already go above and beyond in shielding all users from the content KOSA defines as harmful.
Harmful content violates the terms and conditions of most platforms and is already moderated. Companies work to demote and remove content that falls under the “harmful content” definition, but it is currently technically impossible to do this work with 100% accuracy. Bullying and harassment can also occur in real time, as in video game chat rooms. If a video game has users under 16, which many do, the company could censor certain words in the chat, but online language is constantly evolving, with new abbreviations and emoji combinations emerging to evade filters.
It is unclear what additional steps firms would need to take to comply with KOSA’s broad mandate. There are currently no enforceable age restrictions on the internet, so KOSA raises the same enforcement challenges as COPPA. Some sites might respond with heavy-handed measures: barring users under 16 from creating accounts (as many now do for children under 13 because of COPPA), requiring proof of age at sign-up, or moderating content for all users, including user-generated content that depicts products that are legal for adults, such as alcohol and cigarettes, in the name of protecting youth.
There is little doubt that young people need protection from harmful internet content. But the proposed bills are broad in scope, applying to any website, social media platform, video game, or app that connects to the internet. Absent universal privacy protections for all Americans, and given the First Amendment and the other challenges of moderating internet content, these bills may prove difficult to enforce and are likely to face legal scrutiny. While they probably won’t alter the internet as we know it, more time is needed to understand how they would affect platforms and content.