Most public health experts agree that unfettered social media access can be bad for kids’ mental health. 

Now, Congress is taking steps to do something about it — while also opening up a debate over whether such action would infringe upon free speech.

This week, the Senate passed two bills that would regulate social media usage among children in a bid to protect their privacy, safety and mental health.

The chamber voted 91-3 in favor of the bills, with only Sens. Ron Wyden (D-Ore.), Rand Paul (R-Ky.) and Mike Lee (R-Utah) voting against them.

It’s the most notable legislation around youth online safety to advance since the late 1990s — and a rare example of an issue that has garnered strong bipartisan support during a contentious election year.

“Our kids have been waiting too long for the safety and privacy protections they deserve and which this bill would provide. This is more important than ever with the growing use of AI,” President Joe Biden said in a statement after the bills were passed. 

Harms of social media

The conversation around the youth mental health crisis — and its link to social media and smartphones — has been slowly building since the COVID-19 pandemic, when depression and anxiety rates among young people skyrocketed.

In December 2021, Surgeon General Vivek Murthy issued a public advisory on the youth mental health crisis.

These advisories have historically been reserved for significant public health problems that require “the nation’s immediate awareness and action,” the Surgeon General’s report at the time noted.

Since then, Murthy has been vocal about the harms of social media on youth mental health and accumulating research has backed up his concerns. 

A recent report from SSCG Media Group found that 80% of pediatricians surveyed cited the rising use of smartphones and social media among kids as a significant risk factor for mental health issues.

KOSA, explained

One of the bills that was passed — the Kids Online Safety Act (KOSA) — was originally introduced in 2022.

The legislation seeks to protect children from some of the mental health risks associated with social media use. 

This includes exposure to content that amplifies eating disorders or suicidal ideation, as well as the addictive features of many platforms that are marked by “doomscrolling,” algorithmic feeds and autoplay.

At its essence, KOSA would create a “duty of care” for tech companies to prevent harm to minors. This would mean protecting them from content that involves bullying, sexual exploitation or violence, as well as content that promotes substance abuse, eating disorders or suicide.

KOSA would also require social media platforms to give minors more options to protect their data privacy and to disable features that have been linked to social media addiction.

Specifically, this relates to aspects of social media that “increase, sustain, or extend the use” of the platform. That could mean removing algorithm-driven suggestions, videos on autoplay or platform rewards for minors.

Finally, KOSA would allow independent researchers to conduct studies on how social media platforms are affecting youth mental health.

Under the proposed law, the Federal Trade Commission (FTC) would be tasked with monitoring social media companies that don’t comply with the content requirements.

The bill has received support from various health organizations, including the American Psychological Association (APA), which sent a letter to lawmakers last year urging Congress to move the legislation forward.

“[KOSA] takes important steps toward curtailing the harms posed to youth by social media use and content, while seeking to retain the benefits,” APA CEO Arthur Evans said in a statement. “The legislation also creates important new access for psychological researchers to data held by social media companies that is essential to further understanding the platforms’ impact on children.”

COPPA 2.0

The other bill — the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) — would update the COPPA law that created privacy protections for kids online in 1998. 

COPPA 2.0 would raise the maximum age of kids covered under that original law to 17, meaning tech companies wouldn’t be able to collect data on children under that age without their consent.

Given that technology has advanced in significant ways since 1998, COPPA 2.0 would also expand the law’s definition of personal information to include data such as facial recognition scans, fingerprints and voiceprints.

Even though the law is in need of an update, violating the original COPPA already carries consequences.

In 2019, the FTC cracked down on the Musical.ly app — now TikTok — for violating the original COPPA, alleging that the platform illegally harvested personal information from kids under the age of 13. The agency charged that the platform also didn’t safeguard the data from third parties. 

The company ultimately settled with the FTC for $5.7 million.

COPPA 2.0 might also affect advertisers. In particular, it would ban third parties from serving targeted advertising — which relies on the use of personalized data — to users under the age of 17.

While various nonprofits advocated for the provision, some advertising industry players have voiced opposition. 

The Interactive Advertising Bureau (IAB), an industry trade group, has opposed the bill for two years, claiming in a statement that companies would erect barriers to prevent children from accessing “any content at all.”

“Policymakers need to understand that digital advertising subsidizes safe, free content helping kids to learn, play and communicate,” Lartease Tiffith, executive vice president for public policy at IAB, said in a statement.

Tiffith added that small businesses are unprepared and that Americans would encounter a “much less user-friendly internet.”

Controversy over the bills

Still, the controversy over the bills hasn’t stopped there. The bills have also sparked a much bigger conversation around free speech and whether the legislation would allow for the censorship of content ranging from LGBTQIA+ issues to abortion.

The American Civil Liberties Union (ACLU) has been one of the loudest opponents of KOSA, arguing it would violate the First Amendment by allowing the government to determine what information people can access online.

“KOSA compounds nationwide attacks on young people’s right to learn and access information, on and offline,” said Jenna Leventoff, senior policy counsel at the ACLU, in a statement.

Among the ACLU’s concerns are crackdowns on content about transgender-affirming care or reproductive rights if a conservative-leaning FTC chooses to deem such material “harmful” to minors.

Meanwhile, on the other side of the aisle, Republican opponents fear KOSA could result in social media companies restricting anti-abortion content.

“We live on the internet, and we are afraid that important information we’ve accessed all our lives will no longer be available,” Anjali Verma, a 17-year-old high schooler who is part of an ACLU student lobbying campaign, told The New York Times.

For now, however, the bills are in flux until the House votes on them, which is not likely to happen until the fall, as the House just started its summer recess.

Still, Biden has said he would sign the bills if they pass in the House, noting that the current regulations on online safety for kids were “insufficient” to address the youth mental health crisis.