As misinformation continues to taint discourse around everything from public health to climate change, some Democratic lawmakers have had enough.

Sen. Amy Klobuchar (D-Minn.) has introduced a bill that would remove liability protection from Facebook, Twitter and their ilk if health misinformation continues to spread unabated on their platforms. The bill arrives as anti-vaccination posts proliferate across social media.

“Earlier this year, I called on Facebook and Twitter to remove accounts that are responsible for producing the majority of misinformation about the coronavirus, but we need a long-term solution,” Klobuchar said in a statement. “This legislation will hold online platforms accountable for the spread of health-related misinformation.”

The bill would create an exception to Section 230 of the Communications Decency Act of 1996, which shields websites from liability for content posted by their users. If it becomes law, the U.S. Department of Health and Human Services would be responsible for creating guidelines on what constitutes health misinformation – and determining when platforms would be liable for spreading it.

“I think it’s a move in the right direction, because right now there’s very little accountability for the industry,” said Darrell West, a senior fellow at the Brookings Institution’s Center for Technology Innovation. “In the United States, if you want to change the behavior, you have to create the financial incentives for companies to move in the right direction. This bill would do that by providing more legal liability for what happens on the website.”

During the pandemic, a wide range of COVID-19 misinformation has circulated online, concerning everything from the virus’ geographic origin to the health risk it poses to the efficacy of unconventional remedies. In recent months, much of the misinformation has focused on the safety and efficacy of vaccines.

“Right now, we’re seeing rising cases owing mainly to people who have refused to take the vaccine because they worry about it not being safe,” West explained. “These people have heard misinformation that the vaccine is not safe – which leads them to not get vaccinated, and now we’re seeing an upsurge in cases. There have been real health consequences.”

As it currently stands, there is almost no government regulation of social media platforms and no legal prohibition on spreading misinformation. Only recently have the platforms voluntarily begun to address the issue.

“Some have human moderators, just entry-level workers who are looking at the site and making their own determination of ‘this looks okay, this does not look okay,’ and they have the power to take things down,” West explained. “But that’s like whack-a-mole. There’s just so much misinformation that humans cannot possibly keep up with it on a case-by-case basis.”

Companies have started to develop algorithms that identify phrases associated with misinformation and automatically remove content. Earlier this month, YouTube announced it would take a more pointed approach to targeting health misinformation with new features that would prioritize authoritative content from hospitals and governmental organizations.
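As a rough illustration only – not any platform’s actual moderation system – automated flagging of this kind can start with something as simple as matching posts against a list of phrases tied to known false claims and routing matches for removal or human review. The short Python sketch below uses invented phrases and an invented threshold purely to show the basic idea.

```python
# Illustrative sketch only -- not Facebook's or YouTube's real moderation pipeline.
# The phrases and threshold are invented for demonstration.

FLAGGED_PHRASES = [
    "vaccines contain microchips",   # example of a known false claim
    "covid is a hoax",
    "5g causes coronavirus",
]

def flag_post(text: str, threshold: int = 1) -> bool:
    """Return True if the post matches enough flagged phrases to warrant review."""
    lowered = text.lower()
    hits = sum(phrase in lowered for phrase in FLAGGED_PHRASES)
    return hits >= threshold

# Posts returning True would be queued for removal or sent to a human moderator.
print(flag_post("They say vaccines contain microchips!"))  # True
print(flag_post("I got my second dose today."))            # False
```

Real systems are far more sophisticated – using machine learning rather than literal phrase lists – but the trade-off West describes is the same: automation scales where human review cannot, at the cost of cruder judgments.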

But in some ways, the attempts are too little, too late. President Biden recently commented that social media platforms were “killing people” by not filtering misinformation, urging companies to “do something about… the outrageous information about the vaccine.”

“What concerns people is that we’re 18 months into the pandemic,” West said. “Where have the companies been?”

The social media platforms themselves, not surprisingly, view things differently. In a recent blog post, Facebook’s VP of integrity Guy Rosen argued that the company isn’t to blame for rising COVID-19 cases in the U.S., noting that 85% of its U.S. users have been or want to be vaccinated. Facebook also pointed out that, since the start of the pandemic, it has removed more than “18 million instances of COVID-19 misinformation.”

“The fact is that vaccine acceptance among Facebook users in the U.S. has increased,” Rosen wrote. “These and other facts tell a very different story to the one promoted by the administration in recent days.”

Still, West believes the companies need to put in place more rules to stem the flow of misinformation. The misinformers aren’t taking the efforts lying down, either: Some anti-vaccination groups have started communicating in code, changing their names to “Dance Party” or “Dinner Party” to avoid Facebook disciplinary action.

“[The proposed bill] is probably not sufficient by itself,” West said. “The companies have to get much more serious about content moderation and get very aggressive in taking down sites that are spreading misinformation.”

That could include rules that broaden the definition of misinformation to cover material that runs contrary to accepted medical expertise. If such rules were enacted, they would lead to the removal of content that isn’t in line with peer-reviewed studies, Centers for Disease Control and Prevention guidelines or even widely accepted scientific thought.

A recent report found that just 12 people, dubbed the “Disinformation Dozen,” are behind most of the misleading information about COVID-19 vaccines spreading across social media platforms. Several are anti-vaccine activists or alternative-health entrepreneurs.

“We have a pretty good sense of the major people who are doing it, but right now a lot of that information is still online,” West said.

Overall, the latest moves by lawmakers and social media platforms alike signal a change in how misinformation is viewed – and how it should be addressed moving forward.

“It is a shift in mentality, in the sense that we’re seeing misinformation on a wide variety of fronts,” West said. “You see it in the vaccine area in regard to COVID-19. There are climate change deniers and Holocaust deniers, and then there are a lot of false claims being spread about 2020 election fraud.

“The problem goes way beyond COVID, but COVID provides such a dramatic illustration of the health consequences of misinformation. It’s changing the way people think about the subject in general.”