Facebook, YouTube and Netflix have all experienced their share of controversies over the oft-misinterpreted notion of free speech, especially in the context of health and vaccine misinformation. Now it’s Spotify’s turn.

Last weekend, musician Neil Young announced he would pull his music from Spotify in protest of the platform’s support of Joe Rogan, who has been criticized for stoking vaccine skepticism on his hugely popular podcast. Since then, several other musicians, including Joni Mitchell and India Arie, have followed suit.

The controversy has reopened questions about the role major media platforms should play in combating health misinformation — and whether they should do more to stem the spread of inaccurate or misleading content.

In response to the Rogan backlash, Spotify published new platform rules around misinformation. Content subject to removal includes claims that COVID-19 isn’t real or that vaccines were designed to cause death, as well as content encouraging people to get infected with the virus to build immunity.

Jonathan Brady, group engagement director at FCB Health New York, characterized the rules as a first step in the right direction. He emphasized, however, that flagging misinformation only goes so far.

“Putting that policy out there is a good step,” Brady noted. “But there’s still the problem Spotify hasn’t solved, which is that the interview is out there. Rogan has a history of pushing not necessarily misinformation, but skepticism and confusion. That can be just as deadly in a public health crisis.”

The problem has caught the attention of the federal government, with President Biden calling out Facebook for “killing people” with COVID-19 misinformation. Democrats have pushed for legislation that would remove liability protections from tech companies complicit in the spread of inaccurate health information. In September, YouTube bolstered its pushback against inaccurate vaccine-related content by banning the channels of well-known misinformation spreaders.

But Spotify finds itself under fire for a different reason altogether: Rogan’s podcast is Spotify content, not user-generated material. The company’s $100 million licensing deal with Rogan makes him the platform’s highest-paid contributor, and his show ranked as Spotify’s most popular podcast in 2021.

“The distance between the platform and the person is compressed, because there’s a highly paid relationship there,” Brady noted. “Is the disclaimer sufficient? Not really, because in this instance you can create a policy that allows a loophole that’s sufficiently large for your most highly paid talent to walk through.”

The podcast episode that sparked the backlash featured Robert Malone, a vaccine scientist who has been deemed a major spreader of vaccine misinformation — to the extent that he has been banned from Twitter. Rogan responded with a 10-minute video streamed on Spotify, during which he admitted he could do more to add balance to his episodes. 

“I don’t know what else I can do differently, other than maybe try harder to get people with differing opinions on right afterwards,” Rogan said in the video. “And do my best to make sure I’ve researched these topics, the controversial ones in particular, and have all the pertinent facts at hand before I discuss them.”

“I’m not trying to promote misinformation. I’m not trying to be controversial,” he added. “I’ve never tried to do anything with this podcast other than just talk to people and have interesting conversations.”

That’s a familiar refrain from Rogan, and an oft-used line of reasoning among free-speech proponents. But Brady said that Rogan — and Spotify — have a higher responsibility to rein in conversations that carry potentially dangerous intent.

“The nature of human conversation and idea-generation is basically infinite,” Brady explained. “If someone’s intent is to promulgate racist beliefs, you can say, ‘Don’t say these 10 things.’ But language is malleable enough that you can find your way there.”

“At the end of the day, Spotify needs to protect its brand and satisfy its customers that it aligns with their core values, and take an editorial approach as well as a policy one,” he continued. “That’s the only way they have a long-term path out of this mess.”

Bre Thomlison, a managing director on Real Chemistry’s integrated media team, agreed that Spotify needs to prioritize brand safety.

“We’re still at a point where brands have control over the narrative,” she said. “You can compare this to gun issues, where you’ve seen Walmart stand up. It’s just a matter of time until digital platforms are going to be doing the same thing for health-related areas.”

Thomlison noted that Spotify could take any number of actions to stanch the flow of misinformation, such as hiring fact-checkers. Show hosts themselves could also add disclaimers flagging vaccine skepticism.

“As marketers we’re required to put disclaimers on all our content,” Thomlison said. “Joe Rogan should say, ‘This is for entertainment purposes,’ because he’s very influential.”

If Spotify chooses not to impose more editorial control over its podcast hosts and content creators, however, the company could attempt to amplify the voices of more public health leaders, said WebMD chief medical officer Dr. John Whyte. This could be paired with a focus on helping credible health leaders communicate more effectively — something the Centers for Disease Control and Prevention, for one, has struggled to do during the pandemic.

“Wouldn’t it be great for Spotify to help these public health leaders become better communicators?” Whyte said. “They could create a podcast with certain experts, like the Surgeon General, or help the CDC or the Department of Health and Human Services create a podcast. That would go a long way toward addressing misinformation.”

“The approach should be more about Spotify and others trying to give voice to experts, and helping them understand how to be successful in communicating to people in that space,” Whyte added. “A disclaimer alone is not enough.”