Health misinformation was a scourge for many years prior to the widespread adoption of social media. But when Facebook, YouTube, Twitter and the like burrowed themselves into our collective consciousness, the impact of such misinformation became supercharged. Never before had there been such a regrettably perfect environment for unfounded theories to sprout and spread. 

Even in that context, 2020 was a challenging year, with the perennial issue of health misinformation taking on a new, furious urgency. On social media, conspiracy theories were minted and disseminated with abandon, aided by leaders who not only trafficked in misinformation but actively promoted it on national television. President Trump may be out of office, but the miasma persists.

“The concept of misinformation has become front and center,” says Publicis Health Media president Andrea Palmer.

The challenge is intensified amid a climate in which a significant percentage of Americans are distrustful of publications, healthcare systems and institutions in general — and it creates an especially acute problem for health publishers. The stakes are high: The ability to convey accurate information can, quite literally, be a matter of life and death. 

As vaccination efforts accelerate and the country looks ahead to a return to something approximating normalcy, misinformation around vaccines is top of mind for health professionals and publishers alike. Countering it, alas, can be an uphill battle.

“There are so many storylines out there,” Palmer notes, pointing to inaccurate views on everything from herd immunity to vaccine safety and efficacy to the financial incentives behind vaccination efforts.


A nonprofit that reports on the accuracy of statements made by political figures as well as broader misinformation, PolitiFact has invested significant time and energy into fact-checking news and theories around the vaccine, according to Angie Drobnic Holan, the site’s editor-in-chief. In recent weeks, this has included claims made on cable news that the pandemic is over, that life can go back to normal and that there’s no need to wear masks anymore. There have also been myriad conspiracy theories — for instance, that if you get the vaccine, you will be auto-enrolled in a pharma medical trial. The worst likely isn’t behind us: As distribution accelerates, new misinformation will continue to emerge. 

Disinformation around the vaccination effort is part of a wider trend, in which competing theories and unverified claims run rampant. Thanks to social media, the barriers to reaching a large audience have largely dissolved. In many ways, this is a good thing: Under-represented voices and topics have a chance to flourish and find a wider audience.

But the removal of gatekeepers can also amp up the noise to the point where legitimate signals are hard to detect and verify. 

“Anyone can post a blog and be viewed by some people as an expert, and anyone can have a YouTube channel,” notes Dr. John Whyte, chief medical officer of WebMD. And because social platforms reward engagement above virtually all else, he adds, “The most provocative voices get optimized in search.”

Actress and model Jenny McCarthy, who started promoting an anti-vaccination agenda and conspiracy theories in the late 2000s, was an early example of a non-medical personality using her profile and influence to promote health misinformation. “It has ballooned ever since then,” Whyte says. 

Meanwhile, changes to social media companies’ algorithms poured gasoline on a raging fire. Over the course of a career in various areas of healthcare and medical research, Palmer has consistently encountered unfounded theories that exist despite extensive, opposing clinical research and even medical consensus. And thanks to online networks that reward engagement, these belief systems are set up for further expansion. 

When PolitiFact was founded in 2007, it fact-checked statements made by pundits and politicians. By late 2016, it had broadened its mission to cover more general, and often anonymous, theories that were taking off online. By that point, social media platforms such as Facebook and YouTube had codified virality into a feature.

Misinformation is both sticky and alluring, particularly when it comes to health. Unverified medical claims offer a false sense of hope and clear-cut answers that can’t be found in peer-reviewed scientific literature. A fast, easy online search reveals an abundance of supplements that purport to cure virtually any ailment, from cancer to obesity.

“The issue with all of this is they are unproven therapies,” Whyte says. 

Dr. Ivan Oransky, editor-in-chief of Spectrum and a professor at New York University’s Arthur Carter Journalism Institute, agrees, adding, “There’s that saying, ‘A lie gets halfway around the world before the truth puts its boots on.’”

Oransky notes that the sentiment was first expressed before the wide-scale adoption of social media. Today, of course, “That lie probably has circled the Earth several times before truth puts its boots on, because everybody has a platform now. The algorithms are built to expressly amplify the lie and make it reverberate,” he continues.

On a more basic level, drama and novelty sell. Before joining WebMD, Whyte spent a decade as the chief medical expert at Discovery Channel. The health shows that drew the highest ratings featured rare conditions: stories about hundred-pound tumors, women who didn’t know they were pregnant until giving birth and chimerism. Whyte still remembers these segments as both shocking and compelling.

However, whether they delivered important health information remains an open question. For a more straightforward show on, say, diabetes, attracting eyeballs is “going to be more difficult,” Whyte acknowledges. 

Compared to dramatic theories that play into existing belief systems, real information is inherently more complicated, according to Dr. Amit Phull, medical director and VP of strategy and insights at Doximity. “It’s less entertaining and more difficult to remember,” he explains.

That doesn’t mean, of course, that verified health data can’t be digestible or compelling. “You can use the same mechanisms that supercharge misinformation and actually employ those tactics to get real information across to people,” Phull adds. 

With that in mind, here are some of the tactics that publishers have deployed to push back the rising tide of health misinformation. 

  • Make it accurate. This should go without saying, but the first tenet of combating misinformation is to refuse to traffic in it. For health publishers, standards should be higher: They must commit themselves to producing content that is well-sourced, verified and clear. 

    Doximity, for example, has an editorial team that works with medical fellows and other specialists to curate and vet content that goes up on the site or is distributed through its network. It also reviews user submissions from Doximity’s network of healthcare professionals. In addition to providing feedback and guidance, the team blocks content that doesn’t provide sufficient sourcing or is rife with conflicts of interest. 

    Verywell Health, a site focused on health and wellness, has a medical review board and employs a chief medical officer, according to SVP and GM Rob Parisi. “If content isn’t accurate in the health space, then we have nothing,” he says plainly.
  • Make it digestible. Medical and health data often comes packaged in lengthy, complicated studies and reviews. For lay audiences, such formats are more likely to confuse than enlighten. Not surprisingly, health publishers are finding success translating complex topics into digestible, bite-size pieces of content.  

    “Short videos, just a couple minutes in length, are very successful nowadays,” Whyte says. Other accessible formats include slideshows, newsletters, Q&As and blog posts. 

    Phull agrees, stressing that “being concise has never been more important.” He believes a significant part of Doximity’s value lies in its ability to translate dense content into formats that are engaging and accessible. Internally, in fact, employees joke that Doximity’s content is “like Tasty videos for medicine.”

    The company’s approach focuses on concise blog posts and short-form videos, two formats that work well in combination. Watching a video helps viewers bookmark information they can later reference in text form, and vice versa. “That one-two punch is actually very powerful,” Phull says. 

    Beyond its extensively researched content, PolitiFact pairs each article with its Truth-O-Meter. The simple graphic, a ticker resting on a scale that ranges from false to half-true to true, leans on dramatic imagery to convey nuanced information at a glance.
  • Make it customizable. Just as bite-size, accessible pieces of content are important, so too are longer, more comprehensive resources. A varied approach, employing pieces of differing lengths, levels of expertise and detail, can complement one another, allowing users to choose how in-depth they’d like to go.

    At Doximity, lengthy papers and articles are distilled into succinct summaries and emailed to providers within the network. The reason? Medical information can be dense and users are busy. These top-line overviews, which often take the form of an infographic or video, don’t replace in-depth reporting or research. Instead, they offer a potential entry point for a later deep dive.

    “We refer to it internally as layering,” Phull notes. Shortened summaries allow physicians to efficiently sift through, flag and return to the clinical research and reporting that interests them. 

    Somewhat counterintuitively, repetition is a valuable tool. “You have to have a variety of messaging, and you have to repeat messaging,” Whyte notes. This goes double for conditions such as the coronavirus or obesity, where scientific understanding has evolved quickly. Multiple resources build on each other, creating a layered body of content that users can engage with on their own timeframe.

    PolitiFact’s Truth-O-Meter is a simple but effective example of this. “We wanted something people could read on different levels,” Holan says. If someone just wants to know whether a theory is true or not, a glance at the graphic will tell them. If another individual wants a more in-depth, nuanced look at the veracity of a given statement, they can turn to the full article.
  • Make it personal. Another reason misinformation and conspiracy theories are so compelling: They tap into the personal. By leading with anecdotes and emotion, they feel specific in a way peer-reviewed studies and reported pieces rarely do. At the same time, many experts believe empathy and personal connection can be just as effective at disseminating accurate, responsible health messaging and combating misinformation. 

    Take Verywell Health, which strives to present credible health information with “an empathetic tone that doesn’t read like it’s coming from a medical textbook,” Parisi says. The site’s vibe and style are informed but not formal; it strives for clarity and accessibility without compromising on accuracy.

    At the end of most Verywell articles, a section called “A Word from Verywell” offers insights and takeaways from the content that precedes it. The format provides a way to “step back and acknowledge we are talking to a person,” Parisi notes, often by providing tangible next steps “instead of simply presenting the information to them and having them fend for themselves.”

    For Phull, a physician who also works in emergency medicine, empathy is one of the most effective ways of combating misinformation with patients. “I’ve had the best success in my own career by just leveling with people,” he explains.

    To that end, Phull doesn’t judge or shame anyone based on information or beliefs. Instead, he tries to listen before offering a (presumably more factual) counterpoint. It’s an approach based on treating patients as human beings, a simple but fundamental act.

    It’s just as important that patients similarly view healthcare workers as human beings. To communicate the importance of getting the COVID-19 vaccine, as well as its potential side effects, Whyte took to Instagram Live to document his experience receiving both doses.

    “I talked about my symptoms. I did have a headache, I did have chills, I was very tired for a day or two,” he recalls. “But I was transparent.”

    The way Whyte sees it, providing patients with relevant scientific information in a human manner is perhaps the best antidote to misinformation. By focusing on the personal and refusing to condescend, such messages prioritize empathy. “It’s a very effective approach,” Whyte adds.