Of all the organizations that find themselves facing scrutiny for their data privacy practices, Google currently occupies the hottest seat. The company isn’t alone: Even as Amazon and IBM face similar high-profile pushback, nearly everyone based in Silicon Valley is still shaking off the stink generated by the Facebook/Cambridge Analytica scandal of 2018.

But Google’s $2.1 billion acquisition of Fitbit has reportedly drawn antitrust scrutiny from the Department of Justice. Similarly, its Project Nightingale, based on collecting health records from Ascension, a St. Louis-based chain of 2,600 Catholic hospitals, has earned it a special scolding from the U.S. Senate — not to mention plenty of questions from the U.S. Department of Health and Human Services.

Some critics are even going so far as to herald Google’s moves as the coming of the privacy apocalypse. This has health marketers asking themselves questions: Are they doing enough to assuage worries about their data practices? And are they accounting for the very real possibility that the industry’s image, already less than stellar, is on the line?

“The stakes are super high because privacy problems aren’t what the industry needs right now,” says Crossix Solutions co-founder and CEO Asaf Evenhaim. “It’s important to look at what can be done with data, but it’s just as important to look at what we shouldn’t do.”

The potential for data gaffes is as unlimited as the potential for data’s effective use. And when things go wrong, regulators, providers and consumers will be as happy to villainize big healthcare as they are to blame big tech.


“There will be an over-eagerness for people to say the industry has done something just for a commercial advantage, without thinking through the consequences,” Evenhaim explains. 

Google, which declined to comment for this story, bought Fitbit even as it was already under scrutiny for knowing too much about, well, everybody and everything. In a recent interview with The Wall Street Journal, Dr. David Feinberg, recently appointed head of Google Health, sounded a reassuring note: “I came here to make people healthy, I’m not here to sell them ads.”

But because Google makes most of its money selling ads, many aren’t buying it. And now the company will know how often 28 million Fitbit users move, eat and sleep.

“The question becomes not only how to differentiate between the domains of health data, but also how to harness the power of deep learning and responsibly manage the convergence of medical-grade personal health information with real world evidence, such as patient-generated data from wearables including Fitbit, which are not covered by HIPAA. A gray area emerges,” notes Dr. Jennifer McCaney, executive director of UCLA BioDesign.  

Watchdog groups such as the Center for Digital Democracy also have their hackles up: “If this acquisition is approved, Google will… increase its already massive store of consumer data, including highly sensitive health and location information,” the group wrote in a letter to regulators. 

Project Nightingale has unleashed similar concerns. It’s troubling to privacy advocates because the data includes patient names, birthdates, lab results and hospitalization records. Advocates hate that most patients are in the dark, incorrectly believing HIPAA protects their medical records.

Still, many health technologists and marketers are eager to see the industry’s wealth of data be leveraged to create better predictive analytics, better medicines and more efficient care. McCaney believes clinical-trial management has “tremendous potential” and that the success of entrants such as Science 37 and Evidation Health “motivate applications at scale for Google’s new treasure trove of data. Could Google assist with clinical-trial recruitment or patient monitoring, or even become a full-fledged clinical research organization?”

Regardless of what tech companies are doing, it’s up to all players in the medical marketing universe to make sure they’re as mindful of privacy as possible. And that presents a big problem, since many are still hewing to aughts-era privacy guidelines. HIPAA, it’s worth repeating, was passed in 1996, two years before Google was founded.

“The rules were created before the digital age,” says Ogilvy Health chief digital officer Ritesh Patel. “HIPAA compliance and standard rules around opt-in have been around for more than a decade, so most life sciences, insurance and health system folks have good regulatory environments and training.”

Ah, but when you toss in data from wearables, connected devices, TVs, computers, phones, tablets and social platforms, the guidelines get vague fast. “What may look like an innocent ability to anonymously track behavior via cookies could blow up quickly as customers navigate across platforms and channels and also throw off potentially personally identifiable signals,” Patel continues. 

To keep up, Patel believes brands and partners have to continually evaluate data-management policies and procedures, as well as the generation of data by new digital tools and platforms. 

“How are the people we work with collecting and managing the data we use? Who has access to what? How are they ensuring the separation of personally identifiable information from behavior data?” Patel asks.

McCaney’s course covers four pillars to consider when thinking about the pitfalls of sharing any health information: security, application, consent and compensation. These pillars are more important than ever, given that consumers and HCPs are increasingly aware of what could go wrong. They are better able to distinguish security risks – say, a breach that allows scammers to snatch their personal data for medical identity theft – from threats to their privacy engineered, unintentionally or otherwise, by tech companies. No, Facebook, they haven’t forgotten.

And unlike data pilfered in banking and retail breaches, which can often be “fixed” by mailing out new cards or changing a PIN, medical data exists in a different psychological realm. “Your healthcare records are more personal; they are part of you. Once your medical information, including genetics, is out there, it’s out there to stay,” Evenhaim says.

Let’s not forget that consumers themselves are evolving on privacy questions. While younger people can come across as perplexingly cavalier about sharing personal information, the rise of companies such as Hu-Manity.co, which builds tools that help consumers take control of their data, suggests consumers are keenly aware of how that information can be used against them by insurance companies and other health and wellness players. Sooner or later, they’re likely going to ask to be paid for all the data they have freely (and in many cases unknowingly) coughed up.

“They see their personal profile and activity as a badge of honor and are publicly sharing things that some would find amazing,” Patel says. “But they are becoming aware of the value of privacy and their data.”