Though you might not know it from the spotty coverage of such crises, pharma and healthcare organizations have endured an abundance of data privacy headaches in recent years.

Yes, Merck was publicly tsk-tsked after bad actors successfully pierced its defenses in June 2017. But how about health system UnityPoint Health, which was breached twice in 2018 alone, potentially exposing 1.4 million patient records? Even the supposedly impregnable Healthcare.gov found its records compromised, with the Centers for Medicare and Medicaid Services acknowledging that an estimated 75,000 personal files were surreptitiously accessed this past October.

Yet somehow a widespread perception still exists that pharma and healthcare have dodged most of the data-privacy bullets fired in their general direction. Sure, industry wags reason, health organizations have dealt with their share of encounters with hackers and phishers and other digital miscreants, but nothing on the scale of what Experian or Marriott endured. Perhaps that’s what prompted one exec to quip, when asked about his level of confidence in the data privacy and security practices of his colleagues, “We’re fine unless somebody blows it — in which case we’re [in trouble].”

That statement alternately bewilders and horrifies data mongers across the industry, less because they believe it contains a kernel of truth than because it seems to minimize the seriousness with which they are attempting to address all such concerns.

“It’s not a question of whether the industry is thinking about [data and privacy], because it is,” says Carlos Rodarte, founder and managing director of digital health consultancy Volar Health. “My worry is that there’s no real agreement in how we’re thinking about it. Is it about data ownership or consent? Is this a technology problem or is it a values one?”

These questions, he adds, aren’t ones the industry can afford to brush aside. “In healthcare and life sciences research, there’s a natural model toward progress and learning and more education. But how do you get there? Through more and more data.”

We’re fine unless somebody blows it — in which case we’re [in trouble]

An undisclosed healthcare executive

Pharma and its partners have their eyes wide open to the threat posed by the potential mishandling of sensitive patient information. They’re also aware just how thin a line they’ve been walking over the years.

“The moment ‘enter’ has been clicked in an EHR, your data has been monetized by others,” explains Craig Lipset, head of clinical innovation in Pfizer’s global product development group and an adviser to Hu-manity.org, an organization that champions continuous consent and choice around data-sharing. “That’s what makes this so tricky. The No. 1 source of bankruptcy is health-related debt, so you have people thrown into bankruptcy and on the back end of that [their personal data] is being monetized by others? That’s hard to swallow.”

This may be why there’s an increasing push to realign incentives around data sharing among patients. The thinking goes something like this: If patients somehow share in the value of their data, they’ll be more inclined to share the data itself — or more OK with the sharing that takes place without their explicit permission.

What form this might take is anyone’s guess. Also, in health it’s never as simple as asking patients to check a box and calling it an afternoon. As Rodarte puts it, “Expecting everyone to own his or her data and say, ‘Here’s what I want you to do with it or not do with it’ — I’m not sure that’s going to work.”

A culture around privacy?

None of this is to say the industry is on the precipice of a crisis. In healthcare, more than in just about any other vertical, there’s a long-established culture around privacy. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) set rules for data security and privacy well before most other industries faced anything comparable. “Before big data was a thing, HIPAA existed. Facebook and Google — and really, almost everybody else — haven’t grown up in a world of regulation,” says Kevin Troyanos, SVP, analytics and data science at Saatchi & Saatchi Wellness.

However, that culture may not be as entrenched as some of its boosters believe. The world’s biggest pharma companies have tightened up their data practices, especially after seeing how Merck was dragged over the coals following its breach. Most A-list EHR providers, hospital systems, and insurers are similarly buttoned up.

The problem is many smaller players haven’t followed suit. Data privacy advocates express concern bordering on hysteria about the practices of app makers who have ingratiated themselves with health and tech systems.

“There’s no great way for industry to evaluate and hold solution providers accountable for how [data] is being used,” says Asaf Evenhaim, cofounder and CEO of pharma data/analytics supplier Crossix. “The way for the ecosystem to protect itself is for everybody to act, not just say, ‘Hey, we’re HIPAA-compliant, we’re safe.’ Without a strong foundation in place, there could be huge blowback.”

Iyiola Obayomi, senior director, marketing analytics at Ogilvy Health, agrees, adding, “Sometimes you wonder if everybody is paying enough attention to evaluating all the different partnerships they have. At times, some organizations might pass on the responsibility [for ensuring data security and privacy] to third-party partners.”

This, in turn, could spur a larger crisis. Talk to any tech or marketing exec within the health ecosystem and you’ll hear about a quenchless thirst for super-smart AI use cases. It’s not an overstatement to say the industry’s top craving is for better predictive analytics.

Sometimes you wonder if everybody is paying enough attention to evaluating all the different partnerships they have

Iyiola Obayomi, Ogilvy Health

Ah, but those analytics are fueled by bytes upon bytes of data. So let’s say a high-profile privacy breach in the healthcare realm prompts thousands of consumers to opt out of any/all sharing of their personal information — to the extent health organizations allow them to do so, anyway. Should the data stream stop flowing, the analytic tools upon which the industry is increasingly relying won’t work as well.

“Everybody makes the assumption the data is there and that it always will be, but it’s a mistake to assume the data supply chain is sustainable,” Lipset explains. “When there’s increased transparency around how data is being brokered and sold and how patients aren’t being asked to participate in it, that will make the chain unstable.”

Then there are concerns about the potential reidentification of deidentified data. HIPAA may well have established a baseline of sorts, in that it required the scrubbing of just about every morsel of information that could be linked back to a specific individual. As a result of those constraints — and the fact they were enacted in 1996, several years before your mom or dad opened an AOL account — healthcare started the big-data era with a big security/privacy lead over most other industries.
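To make that scrubbing concrete, here is a minimal, purely illustrative sketch of Safe Harbor-style de-identification applied to a hypothetical patient record. The field names and the abbreviated identifier list are assumptions for the example, not a reference implementation of the rule.

```python
# Illustrative sketch only: HIPAA Safe Harbor-style scrubbing of a
# hypothetical patient record. Field names are invented for the example.

# Direct identifiers that Safe Harbor requires removing (abbreviated list).
DIRECT_IDENTIFIERS = {"name", "street_address", "phone", "email", "ssn", "mrn"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers dropped
    and dates/ZIP codes coarsened (simplified for illustration)."""
    scrubbed = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Generalize the birth date to a year and truncate the ZIP code.
    if "birth_date" in scrubbed:
        scrubbed["birth_year"] = scrubbed.pop("birth_date")[:4]
    if "zip" in scrubbed:
        scrubbed["zip3"] = scrubbed.pop("zip")[:3]
    return scrubbed

patient = {
    "name": "Jane Doe", "ssn": "000-00-0000", "zip": "94110",
    "birth_date": "1970-06-15", "diagnosis_code": "E11.9",
}
print(deidentify(patient))
# {'diagnosis_code': 'E11.9', 'birth_year': '1970', 'zip3': '941'}
```

Note that what survives (a birth year, a truncated ZIP code, a diagnosis) is exactly the residue that reidentification efforts latch onto.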

That advantage may not survive today’s all-data-all-the-time information landscape.

“I don’t know if we, as an industry, understand that [data] reidentification isn’t that hard when you connect enough information streams together,” Troyanos says. “When you leverage location data and apply some machine learning on top of that, it becomes like a trail of crumbs that leads you back to reidentification.”
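What Troyanos describes is essentially a linkage attack: the quasi-identifiers that survive scrubbing can be joined against an identified data stream until only one candidate remains. A hypothetical sketch of that join, using invented records and an assumed set of quasi-identifiers rather than any real dataset:

```python
# Illustrative sketch of a linkage attack on "de-identified" records.
# All data here is invented; the quasi-identifier choices are assumptions.

# A "de-identified" claims record: no name, but quasi-identifiers remain.
claims = [
    {"zip3": "941", "birth_year": "1970", "sex": "F", "diagnosis_code": "E11.9"},
    {"zip3": "606", "birth_year": "1985", "sex": "M", "diagnosis_code": "J45.40"},
]

# A second, identified stream (e.g., a public or commercial record set).
public_records = [
    {"name": "Jane Doe", "zip3": "941", "birth_year": "1970", "sex": "F"},
    {"name": "John Roe", "zip3": "606", "birth_year": "1985", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip3", "birth_year", "sex")

def link(claim: dict) -> list:
    """Names in the identified set whose quasi-identifiers match the claim."""
    key = tuple(claim[q] for q in QUASI_IDENTIFIERS)
    return [r["name"] for r in public_records
            if tuple(r[q] for q in QUASI_IDENTIFIERS) == key]

for claim in claims:
    matches = link(claim)
    if len(matches) == 1:  # a unique match re-identifies the record
        print(matches[0], "->", claim["diagnosis_code"])
```

Add a location trace or a few visit dates to the join key and, as Troyanos suggests, the trail of crumbs narrows very quickly.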

Regulation and governance

While experts seem split on the question of whether data privacy concerns in healthcare will intensify, good luck finding anyone who believes they’ll evaporate. California’s new privacy law, the California Consumer Privacy Act (CCPA), isn’t as far-reaching as Europe’s General Data Protection Regulation (GDPR), but it will still change the game for anyone who operates in North America when it kicks in next January. If you’re looking for a reason that so many health-adjacent organizations are scrambling to modernize their data practices, look no further.

That’s likely to continue throughout 2019. “We should all assume GDPR is a global policy, because, frankly, it’s cheaper to act that way,” Troyanos explains. “If one state has data regulation that’s more restrictive than every other state’s regulation, you can either create two separate [data] policies or you can be more conservative overall. Maybe it makes sense to be three or four steps more conservative than what the most restrictive governing law says you have to be.”

Lipset puts it even more succinctly: “California ain’t Wyoming. That’s a pretty big chunk of the population you’d have to write off right out of the gate.”

One way health organizations might choose to get ahead of the curve is by taking concrete action to get their privacy and security houses in order, which necessarily includes pressing partners with access to their data to do the same. If those partners can’t provide verifiable assurances, it might be time to get new partners.

Largely due to the aura afforded by HIPAA compliance, health-adjacent companies have assumed a degree of inoculation from such threats. But in today’s data climate, that kind of thinking will get a company hacked.

“Because pharma companies are benefiting from the analytical value of being able to communicate with patients, they have a responsibility to be a stakeholder in the privacy discussion and get more educated about the risks,” Evenhaim notes. “HIPAA compliance is one thing. People’s privacy thresholds are another.”

I don’t know if we, as an industry, understand that [data] reidentification isn’t that hard when you connect enough information streams together

Kevin Troyanos, Saatchi & Saatchi Wellness

In response, look for organizations to develop data governance as an internal discipline, much as they have data science and other analytics functions. “Every industry, not just health, needs to be taking a closer look at what data governance really means,” Troyanos says. “There has been so much growth in data science, analytics, and engineering roles, but there hasn’t been that same level of growth in data governance.”

Health organizations will also have to do a better job educating their own people, especially those who might not be as acquainted with the intricacies of permissioning data. “If the industry wants to really benefit from all this data, it needs to elevate privacy in a way that it can be consistently evaluated,” Evenhaim explains. “Privacy is complex and sometimes very technical. A marketing person or brand manager might get excited about a particular capability, but they’re not [data] experts and they shouldn’t be asked to make those decisions.”

Rodarte agrees. “There’s still some capability-building that’s needed. Right now, everyone’s in the mindset of ‘Give me all your data and we’ll figure out the details later.’ That’s potentially dangerous.”

By way of example, he points to a wide range of roles within pharma — global medical affairs, development, marketing and sales — and stresses how goals and motivations vary wildly among them. “It’s hard to make blanket assumptions about data when you’re comparing a marketer to a genomics researcher. You can’t really have a textured discussion about privacy without asking, ‘Well, who’s on the other side of it?’”

Fair trade data?

Finally, data wonks in the healthcare world might look for inspiration from — big coffee? Let Lipset explain. “If you were a coffee drinker 10 years ago, you had no choice but to exploit some worker somewhere in Central America because of the nature of the coffee-bean supply chain. Were you a bad person? Of course not. Fast forward to 2019, when you have a choice as a consumer to buy coffee with the fair-trade label, which means the supply chain respects workers’ rights.

“I think we’ll see the same thing with data. As an industry, we’ve been buying it and relying on it for years. Are we bad for doing this? No — again, there’s no real choice otherwise. But if one of our options is similar to fair trade — data sourced from patients who explicitly permissioned that data — I’m going to go with the fair trade one.”

If there’s reason for optimism, it’s that pharma and health institutions appear to be on the same page when it comes to the pressing need for a privacy fix. And while it’s equally naive and simplistic to assign a default attitude to a mammoth, sprawling business such as healthcare, nearly every player’s intentions appear to be in the proverbial right place.

Right now, everyone’s in the mindset of ‘Give me all your data and we’ll figure out the details later.’ That’s potentially dangerous.

Carlos Rodarte, Volar Health

“The industry has a legacy of using data in a way that’s deidentified and respects privacy — that’s not the issue here,” Evenhaim explains. “Frankly, it’s more about how the world has changed. This isn’t just about healthcare. So much of this is being driven by what’s happening outside healthcare.”

Promising recent signs include industry support for organizations such as Vivli, a nonprofit that facilitates the sharing of individual participant-level data from completed clinical trials, bolstered by rich privacy and security capabilities. Indeed, most data professionals, especially those who work closely with big pharma or ply their trade within its walls, believe the industry has an opportunity to get on the right side of history, so to speak. By acting now, they say, pharma and healthcare organizations will look like good actors several years down the road.

“[Companies] don’t lose or win if the ownership of data goes to patients or if it goes to the data aggregators. Either way, they’re paying for the data,” Lipset says. “Wouldn’t they rather be a part of creating that fair trade data supply chain? There will be a halo effect of being part of that change.”

Rodarte, for his part, expects many health and pharma organizations to embrace further regulation around data and privacy, as against-type as it sounds.

“The companies genuinely trying to do good are going to push for regulation in this space,” he says. “They have nothing to hide, so why wouldn’t they attempt to differentiate themselves on values?”
