Amid mounting government scrutiny of the way they share sensitive information with third parties, health-related websites are facing pressure to firm up their data-handling policies.

That pressure intensified last week when the Federal Trade Commission took action against BetterHelp. The online mental health counseling service was disciplined over allegations that it shared consumer health data, which it had promised to keep private, with companies including Facebook and Snapchat.

As part of a settlement, BetterHelp will pay $7.8 million, which the FTC will use to partially refund customers who used the service between August 2017 and December 2020. BetterHelp, which admitted no wrongdoing, will also be banned from sharing consumer data for advertising going forward.

The decision comes about a month after the FTC reached a first-of-its-kind settlement with telehealth and drug coupon provider GoodRx. The accusations against BetterHelp closely resembled those the FTC leveled against GoodRx: namely, that the company had shared patient data with third parties like Google and Facebook for advertising. GoodRx agreed to a $1.5 million fine and remedies, though it, too, never admitted wrongdoing.

The FTC’s one-two compliance punch displays an “aggressiveness” that should make other digital health companies take note, observed attorneys from the law firm WilmerHale in a recent blog post. Such organizations must ensure that they obtain appropriate consent for uses of consumer information, and that their privacy policies and privacy-related public statements accurately describe their data-use practices, the WilmerHale team advised.

In other words, the takeaway from the two cases is that companies should “have a program in place to ensure their practices live up to their promises,” to quote the agency’s own guidance.

Doing so, however, will require a wholesale revamp, because improper data sharing is rampant across the digital health industry. A December 2022 investigation by The Markup and STAT found that virtual care websites like Hims & Hers, Ro and Thirty Madison were leaking sensitive medical information to Facebook, Google, TikTok and other advertising platforms.

Of the 50 digital health websites studied, 49 used trackers to capture URLs users had visited. Thirty-five shared personal information (full name, email addresses, phone numbers and the like), while 13 shared users’ answers to questionnaires. 
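
For readers unfamiliar with how these trackers operate, the sketch below (in TypeScript, with a hypothetical endpoint and cookie name, not any specific vendor’s snippet) shows the basic mechanism: a script embedded in the page reads the current URL and any visitor identifier it can find, then ships both to the ad platform. On a health site, the URL alone can reveal the condition or treatment being researched.

```typescript
// Minimal sketch of an embedded third-party tracker. The endpoint and
// cookie name are hypothetical. Note that the page URL itself can be
// sensitive on a health site, e.g. "/conditions/depression/intake".
function reportPageView(trackerEndpoint: string): void {
  const payload = {
    url: window.location.href,    // full URL of the page being viewed
    referrer: document.referrer,  // where the visitor came from
    visitorId: document.cookie    // a visitor ID cookie, if one is set
      .split("; ")
      .find((c) => c.startsWith("_visitor_id="))
      ?.split("=")[1],
    timestamp: Date.now(),
  };
  // A pixel request or beacon ships the data to the ad platform.
  navigator.sendBeacon(trackerEndpoint, JSON.stringify(payload));
}

reportPageView("https://tracker.example.com/collect");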

The companies’ leaky data practices don’t technically violate any privacy regulations. As experts have pointed out, the Health Insurance Portability and Accountability Act (HIPAA) was not built for telehealth.

Startups like Cerebral, Lemonaid, Nurx and Talkspace typically act as connectors between patients and HIPAA-covered providers with whom the sites are affiliated. As a result, data gathered during a telehealth firm’s intake may not be shielded by HIPAA, even though the same data shared with the healthcare provider would be.

That has left such companies to operate in a legal and ethical gray zone. But as the Markup/STAT piece points out, patients often assume their health information is protected by privacy regulations. Many of the same telehealth sites’ intake forms promise users that their software is HIPAA-compliant and that any information shared is kept private.

Thus patients often unknowingly put their sensitive health data at risk when they enter it on a personal device or in a mobile health-tracking (mHealth) app. That information is sold to various players. In addition to social media platforms, which leverage it to target ads, health data may wind up in the hands of data brokers.

While the extent of the relationships between data brokers and mHealth app companies is still hazy, a February study by Duke University’s Technology Policy Lab examined these ties, particularly with regard to the sale of mental health data. According to the study, mHealth data circulates within the data broker industry “in large quantities, with either vague or entirely nonexistent privacy protections.” 

According to the research, brokers were trading patients’ names and addresses, specific sensitive diagnoses (including ADHD and Alzheimer’s disease) and relevant prescribed medications. As a result, the author called for a comprehensive federal privacy law — or, “at the very least, an expansion of HIPAA’s privacy protections alongside bans on the sale of mental health data on the open market.”

As indicated by its recent enforcement activity, the FTC isn’t standing on ceremony. Its lawsuit against data broker Kochava for collecting and selling health services location data, filed last year, may be allowed to proceed, a federal judge ruled last month. With GoodRx and BetterHelp, the agency signaled a willingness to pursue non-HIPAA-covered entities for the sharing of identifiable, sensitive health data.

The FTC is starting to “cut off the inappropriate free flow of health data at its source,” said Shawn Flaherty, director of partnerships at software firm Tranquil Data, which helps companies use and disclose data in line with compliance requirements. 

The regulator is also creating what Flaherty and others see as a new compliance mandate. As the proposed consent orders in the BetterHelp and GoodRx matters stipulate, the companies must obtain “affirmative express consent” before disclosing personal information. That is, the sites must allow users to indicate their wishes unambiguously.

There is evidence the digital health ecosystem is starting to level up. Some of the telehealth sites mentioned in the Markup/STAT piece have since updated their privacy policies, including online therapy and medication firm Brightside Health (as of February 1) and Brightline, which offers virtual behavioral and mental health care (as of January 1). Thirty Madison, the digital prescription platform which recently merged with Nurx, refreshed its data-sharing terms as of December 1.

Language in most health sites’ privacy policies has been purposely vague about how patient data is handled and what kind of marketing has been occurring. That’s not only due to the risk-averse nature of legal departments: previously, a company might only get in hot water if it lied outright, so vagueness served as a safeguard.

But precisely tracking with whom they’re sharing data, and for what purposes, becomes hard to control for large organizations with millions of users and vast amounts of data coming in and out. Manually handling such processes (for instance, relying on someone to share the data in line with privacy requirements) is a recipe for failure.

Indeed, when confronted with the disconnect between what their consumer-facing privacy statements claim and what their ad technology is actually doing, even the privacy officers at these companies seem not to fully get it. As the Markup/STAT piece explained, “Marketing teams at these companies don’t fully understand privacy regulations, and legal teams don’t have a handle on how the marketing tools work.” 

In BetterHelp’s case, according to the settlement, the company was sending data to Meta, Snapchat, Pinterest and/or Criteo that may have included users’ “hashed” email addresses, which the social media companies subsequently used to identify users. From 2018 to 2020, BetterHelp’s marketing department was using these email addresses and the fact that the individuals had previously been in therapy “to instruct Facebook to identify similar consumers and target them with advertisements.”

To prevent such lapses, data compliance must be automated, which would require a big technology investment. Moreover, updating terms and conditions is just a first step. Per BetterHelp’s FTC settlement, companies seeking to avoid scrutiny must now disclose what data is shared, with whom (by name) and for what specific purposes. They must do so via an actual opt-in “apart from” any privacy policy or terms of service.

In other words, the site’s disclosure would need to occur “in some sort of consent flow during onboarding…so that a user can say, ‘OK, I’m alright with that,'” Flaherty explained.
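
What such a consent gate might look like in practice is sketched below; all names and types are hypothetical illustrations of the order’s requirements, not any company’s actual system. The idea is that each opt-in records what is shared, with whom (by name) and for what purpose, and every disclosure is checked against those grants.

```typescript
// Hypothetical sketch of an automated consent gate. Names and types
// are illustrative; nothing here reflects any company's implementation.
interface ConsentRecord {
  recipient: string;   // named third party, e.g. "ExampleAdsCo"
  dataTypes: string[]; // exactly what is shared, e.g. ["email", "intake answers"]
  purpose: string;     // specific purpose, e.g. "ad measurement"
  grantedAt: Date;     // when the user opted in, apart from the privacy policy
}

class ConsentLedger {
  private grants = new Map<string, ConsentRecord[]>();

  // Record an affirmative, unambiguous opt-in gathered during onboarding.
  grant(userId: string, record: ConsentRecord): void {
    const existing = this.grants.get(userId) ?? [];
    this.grants.set(userId, [...existing, record]);
  }

  // Sharing is blocked unless a matching grant exists for this recipient,
  // data type and purpose; the check runs on every disclosure, not once.
  mayShare(userId: string, recipient: string, dataType: string, purpose: string): boolean {
    return (this.grants.get(userId) ?? []).some(
      (r) =>
        r.recipient === recipient &&
        r.purpose === purpose &&
        r.dataTypes.includes(dataType)
    );
  }
}
```

Because the check runs at the point of disclosure rather than at signup alone, a missing or revoked grant blocks the data flow automatically, instead of relying on someone to remember the rules.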

Doing so at scale presents a real problem. Companies typically share data with dozens if not hundreds of third parties for a broad set of purposes. Facebook estimated in a leaked internal email that it would take 650 “engineering years” to fix this issue and “have an adequate level of control and explainability over how our systems use data.”

If privacy lawyers are wondering whether affirmative express consent applies to the industry at large, the FTC has apparently settled the matter. As explained in its business blog, the BetterHelp case offers “a key guidance point for other companies: Honor your privacy promises. Tell the truth and get consumers’ affirmative express consent before sharing any health information.”

In both the BetterHelp and GoodRx cases, the FTC went beyond typical “deceptive” claims to also include “unfair” accusations against the firms for not obtaining “affirmative express consent before collecting, using and disclosing consumers’ health information,” as well as for not implementing reasonable privacy measures to protect such information.

Given the settlements, neither of the FTC’s positions has been tested in court. Nevertheless, the “unfair” allegations “indicate that these are standards that the FTC apparently expects all companies to adhere to,” opined the WilmerHale attorneys.

The FTC complaints also take BetterHelp and GoodRx to task for including seals on their websites implying they had been certified as “HIPAA-compliant,” even though the companies had received no such certification. “Have you checked your site recently for graphics that could send similar deceptive messages?” the agency warned.

The WilmerHale team sees in both instances a crackdown on so-called “dark patterns” in the two companies’ user interfaces. For instance, the BetterHelp complaint notes that visitors to the firm’s websites were “urged to begin the Intake Questionnaire and hand over their health information.” By contrast, the company’s privacy policy could only be found “in small, low-contrast writing” at the bottom of the page.

But BetterHelp’s proposed order breaks new ground. For one, the future ban on sharing consumer data is quite broad. It leaves the firm essentially unable to use any personal information for advertising, not just so-called “health information.” 

Nor are there carve-outs for certain kinds of ad-related activities, like contextual ads or ad-effectiveness analytics. In BetterHelp’s case, the commission also spelled out, for the first time, the elements that must be disclosed to an individual to establish affirmative express consent.

And the FTC dismissed BetterHelp’s use of hashing, a process that converts a visitor’s or user’s email address into a fixed-length alphanumeric code. The FTC alleged that BetterHelp “knew that [the] third parties…were able to, and in fact would, effectively undo the hashing and reveal the email addresses of those Visitors and Users.”
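
The reason hashing offers so little protection here is that it is deterministic: anyone who already holds a list of email addresses, as the ad platforms do, can hash each entry and compare. A minimal sketch, assuming SHA-256 with the lowercase-and-trim normalization commonly used for ad-platform email matching:

```typescript
import { createHash } from "node:crypto";

// Hashing is deterministic: the same input always yields the same output.
function hashEmail(email: string): string {
  return createHash("sha256").update(email.trim().toLowerCase()).digest("hex");
}

// A platform that already knows millions of email addresses can "undo" the
// hashing by hashing its own list and looking for matches.
function matchHashedEmail(received: string, knownEmails: string[]): string | undefined {
  return knownEmails.find((email) => hashEmail(email) === received);
}

// The "anonymized" value a site sends...
const leaked = hashEmail("patient@example.com");
// ...is trivially re-identified by anyone holding that address on file.
console.log(matchHashedEmail(leaked, ["other@example.com", "patient@example.com"]));
// -> "patient@example.com"
```

In other words, a hashed email address functions less like encryption and more like a shared lookup key.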

“Companies that transfer personal information to third parties should not attempt to paper over poor data privacy practices solely with technical safeguards like hashing,” WilmerHale’s lawyers urged.

The two enforcement actions imply that the commission is on the lookout for ways to stem the flow of consumer health information into advertising in other settings. That was certainly the gist of the FTC’s August 2022 Advance Notice of Proposed Rulemaking on commercial surveillance and data security, which suggested that the regulator has broader aspirations to regulate how data-driven pharma advertising operates in this country.

Which company is next in the FTC’s sights, and how many websites could be tagged, is anybody’s guess. Meanwhile, in the wake of the two cases, affirmative express consent is likely to remain on the FTC’s enforcement radar for the foreseeable future.

Other firms in the digital health space might consider whether they, too, should be seeking it. They may also want to consider removing their Meta pixel tools and halting data sharing for advertising purposes.

Then again, data is at the core of digital health companies’ value to the employer-sponsored plans and payers into which they sell. Their clients want data back that shows their employees or members are being engaged — and getting better.

“All of that requires data, so there’s no way to just shut down the flow of data to third parties,” said Flaherty. “I don’t think any of these companies can survive if they don’t share data.”