A first-of-its-kind enforcement action by the Federal Trade Commission threatens to become a repeat occurrence for the digital health and advertising sectors.
This month the agency said it reached an agreement with digital health platform GoodRx on a $1.5 million fine and remedies, after accusing the telehealth and drug coupon provider of sharing patient data with third parties like Google and Facebook for advertising purposes. A federal court in the Northern District of California must still approve the agreement.
Experts say the matter is not only noteworthy from a legal point of view. Although the case was brought mainly under the FTC Act, the agency’s penalty also invokes its previously unenforced Health Breach Notification Rule (HBNR). And it’s part of a trend toward tightening the reins on the common practice of selling online patient data for marketing.
GoodRx, which offers virtual doctor visits and lets users get discounts on prescription drugs, agreed to pay the fine without admitting any fault. But there are numerous health and wellness sites, apps, wearables and online businesses that, as a matter of course, give their data over to Meta or Google for targeted advertising, and the fresh threat of a monetary penalty could curtail that business model.
“I hope people are looking at this and saying, ‘We’ve got a good, strong business here. Is having that data that leaks out the back to make a little extra money on this, is that smart and worth doing for us?’” said Eric Perakslis, chief science and digital officer at Duke University’s Clinical Research Institute.
According to the FTC’s complaint, GoodRx deceived its customers by telling them it complied with the HIPAA health privacy law, which in fact doesn’t apply to it, and by promising never to share personal health information with advertisers, even though it did. The commission also faulted the company for failing to have adequate internal guardrails around its patient data and how much gets shared with third parties.
In a statement, GoodRx said it doesn’t think the requirements detailed in the settlement will have a material impact on its business. The company’s press team referred MM+M’s interview request to a blog post, which added, “We believe this is a novel application of the [HBNR] by the FTC. We used Facebook tracking pixels to advertise in a way that we feel was compliant with regulations and that remains common practice for many websites.”
Pending the district court’s approval of the settlement, GoodRx would be barred from sharing users’ health data with third parties for advertising. Under the FTC Act, the agency has the authority to levy the $1.5 million penalty.
Its order against GoodRx comes on the heels of two other health data-related enforcement actions. One was the FTC’s lawsuit against data broker Kochava last summer over the alleged sale of data that could reveal whether a person had visited an abortion clinic. The other was its 2020 action against period-tracking app Flo, for allegedly sharing data with Facebook and Google in violation of its own privacy rules.
While those cases stemmed from the FTC’s long-standing authority to police unfair and deceptive commercial behavior, in the GoodRx matter the agency took a fresh tack by invoking the 2009 breach rule. It’s unprecedented on multiple levels.
“The agency filed suit and came to a stipulated settlement based in part on the [HBNR] that had been promulgated in 2009, but has seldom been enforced publicly until now,” said Beth Roxland, J.D., M.Bioethics, a strategic advisor on law, policy and ethics at Roxland Consultants.
Moreover, “part of what the FTC was going after was the sharing of identifiable, sensitive health data by a non-HIPAA-covered entity,” Roxland pointed out, “an increasingly frequent practice with the rise of direct-to-consumer health and wellness technologies and apps – contrary to what consumers generally know and understand. We’re conditioned to think of health-related platforms, and certainly things we encounter interfacing with clinicians and pharmacies, as adhering to the strict privacy protections like HIPAA-covered entities.”
While enforcement of HIPAA falls under the auspices of the Department of Health and Human Services, Roxland noted, this lawsuit, and the HBNR itself, make clear that the FTC is using its authority under the FTC Act and the American Recovery and Reinvestment Act of 2009 to pursue non-HIPAA-covered entities that possess and potentially share highly sensitive health and personal information in ways that consumers may not be aware of or have consented to.
“While the dollar figure of the settlement could certainly be higher, the stipulation for a permanent injunction also contains a host of prohibitions and unique requirements on GoodRx that are very noteworthy both from a legal and corporate policy standpoint,” Roxland added. “This might be the first of many, many actions if the FTC does make good on its pronouncements.”
Last year the agency released a proposal to regulate data-driven surveillance marketing, and its draft rules drew about 11,000 comments, more than 100 of them opining on health or healthcare marketing. An unnamed official said during a media briefing that the FTC is reviewing the comments and intends to be “aggressive” in enforcing this area.
The HBNR requires vendors of personal health records and related entities to notify consumers following a breach involving unsecured information. But until now, the FTC hadn’t used it to bring an action against a company. In 2021, though, the FTC said it had widened the lens through which it interprets the rule’s scope to include mobile apps and other connected devices, like wearables.
The information GoodRx shared included its users’ prescription medications and personal health conditions, personal contact information, and unique advertising and persistent identifiers, per the complaint. As FTC officials subsequently explained, any individually identifiable information that can be derived from an individual’s activities to reveal their health conditions can be considered sensitive health information – browsing data and app usage included.
Proliferating health and wellness apps, as well as third parties like Google and Facebook, are all implicated, the FTC added during the media briefing. Under such an enforcement scheme, even entities not covered by HIPAA still have a duty under the breach rule to notify consumers and obtain consent for such disclosures.
The agency essentially wants the public to be able to make informed choices. While a person may well decide that the cost savings are worth a measure of information sharing, opaque policies and a lack of disclosure prevent consumers from making that choice, Roxland argued.
“I am confident that a sizable percentage of consumers would have foregone the benefits of using GoodRx’s coupons and other services had they known about the company’s sieve-like data practices,” FTC commissioner Christine Wilson wrote in a concurring opinion, adding that the civil penalty didn’t go far enough.
The FTC fine also follows studies showing how sensitive patient data moves from digital health platforms to Facebook, such as a June 2022 investigation by The Markup and STAT which revealed that hospitals were leaking this kind of data to the web giant. A lawsuit was recently brought against Cedars-Sinai Medical Center alleging impermissible data disclosures to Google and Meta, one of dozens of suits filed against healthcare organizations in the investigation’s wake.
And at the end of December, HHS issued updated guidance stating that HIPAA-covered entities using online tracking technologies first need a business associate agreement (BAA) in place before disclosing information to those vendors, including for marketing purposes.
If a covered entity shares data without a BAA in place, that’s now technically considered a breach under HIPAA’s breach notification rule and needs to be reported. Although there are still loopholes in BAAs, whereby de-identified health data coming from a second or third party wouldn’t necessarily be prohibited, HHS in effect firmed up some loose points in its guidance.
“In my mind, I’m starting to see momentum to try and reel some of this in,” said Perakslis.
He views the progression of enforcement as the beginning of a public backlash against data brokers, which have been enabled by ad tech. In the last election cycle, 35 states introduced privacy laws, Perakslis pointed out. Only five passed, but half of them included explicit language about data brokers operating in their states, including some which required brokers to join registries.
“People understand there’s a lot of harm that comes from lifting a lot of data and feeding algorithms,” he said. “Algorithms have no conscience.”
In his State of the Union address last week, President Biden called for holding Big Tech more accountable for the effects of social media on children and adults. Notably, he said the onus is on the tech platforms to prove they’re not causing harm, a twist on the caveat emptor principle, under which it’s typically up to the consumer to prove they were harmed by a product.
And evidence is mounting about algorithmic harm to children. Consider the recent suicide of 14-year-old U.K. resident Molly Russell. Her case marks the first time social media has been directly implicated as a cause of death.
According to the coroner’s report, the algorithms used by Instagram and Pinterest selected content for Russell that in turn contributed to the death of the depressed teen “in a more than minimal way.” Her case follows a longer public discussion about the effects of social media on children.
Perakslis advised CIOs to become more in tune with what’s in their tech stacks. “As someone who’s been a CIO at Johnson & Johnson and FDA, I guarantee you that most organizations do not know all of the code that’s in their stack. They have no idea what’s really running the ghosts in their machine…so accidents can happen.”
The trend also points to using compliance as a business opportunity. “I advise and am on the board of multiple small tech startups,” said Perakslis, “and some of them start their meetings by saying, ‘We will never sell secondary data.’ It’s one of the ways they’re differentiating themselves.”
In a 30-person company, where the engineers actually know what’s in the tech stack, that’s doable, he said. Fortune 500 companies, not so much.
There is more than enough in the lawsuit to suggest that the FTC intends to make good on its pledge to be the “new sheriff in town” and use the breach notification rule against non-HIPAA-covered entities that nevertheless possess very sensitive health data and then use, sell, share or monetize it.
Among its novel provisions, the stipulation defines requirements for obtaining a person’s “affirmative express consent,” delineates the categories of personal and health information to which the settlement’s compliance mandates apply, and sets forth requirements for notifying consumers in the event of a breach.
Although many non-HIPAA entities in the digital health and wellness space have very different business practices and procedures, Roxland suggested that they review their internal privacy policies, disclosures and consent procedures, as well as business operations, in light of the stipulation’s comprehensive definitions, detailed policy mandates and compliance schemes.
It’s hard to know whether the FTC’s more aggressive stance will have a chilling effect, or what form that could take as websites revamp their policies or terms and conditions, specifically around consent and upfront notification of data sharing. If the FTC brings additional suits of this nature, it might. But whether that will have a positive impact on consumers, most of whom aren’t conditioned to look for such legalese, is another story.
“My bigger hope,” Roxland said, “is that coverage of this incident by journalists and popular media – which is actually mandated under specific circumstances both in the HBNR and in the settlement – will put consumers on notice that they should proactively consider what information they provide to consumer-facing health technologies that may be sharing or using that data for purposes beyond the marketed value to the consumer.”