Flo Health Lawsuit Explained: How Your Period Tracker Could Be Sharing Data Without Consent

Jul 28, 2025

In today’s digital era, smartphones have become far more than simple communication devices—they are private data vaults, emotional diaries, and intimate health journals. What once was shared only with doctors or close confidants is now entered, often casually, into apps that promise convenience and personalized insights. Among these, health-tracking applications hold a special place in the lives of many users, particularly women who rely on apps to monitor their reproductive health. Yet, as the recent legal firestorm surrounding Flo Health illustrates, this trust is fragile and easily betrayed.

Flo Health is the company behind one of the world’s most popular menstrual tracking apps, widely known as “Flo.” With more than 165 million installations and over 38 million monthly active users, the app has achieved mass adoption across the United States. It offers a suite of tools designed to help women log their cycles, predict ovulation windows, track symptoms, and even monitor pregnancy. For many users, especially young women in urban America juggling work, family planning, and personal health, Flo has become a digital health companion—a pocket-sized algorithmic gynecologist.

But as is often the case with convenience, the hidden costs lie in what users do not see. In 2019, a Wall Street Journal investigation revealed that Flo was transmitting sensitive user data—such as menstrual cycles, sexual activity logs, and reproductive patterns—to third-party companies like Google, Meta (then Facebook), and Flurry, an advertising analytics firm. What made this practice especially alarming was that Flo had explicitly told users their data would remain private. The trust placed in the app had been based on that assurance. In reality, the app’s underlying architecture was riddled with software development kits (SDKs) that quietly channeled user behavior to a constellation of unseen corporate entities.

These SDKs, while a standard feature in app development, have become a privacy Trojan horse. They enable developers to plug in powerful functionalities—ad integration, usage analytics, and social sharing—without reinventing the wheel. But in doing so, they often siphon data off to their original vendors. Flo’s incorporation of SDKs allowed companies like Meta to collect information from the app and potentially use it to build ad targeting models—without explicit consent or even awareness from users. The implications are staggering. For example, a woman tracking her fertility might start seeing ads for baby products on Facebook, not realizing that her private health behavior had been silently monetized.
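To make that mechanism concrete, here is a minimal Kotlin sketch of the pattern described above. Everything in it is invented for illustration: the VendorAnalyticsSdk object, its endpoint URL, and the event name stand in for any bundled third-party SDK, and none of this reproduces the actual code of Flo or any vendor. The point is that a single innocuous-looking logging call ships a user's health entry to a server the app developer does not control.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical stand-in for a bundled third-party analytics SDK.
// The app calls one convenience method; the SDK decides where the data goes.
object VendorAnalyticsSdk {
    // Assumed endpoint; real SDKs ship with their own hard-coded hosts.
    private const val VENDOR_ENDPOINT = "https://events.analytics-vendor.example/collect"

    fun logEvent(name: String, properties: Map<String, String>) {
        // Serialize the event into a simple JSON body.
        val props = properties.entries.joinToString(",") { "\"${it.key}\":\"${it.value}\"" }
        val body = "{\"event\":\"$name\",\"props\":{$props}}"

        // The network call happens inside the SDK, invisible to the user
        // and easy for the app developer to overlook.
        val conn = URL(VENDOR_ENDPOINT).openConnection() as HttpURLConnection
        conn.requestMethod = "POST"
        conn.setRequestProperty("Content-Type", "application/json")
        conn.doOutput = true
        conn.outputStream.use { it.write(body.toByteArray()) }
        conn.responseCode // fire-and-forget; the response is ignored
        conn.disconnect()
    }
}

// App code: to the developer this reads as routine instrumentation, but the
// payload, a user's health entry, leaves the device for the vendor's servers.
fun onCycleEntryLogged(cycleDay: Int, symptom: String) {
    VendorAnalyticsSdk.logEvent(
        name = "cycle_entry_logged",
        properties = mapOf("cycle_day" to cycleDay.toString(), "symptom" to symptom)
    )
}
```

Because the transport lives entirely inside the vendor's code, nothing in the app's own source spells out where the data ends up, which is precisely why such flows can slip past a casual privacy review.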

The response from regulators was swift, though arguably insufficient. In 2020, the Federal Trade Commission (FTC) launched a formal investigation, and its resulting complaint alleged that Flo Health had engaged in “deceptive commercial practices” by promising privacy while actively sharing user data. Flo settled with the FTC in January 2021, agreeing to amend its data practices, submit to third-party audits, and ask recipients of its previously shared data to delete what they had received. However, the FTC did not impose a financial penalty, and perhaps more importantly, it did not mandate restitution for affected users.

That vacuum quickly gave rise to a private class-action lawsuit filed in California under the name Frasco v. Flo Health. This civil suit went far beyond mere allegations of deceptive practice. It claimed that Flo had violated several provisions of California law, most notably the California Confidentiality of Medical Information Act (CMIA) and the California Invasion of Privacy Act (CIPA). The lawsuit represented all U.S.-based users who had interacted with Flo from June 2016 to the present and sought not just policy reform but monetary damages. Unlike the FTC’s administrative process, this legal route demanded tangible accountability—actual compensation for the unlawful use of health data.

At the heart of the legal conflict lies a profound gap in U.S. data protection law. The Health Insurance Portability and Accountability Act (HIPAA), long viewed as the bedrock of medical privacy in America, simply does not apply to apps like Flo. HIPAA governs traditional healthcare providers—doctors, hospitals, insurers—not consumer tech companies offering “wellness” services. This regulatory void means that an app can gather extremely sensitive information about a user’s health, behaviors, and body without falling under any robust federal privacy law.

California, often seen as America’s privacy law innovator, is one of the few jurisdictions to try to fill that gap. The CMIA extends protection to medical information beyond traditional healthcare settings, and CIPA allows plaintiffs to sue if their communications are intercepted without consent. These laws were pivotal in enabling the Frasco case to proceed. In fact, similar legal theories have been successfully tested in recent years, including a high-profile 2022 case (Saleh v. Nike), where a federal judge ruled that a website’s use of session recording software could constitute illegal wiretapping under CIPA.

As of mid-2025, the Frasco lawsuit continues to evolve. In March 2023, Flurry, the now-defunct analytics company, agreed to a $3.5 million settlement, marking one of the first financial consequences in the case. Google filed its own settlement notice in July 2025, opting out of an impending jury trial, though the exact terms of its settlement remain confidential. The most closely watched defendant, Meta Platforms Inc., has yet to settle. The complaint alleges that Facebook tracking code embedded within the Flo app captured user inputs, including health data, and transmitted them to Meta’s servers for commercial use. If the court rules against Meta, the case could set a major precedent regarding whether and how digital surveillance tools fall under existing privacy statutes.
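To see how embedded tracking code could capture inputs wholesale, consider this minimal sketch. Every name in it (EventSink, TrackerStandIn, SymptomForm) is invented; it models the interception pattern the complaint alleges, under the assumption that the tracker is wired into the app's own event flow, and does not depict Meta's actual SDK.

```kotlin
// Invented names throughout; this models the alleged interception
// pattern, not any real SDK's implementation.
fun interface EventSink {
    fun send(event: String, fields: Map<String, String>)
}

// Stand-in for embedded tracking code: once wired in, it sees every
// field the app sees and mirrors it off-device.
class TrackerStandIn : EventSink {
    override fun send(event: String, fields: Map<String, String>) {
        // In the alleged scenario, this is where health inputs would leave
        // the app for a third party's servers, keyed to an ad identifier.
        println("-> mirroring '$event' $fields to tracker endpoint")
    }
}

// The app's form handler fans every submission out to all registered
// sinks, so the tracker receives the same data as the app itself.
class SymptomForm(private val sinks: List<EventSink>) {
    fun submit(fields: Map<String, String>) {
        // ...the app's legitimate processing of the entry happens here...
        sinks.forEach { it.send("symptom_form_submitted", fields) }
    }
}

fun main() {
    val form = SymptomForm(listOf(TrackerStandIn()))
    form.submit(mapOf("cycle_day" to "14", "symptom" to "cramps"))
}
```

The fan-out is the crux: from the user's point of view there is one recipient, the app, while architecturally there are several.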

This uncertainty underscores the urgency for comprehensive federal data privacy reform. A national law—something akin to Europe’s General Data Protection Regulation (GDPR)—would give American users clearer rights and companies stricter boundaries. At present, U.S. consumers face a confusing patchwork of protections that vary not just by state but by data type and business model. A woman in New York using Flo might have very different legal recourse than a woman in California, even if their data was treated identically.

Beyond the legal labyrinth, there is a moral reckoning underway. When a person logs data about their period, fertility, or sexual activity into an app, the expectation isn’t just technical—it’s ethical. These data points are not casual browsing habits or shopping preferences; they are intimate expressions of bodily autonomy. Turning them into fuel for profit—without informed, ongoing, and meaningful consent—crosses a fundamental boundary.

The rise and fall of trust in apps like Flo reveal a broader lesson about the evolving relationship between people and technology. As Americans continue to integrate digital tools into their lives, the need for structural safeguards becomes paramount. Transparency must be more than a checkbox in a privacy policy. Consent must be more than passive agreement. And accountability must go beyond press releases and settlements—true justice demands enforceable rights.
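What active, revocable consent could look like in code is easy to sketch. The following is one hypothetical design, again with invented names and the same EventSink shape as the earlier sketch: nothing is forwarded unless the user has affirmatively opted in, and revoking consent takes effect immediately.

```kotlin
// Invented names; a sketch of consent as an active, revocable gate
// rather than a buried default.
fun interface EventSink {
    fun send(event: String, fields: Map<String, String>)
}

// Holds the user's affirmative opt-ins, one per data-sharing purpose.
class ConsentStore {
    private val grants = mutableSetOf<String>()
    fun grant(purpose: String) { grants += purpose }
    fun revoke(purpose: String) { grants -= purpose }
    fun allows(purpose: String) = purpose in grants
}

// Wraps any third-party sink: events are dropped, not queued, whenever
// the user has not opted in, so revocation is immediate and complete.
class ConsentGatedSink(
    private val inner: EventSink,
    private val consent: ConsentStore,
    private val purpose: String,
) : EventSink {
    override fun send(event: String, fields: Map<String, String>) {
        if (consent.allows(purpose)) {
            inner.send(event, fields)
        } else {
            println("dropped '$event': no active consent for '$purpose'")
        }
    }
}
```

The design choice worth noting is the default: absent an explicit grant, the gate stays closed, which is the inverse of the opt-out posture at issue in the Flo case.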

The Frasco case may not, on its own, settle the future of digital health privacy in America. But it is already shaping how companies, regulators, and courts think about the responsibilities that come with collecting and using sensitive personal data. If the law fails to keep pace with technology, it will be lawsuits like this—and the courage of individual plaintiffs—that draw the lines others refused to mark.