
30 March 2026
Privacy and Data Security in FemTech
FemTech - “Female Technology” - is emerging as one of the most dynamic segments in digital health. One study suggests that over 50 million people worldwide use some form of menstrual cycle app. According to Forbes, FemTech represents a USD 103 billion market, while Galen projects a USD 360 billion opportunity in women’s health. These figures are reflected in rising M&A activity and venture investment interest in the sector. Yet this acceleration is colliding with a complex and fragmented global regulatory environment, posing significant challenges for FemTech innovators.
Our FemTech Now article series aims to give stakeholders a comprehensive understanding of how to bring FemTech products and services to market responsibly, lawfully and successfully. The series examines IP issues, structuring collaborations and partnerships, investment considerations, and the governance challenges posed by AI and machine learning. In this instalment, we take a closer look at privacy and data security.
Protecting data is central to FemTech’s growth. The data collected by such apps is inherently private; few things are more intimate than what can be shared on them. Many FemTech products will also involve vulnerable individuals who require special protection as a category of data subject.
Context
The sensitive data behind FemTech raises important privacy and ethical issues
FemTech refers to technology-based products, services, and software designed specifically to address women’s health needs. It includes mobile apps, wearable devices, telemedicine platforms, and diagnostic tools that focus on areas such as menstrual health, fertility, pregnancy, menopause, sexual wellness, and hormonal care. It also includes platforms that enable remote consultations for contraception, fertility, and general women’s health, bridging gaps for those in underserved areas. These solutions aim to close gaps in traditional healthcare by providing personalised, accessible, and proactive health management for women.
Clearly, the data collected via these products is inherently private. But it’s also broad, and in some cases, commercially rich. Examples include personal identifiers (name, age, location); health and biometrics (menstrual cycle details, ovulation status, hormone levels, pregnancy history, and symptoms such as pain or fatigue); lifestyle information (sexual activity, contraceptive use, diet and exercise habits); and other special category data (such as data that can reveal religious beliefs, sexual orientation, or cultural practices).
This data (often collected continuously over a number of years) is processed to deliver personalised lifestyle and health recommendations and to improve user experience, and is, in some cases, shared with third parties for analytics or advertising. The processing activities associated with such data can involve profiling and/or monitoring to allow the products to operate as effectively as possible. While these practices can enhance functionality, they also raise significant privacy and ethical concerns.
Operating in FemTech means dealing with overlapping legal obligations
FemTech sits at the intersection of data protection law, consumer protection law, cybersecurity law, and medical device regulation. Providers of FemTech products and solutions have to navigate a dense, multi-layered regulatory environment. Below we summarise some of the most onerous relevant requirements in the UK and EU.
| Legal Framework | Application to FemTech | Key Requirements |
|---|---|---|
| EU and UK GDPR | Applies to FemTech providers processing users’ personal data. Most menstrual, fertility, pregnancy and sexual‑health data is “special category data”, which can only be lawfully processed under certain conditions. | Transparency and reasonable expectations of users: users reasonably expect heightened confidentiality for intimate data. Privacy notices must clearly explain what data is collected, why, the lawful bases relied on, retention periods, which categories of third parties receive the data, what transfer safeguards are in place, and what rights users have. Misalignment between stated practices and actual data processing and/or flows carries significant compliance and reputational risk. Risk assessments: processing health data (particularly when combined with information about sexual activity) is inherently high‑risk. FemTech providers must complete a data protection impact assessment (DPIA) before beginning such processing, documenting necessity, proportionality, risks to users, and the measures taken to mitigate those risks. Lawful bases: processing involves two layers: (i) a lawful basis under Article 6 and (ii) a separate Article 9 condition for special category data; typically explicit consent, since commercial FemTech providers usually cannot rely on the conditions relating to medical diagnosis. The standard for consent in the EU/UK is high, and consent should not be bundled with terms or used to gate non‑essential features of a FemTech solution. Children’s data: children’s personal data may be in scope where minors use FemTech services to record menstrual cycles or other symptoms relating specifically to female health issues (eg abdominal cramps, mood swings, or use of contraception). Depending on the jurisdiction and age, parental consent may be required. FemTech products used by minors must apply stricter data minimisation, provide clearer notices and implement stronger security, and should carefully assess any profiling or automated decision‑making. |
| UK Data (Use and Access) Act 2025 (DUAA) | Following Brexit, the UK retains the GDPR in domestic form (the UK GDPR) alongside the Data Protection Act 2018. The DUAA, passed last year, introduces a more permissive framework for automated decision‑making, subject to mandatory safeguards: the prohibition on solely automated, significant decisions is now limited to cases involving special category data (which is highly relevant for FemTech providers). | From a practical perspective, the DUAA will require providers to review their current UK GDPR practices and identify where UK law now deviates from the EU position. |
| EU Data Act | The Data Act introduces rules defining who can access and use data generated by connected products and related services (including FemTech wearables, smart thermometers, hormone sensors, and related apps). | Users of FemTech products and services gain new rights to access and share “product data” and “related service data” generated by their devices. Manufacturers must design devices with “access by design” so users can easily port data to third‑party services. Cloud providers supporting FemTech systems must enable easy switching and interoperability. Conversely, FemTech providers should revisit their contracts with such cloud providers (the Data Act sets out helpful rights that can be used to prevent technical “lock‑in” with one provider), with a view to fostering innovative growth with a potentially new provider. |
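In practical terms, the Data Act’s access and portability rights amount to being able to hand the user (or a third party the user nominates) their product and related service data in a structured, machine‑readable form. The sketch below is a minimal illustration of that idea only, assuming a simple in‑memory store; all field names and data structures are hypothetical, not taken from any real product.

```python
import json
from datetime import datetime, timezone

# Hypothetical in-memory store: device-generated ("product") data and
# app-generated ("related service") data for a single user.
PRODUCT_DATA = {"basal_temp_c": [36.4, 36.5, 36.7], "device_id": "bt-001"}
SERVICE_DATA = {"cycle_length_days": 29, "logged_symptoms": ["fatigue"]}

def export_user_data(user_id: str) -> str:
    """Return a structured, machine-readable export that the user could
    port to a third-party service, in the spirit of "access by design"."""
    payload = {
        "user_id": user_id,
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "product_data": PRODUCT_DATA,
        "related_service_data": SERVICE_DATA,
    }
    return json.dumps(payload, indent=2)
```

A real implementation would of course draw on the provider’s actual data model and an agreed interchange format; the point is that exportability needs to be designed in, not bolted on.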
| EU AI Act | Many FemTech applications, such as ovulation prediction, hormone‑response models, symptom classification, or cycle irregularity detection, may qualify as high‑risk AI systems if they are medical in nature. | Providers of FemTech that use AI must, as a minimum, have procedures in place for risk management, data governance, technical documentation and record‑keeping, transparency towards users, human oversight, and accuracy, robustness and cybersecurity. |
| Medical Device Regulations (EU MDR and UK MHRA Regime) | FemTech apps and devices can be classified as medical devices when their intended purpose is to diagnose, predict, monitor or guide reproductive‑health decisions, such as fertility‑window prediction, PCOS or cycle‑risk flagging, hormone‑level interpretation, pregnancy‑risk scoring, or any AI‑based diagnostic or clinical‑support function. Under the EU Medical Device Regulation, software with these purposes is typically classified as Class IIa or higher (particularly under Rule 11, which governs software used for diagnosis or therapeutic decisions). In the UK, the Medicines and Healthcare products Regulatory Agency (MHRA) applies similar principles: software becomes a medical device when it is intended for diagnosis, monitoring or treatment, and must comply with the UK medical device framework, supported by the MHRA’s dedicated Software and AI as a Medical Device guidance. | Obligations under both regimes include conformity assessment and CE/UKCA marking, clinical evaluation, technical documentation, post‑market surveillance, and vigilance reporting of serious incidents. |
| NIS2 | NIS2 applies to medium and large companies operating in specific sectors listed in Annex I – Essential (sectors of high criticality, which include health) and Annex II – Important (other critical sectors, which include manufacturing, digital services, and research and development institutions with a critical input into the health sector). Whether a FemTech provider is an Essential or an Important entity will depend on the nature of its services: for example, ‘healthcare providers’ are classed as Essential entities, whereas FemTech software applications will generally fall under ‘digital providers’, making them Important entities. | Different obligations apply to Essential and Important entities. In‑scope organisations must implement stronger, risk‑based cybersecurity controls across their platforms and services, including multifactor authentication, encryption, and continuous monitoring and logging. They must also establish robust incident‑reporting processes, ensuring that significant security events are notified to authorities within strict timelines (both Essential and Important entities must follow the same 24‑hour early warning and subsequent 72‑hour incident notification deadlines). |
| EU Cyber Resilience Act (CRA) | The CRA places obligations on organisations involved in the manufacturing, importing and distribution of products with digital elements. A product falls within scope if its intended or reasonably foreseeable use includes a direct or indirect logical or physical data connection to a device or network. FemTech providers that manufacture or distribute products with digital elements on the EU market, such as connected fertility trackers, wearable health monitors, or apps with remote data processing functionality, will likely be caught by the CRA. | In practical terms, FemTech providers acting as manufacturers must ensure their products meet the CRA’s essential cybersecurity requirements throughout the product lifecycle, to ensure safety, security and traceability. This includes conducting pre‑market risk assessments, applying CE marking, implementing vulnerability management processes with a minimum five‑year support period, and reporting actively exploited vulnerabilities to ENISA within strict timeframes. Certain obligations (eg conformity assessments) differ for ‘standard’ versus ‘important’ or ‘critical’ products. Providers acting as distributors must verify that products bear a CE marking and that upstream obligations have been fulfilled before making products available on the EU market. |
Commercial consequences of getting it wrong
Personal data breaches
FemTech companies must adopt state‑of‑the‑art cyber‑resilience measures including robust encryption, continuous monitoring, red‑team testing and mature incident‑response programmes. By way of example, the CRA places explicit security obligations on organisations responsible for FemTech products. These include measures such as: (1) securing the product by design and by default; (2) vulnerability management; (3) technical support requirements; and (4) incident reporting. The GDPR applies broader security obligations as well.
But the risk for businesses is not just a legal one. In a space where the data is deeply personal and the stakes are uniquely high, security isn’t just a technical requirement; it is a critical pillar of user safety and goes to the heart of revenue protection and customer trust.
FemTech providers face an acute cybersecurity risk because the data they hold is both highly intimate and commercially valuable. This makes it an attractive target for threat actors. Fertility logs, cycle histories, sexual‑activity records, hormone data and pregnancy‑related information can be exfiltrated and misused in ways that go far beyond traditional identity theft, exposing users to profound emotional distress or stigma if their most private information is compromised. Impacted individuals may even feel violated or lose trust not only in their specific app, but in digital health technologies more broadly. The consequences of failing to ensure appropriate security therefore extend beyond regulatory scrutiny to reputational damage, litigation exposure and long‑term erosion of user confidence.
Advertising and targeting
Using special category data for behavioural advertising (eg menstrual cycles, fertility windows, sexual‑activity patterns or hormonal data) is generally high risk and may be unlawful without a valid Article 9 condition. Essential product analytics should be separated from advertising, and health data shouldn’t be used for targeting without explicit, informed consent.
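One way to operationalise this separation is to record consent per purpose rather than bundling it, and to refuse any advertising use of health data unless explicit consent exists for that exact purpose. The sketch below illustrates that gating logic only; the consent model, purpose labels and field names are hypothetical assumptions, not a prescribed compliance design.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # Purposes the user has explicitly opted in to, recorded separately
    # rather than inferred from acceptance of general terms.
    purposes: set = field(default_factory=set)

def may_use_for_advertising(consent: ConsentRecord, data_category: str) -> bool:
    """Health data may only feed ad targeting with explicit consent for
    that specific purpose; essential analytics never piggybacks on it."""
    if data_category == "health":
        return "advertising_health" in consent.purposes
    return "advertising" in consent.purposes

# A user who consented only to essential analytics cannot be targeted
# using health data.
consent = ConsentRecord(purposes={"essential_analytics"})
assert not may_use_for_advertising(consent, "health")
```

The design point is that the default answer is “no”: absence of a purpose‑specific consent record blocks the processing, rather than the processing proceeding until someone objects.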
For providers of FemTech, these risks are amplified by the emotional vulnerability and personal significance of this data. Advertising based on cycle information or inferred fertility status can easily cross into exploitative territory: for example, marketing baby‑related products to users believed to be “trying to conceive”, or targeting pregnancy‑related content at someone experiencing irregular cycles, fertility challenges or recent loss. The assumption that being in a “fertile window” equates to wanting, or being able, to have a baby is inaccurate and potentially distressing.
This creates substantial reputational and ethical risk for FemTech providers: an insensitive or misaligned advert can cause emotional harm, erode user trust, or even spark public backlash. Given the history of surveillance and stigma surrounding women’s reproductive choices, users reasonably expect heightened protection from intrusive profiling and commercially driven "nudging". FemTech providers should therefore treat advertising in this space with caution and fairness, ensuring practices respect user dignity and avoid reinforcing harmful or presumptive narratives about women’s health.
Misuse of data – data sharing
FemTech data also carries a significant risk of misuse by downstream recipients, particularly where it reveals or can be used to infer an individual’s reproductive capacity, fertility challenges, or hormone‑related conditions.
Sharing such data (whether lawfully, unlawfully or through opaque “research” or “analytics” partnerships) could lead to exploitation by insurers seeking to assess perceived health risk; by employers analysing absenteeism or productivity patterns; or by healthcare stakeholders making assumptions about a user’s reproductive intentions or medical history.
Parallels can already be seen in how some insurers incentivise policyholders to share health data from wearable devices, offering discounted premiums for meeting activity targets. Unlike step counts, menstrual cycles, ovulation patterns and fertility metrics are deeply personal, highly variable, and often not within the user’s control. This makes any form of behavioural "nudging" especially problematic. If FemTech data were used to justify higher premiums, differential access to treatment, or employer decision‑making, the consequences would go beyond privacy intrusion to discrimination. This underscores the need for strict data‑minimisation, purpose‑limitation, and robust governance around all downstream sharing.
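In code, strict minimisation and purpose limitation before any downstream sharing can be as simple as an allow‑list of fields plus a keyed pseudonym in place of the user identifier. This is a minimal sketch under stated assumptions; the key handling, field names and record structure are illustrative, and a real deployment would keep the key in a secrets manager and govern the allow‑list per recipient.

```python
import hashlib
import hmac

# Hypothetical shared secret; in practice this would live in a KMS and
# be rotated, never hard-coded.
SECRET_KEY = b"rotate-me-and-store-in-a-kms"

def pseudonymise(user_id: str) -> str:
    # Keyed hashing (HMAC-SHA256): the pseudonym cannot be reversed or
    # re-derived by a recipient who does not hold the key.
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def minimise_for_sharing(record: dict, allowed_fields: set) -> dict:
    """Keep only the fields the recipient genuinely needs and replace
    the direct identifier with a keyed pseudonym."""
    shared = {k: v for k, v in record.items() if k in allowed_fields}
    shared["pseudonym"] = pseudonymise(record["user_id"])
    return shared

record = {
    "user_id": "u-42",
    "name": "A. User",
    "cycle_length_days": 29,
    "sexual_activity_log": ["..."],
}
# Only the agreed analytics field leaves the provider; name, user ID and
# sexual-activity data are stripped before sharing.
out = minimise_for_sharing(record, allowed_fields={"cycle_length_days"})
assert "name" not in out and "user_id" not in out
```

Pseudonymised data of this kind generally remains personal data, so this technique reduces exposure on downstream sharing but does not remove the need for a lawful basis and contractual controls.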
There are well‑documented cases in which a retailer inferred a customer’s pregnancy from her purchasing decisions, and the resulting targeted marketing prematurely revealed the pregnancy to her family. This scenario highlights the potential harm of seemingly anodyne profiling of consumer purchasing behaviour.
What next?
FemTech sits at a pivotal moment. The sector’s promise (to close long‑standing gaps in women’s health and deliver more personalised, accessible and empowering care) is immense. But its success hinges on something far more delicate than technical capability or market momentum: trust. As the regulatory landscape evolves and the sensitivity of reproductive‑health data becomes more visible in legal, social and political debates, FemTech providers must recognise that privacy and data governance are essential to sustainable innovation, not just peripheral compliance tasks. Providers of FemTech that invest early in good data governance and user‑centred design will not only reduce significant legal, ethical and commercial risk; they will also differentiate themselves in a crowded market by demonstrating respect for the dignity, autonomy and lived experiences of their users.