
6 May 2026
Global Perspectives on FemTech – Europe, Australia, and the United States
FemTech is one of the fastest‑growing segments of the digital health ecosystem. In the EU, Australia and the US, innovation in this space operates within a highly regulated legal environment, particularly where products cross the line from general wellness into healthcare.
This article outlines the key EU, Australian and US regulatory considerations for FemTech companies.
The European regulatory landscape for FemTech
The regulatory treatment of FemTech solutions in the EU depends primarily on intended purpose, as defined by the manufacturer, and on how the product functions, how it’s marketed and how it interacts with health data. This distinction has significant consequences not only under medical device regulation, but also under data protection, cybersecurity and consumer protection frameworks.
In the EU, the key regulations are Regulation (EU) 2017/745 on medical devices (MDR), Regulation (EU) 2017/746 on in vitro diagnostic medical devices (IVDR), relevant European Commission/Medical Device Coordination Group (MDCG) guidance, and the increasingly critical privacy and cybersecurity obligations applicable to FemTech products.
Medical devices or wellness technologies? The decisive role of intended purpose
Under the EU regulatory framework, software – including mobile applications, AI‑driven tools and connected devices – may qualify as a medical device if the manufacturer intends it to be used, alone or in combination, for a medical purpose. This includes for the diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of disease.
For FemTech products, this qualification is often not straightforward. Many solutions operate on the boundary between wellness and healthcare, offering features such as menstrual tracking, fertility awareness, sleep monitoring or nutrition guidance. In these cases, the manufacturer’s intended purpose, as expressed in product documentation, instructions for use, marketing materials and app‑store descriptions, becomes critical.
By contrast, wellness apps and wearables, aimed exclusively at promoting a healthy lifestyle or general wellbeing (eg cycle tracking for general awareness, mindfulness tools, sleep or nutrition logging without health claims), typically fall outside the scope of the MDR and IVDR, provided they don’t claim a medical purpose. And FemTech products intended to diagnose, predict or treat medical conditions – for example, software used to support infertility treatment decisions, predict pregnancy complications, detect hormonal disorders, or analyse health data for clinical decision‑making – are likely to be classified as medical devices and subject to full regulatory oversight.
EU guidance consistently emphasises that functionalities such as personalised health risk assessment, symptom analysis, predictive modelling, or decision support for healthcare professionals or patients substantially increase the likelihood that software will be regarded as a regulated medical device.
Classification under the MDR and IVDR: Why it matters
If a FemTech solution qualifies as a medical device, correct risk classification under the MDR is essential. Software medical devices are commonly classified under Rule 11 of Annex VIII MDR, which may result in Class IIa, IIb or even Class III classification depending on the clinical impact of decisions supported by the software.
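The Rule 11 logic for decision-support software can be sketched as a simple decision function. This is an illustrative simplification only, not legal advice or the full Annex VIII text; the function and parameter names are our own, and real classification also depends on the other rules and the manufacturer's documented intended purpose.

```python
from enum import Enum

class MdrClass(Enum):
    I = "Class I"
    IIA = "Class IIa"
    IIB = "Class IIb"
    III = "Class III"

def rule_11_class(informs_decisions: bool,
                  monitors_physiological: bool = False,
                  monitors_vital_params: bool = False,
                  worst_impact: str = "minor") -> MdrClass:
    """Simplified sketch of MDR Annex VIII Rule 11.

    worst_impact is the worst plausible consequence of the decisions the
    software informs: 'minor', 'serious' (serious deterioration of health
    or surgical intervention) or 'critical' (death or irreversible
    deterioration of health).
    """
    if informs_decisions:
        if worst_impact == "critical":
            return MdrClass.III
        if worst_impact == "serious":
            return MdrClass.IIB
        return MdrClass.IIA  # default for decision-support software
    if monitors_physiological:
        # Monitoring vital physiological parameters raises the class
        return MdrClass.IIB if monitors_vital_params else MdrClass.IIA
    return MdrClass.I  # all other software

# Example: software predicting pregnancy complications used clinically
print(rule_11_class(informs_decisions=True, worst_impact="serious").value)  # Class IIb
```

The key takeaway the sketch illustrates: Class IIa is the floor, not the ceiling, for decision-support software, and the clinical impact of a wrong output drives the class upwards.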
Similarly, FemTech products relying on the analysis of biological samples (for example, hormone testing kits or companion apps linked to at‑home diagnostics) may fall under the IVDR, triggering an even more stringent conformity assessment regime.
Incorrect classification or failure to comply with applicable conformity assessment procedures can have serious consequences, including prohibition of market access in the EU, regulatory enforcement action by national competent authorities, product recalls and reputational damage, or even civil liability exposure.
Manufacturers have to engage with MDR/IVDR classification and conformity requirements early in product development, rather than treating regulatory compliance as a post‑launch issue.
The role of EU guidance on software and medical device qualification
The European regulatory framework is supplemented by extensive guidance issued by the European Commission and the MDCG, including guidance on qualification and classification of software as a medical device; standalone software and apps; and borderline products between medical devices and lifestyle/wellbeing tools.
These guidance documents consistently underline that the absence of physical contact with the body doesn’t exclude software from medical device status, and manufacturers can’t avoid regulation simply by labelling a product as “informational” if its functionality effectively influences medical decisions.
For FemTech companies, alignment between technical functionality, clinical claims and regulatory positioning is essential to withstand scrutiny by notified bodies and competent authorities.
Privacy and data protection: Health data at the core of FemTech compliance
Regardless of whether a FemTech product qualifies as a medical device, almost all FemTech solutions process highly sensitive personal data, including information relating to reproductive health, menstrual cycles, fertility, pregnancy status or hormonal conditions.
Under the General Data Protection Regulation (GDPR), this data typically qualifies as special category data, subject to enhanced protection and strict conditions for lawful processing.
Key compliance considerations include:
- Identifying a valid legal basis and Article 9 GDPR condition for processing health data (often explicit consent, but not always).
- Ensuring transparency regarding purposes of processing, data sharing and retention.
- Limiting secondary uses of data, including analytics, advertising or product development.
- Addressing risks associated with inferred or derived health data, not only data actively provided by users.
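The data-minimisation and purpose-limitation points above can be made concrete in code. The following is a minimal sketch, assuming a hypothetical record layout (all field names are illustrative, not from any real product): special-category fields are stripped and the direct identifier replaced with a salted pseudonym before any record leaves the core service for secondary uses such as analytics.

```python
import hashlib

# Hypothetical field names, for illustration only.
SPECIAL_CATEGORY_FIELDS = {"cycle_length_days", "symptoms", "pregnancy_status"}
PSEUDONYM_SALT = b"placeholder-salt"  # in practice, manage as a rotated secret

def minimise_for_analytics(record: dict) -> dict:
    """Strip special-category fields and replace the direct identifier
    with a salted pseudonym before any secondary use of the record."""
    minimised = {k: v for k, v in record.items()
                 if k not in SPECIAL_CATEGORY_FIELDS}
    user_id = minimised.pop("user_id")
    minimised["user_ref"] = hashlib.sha256(
        PSEUDONYM_SALT + user_id.encode()).hexdigest()[:16]
    return minimised

shared = minimise_for_analytics({
    "user_id": "u-123",
    "app_version": "2.4.1",
    "cycle_length_days": 29,
    "symptoms": ["cramps"],
})
print(shared)  # only app_version and the pseudonym survive
```

Note that pseudonymised data generally remains personal data under the GDPR; the sketch reduces, but does not eliminate, the compliance footprint of secondary uses.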
FemTech companies also have to be mindful that GDPR obligations apply extraterritorially, meaning non‑EU providers might still fall within scope where their products target users in the EU.
Recent enforcement trends in Europe show growing regulatory sensitivity towards reproductive health data, particularly where data is monetised or shared with third parties in ways that users might not reasonably expect.
Cybersecurity and digital resilience obligations
Cybersecurity has become a central regulatory concern for digital health technologies. For FemTech products, cyber risk is particularly acute because of the sensitivity of the data involved and the potential physical and psychological harm resulting from data breaches or system manipulation.
At EU level, FemTech companies could be affected by multiple cybersecurity‑related regimes, including:
- The MDR’s general safety and performance requirements, which explicitly include protection against unauthorised access and data breaches for medical devices.
- The NIS2 Directive, for certain digital health service providers.
- Emerging horizontal frameworks such as the Cyber Resilience Act, which will introduce mandatory cybersecurity requirements for connected products.
Manufacturers of FemTech medical devices have to integrate security by design and by default, including vulnerability management, secure software updates and incident response mechanisms.
Consumer protection and transparency risks
FemTech products – whether regulated medical devices or not – are subject to EU consumer protection and unfair commercial practices law. This is especially relevant where apps or AI‑driven tools provide outputs that users might rely on for personal health decisions.
Claims relating to accuracy, effectiveness or health benefits have to be substantiated, and algorithmic outputs should be properly framed to avoid misleading impressions of medical certainty where none exists.
Why regulatory strategy is central to FemTech success in the EU
For FemTech innovators, the EU offers a large and attractive market, but one that demands careful regulatory positioning and strong governance. Correctly distinguishing between medical devices and wellness technologies, aligning functionality with intended purpose, and embedding privacy and cybersecurity compliance into product design are no longer optional.
Early and strategic engagement with EU regulatory requirements – across medical device law, data protection and cybersecurity – is essential not only for legal compliance, but also for building user trust in technologies operating at the most intimate intersections of health and data.
Across these jurisdictions, FemTech products frequently sit at the intersection of general wellness and regulated healthcare, while involving the collection and use of highly sensitive health data. Although legal frameworks vary in structure and emphasis, recurring themes include the central importance of intended purpose, increasing regulatory scrutiny of AI‑enabled health functionalities, and the expectation that privacy, cybersecurity and consumer protection considerations are addressed at the design stage.
Against this backdrop, the Australian perspective illustrates how these issues are addressed within a distinct legal and regulatory environment.
The regulatory landscape for FemTech in Australia
In Australia, the regulatory treatment of FemTech products depends on how the technology is framed and marketed, how it functions and the type of data it collects.
The Therapeutic Goods Administration (TGA) regulates software that meets the legal definition of a medical device. Medical devices can include any product or software (including AI-enabled software) that the supplier intends to be used for a therapeutic purpose in or on human beings. Where that threshold is met, the software must generally be included on the Australian Register of Therapeutic Goods (ARTG) before it can be lawfully supplied.
Correct classification is important as it determines whether a FemTech product is captured by Australia’s medical device regulatory framework and, if so, whether it has to comply with the obligations associated with ARTG inclusion. In assessing software, the TGA looks beyond form to intended purpose and likely clinical impact. This distinction is particularly relevant for FemTech, where products often sit at the intersection of general wellness and medical care.
TGA guidance emphasises that regulation turns on intended purpose, and that health/lifestyle apps are generally not regulated unless they meet the definition of a medical device. Depending on functionality and positioning, certain apps may be characterised as providing a health service. But software that goes beyond passive tracking and begins to analyse symptoms, generate predictive health outcomes, or influence clinical decision-making is more likely to be classified as a regulated medical device. For example, an app that predicts ovulation for the purpose of fertility treatment as opposed to general awareness may trigger regulatory oversight.
Getting the classification wrong can have serious consequences, including where software that meets the medical device definition is supplied without first being included in the ARTG. Beyond classification issues, FemTech products that mishandle sensitive health information face heightened exposure under Australia’s strengthened privacy framework, including expanded regulatory enforcement powers and private claim pathways introduced through recent reforms.
Privacy and data protection are central to FemTech compliance
Alongside therapeutic goods regulation, FemTech products in Australia are subject to a stringent privacy regime. Depending on their functionality and positioning, some apps may be characterised as collecting health information under the Privacy Act 1988 (Cth) (Privacy Act).
For example, information collected from ovulation and fertility apps relating to an individual’s menstrual cycle, reproductive health, or physical symptoms will commonly constitute health information. Health information is treated as sensitive information for the purpose of the Australian Privacy Principles (APPs) under the Privacy Act, attracting obligations around collection, use, disclosure, storage, and security, and additional health record laws apply in some states and territories.
For companies operating in Australia, the APPs impose a strong expectation that personal information, particularly sensitive information, is collected directly from the individual, used only for clearly disclosed purposes, and not enriched or supplemented through third-party data sources without appropriate justification. Health information can only be collected with the consent of the individual involved, but that consent may be implied (such as by downloading an app and using a service).
Privacy compliance isn’t limited to health and other information actively entered by the user and can extend to data observed and recorded through app usage, including behavioural data capable of identifying a user across contexts. Where FemTech products analyse patterns of user behaviour to infer or theorise health status, such as through content interaction, that inferred data may warrant treatment equivalent to disclosed health information, particularly where it’s used to personalise outputs or recommendations.
Compliance with Australian privacy law may arise even where a FemTech provider is based offshore. Foreign entities can be subject to the Privacy Act where they carry on business in Australia, including where personal information is collected or processed offshore. As a result, FemTech companies supplying products into the Australian market need to consider privacy risk not only by reference to corporate location, but by how their products are accessed, promoted and used by individuals in Australia.
The growing role of AI in FemTech
The National AI Plan sets an overall economic direction and commits to keeping Australians safe through legislative and regulatory frameworks that mitigate AI harms, while emphasising that AI risk management will continue to build on Australia’s existing legal and regulatory settings.
Outside that setting, safeguards for community‑facing AI tools that provide health information or advice (but don’t fall within the scope of regulated medical devices) are still taking shape through policy work and regulator guidance, rather than through a single, FemTech-specific AI framework.
Intersection between FemTech and the Australian Consumer Law
For FemTech companies operating in Australia, these risks intersect with existing consumer protection obligations under Australian Consumer Law (ACL). This matters because the ACL can be engaged by the overall impression created for consumers in trade or commerce, including through in-app statements and outputs, not just formal marketing.
Products that incorporate AI-driven analysis of sensitive health information have to account not only for data protection compliance, but also for how algorithmic outputs are framed and qualified, and how users are likely to rely on them. Where AI-generated health recommendations are presented with a degree of certainty or authority that's unsupported by clinical evidence, exposure under the misleading or deceptive conduct provisions is an evolving compliance risk for the FemTech sector.
As Australian AI governance frameworks continue to develop, companies that embed algorithm transparency, bias testing, and appropriate clinical validation into their products will be better positioned to meet regulatory expectations as they arise.
Beyond market entry
Following market entry, sponsors of FemTech medical devices have ongoing responsibilities to ensure that the device continues to meet Australia’s safety and performance requirements. In practice, this includes maintaining the ARTG entry (including paying annual charges), operating effective post‑market surveillance and complaint handling processes, and reporting adverse events occurring in Australia to the TGA.
Sponsors must also be prepared to work with manufacturers and the TGA on corrective and market actions (including recalls, product corrections and safety communications), and to provide evidence that the device is safe and performs as intended if the TGA undertakes post-market review or requests supporting documentation.
Regulatory foundations for FemTech in Australia
For FemTech companies looking to enter the Australian market, it can be beneficial to confirm intended purpose and product positioning at an early stage as this framing is relevant to whether the TGA might regard software as a medical device and whether ARTG inclusion is required.
Privacy and data governance should be treated as core product considerations, particularly because compliance obligations can extend beyond user-entered information to data that’s observed or inferred through app use.
Where products incorporate AI, proportionate safeguards and careful framing of outputs can help manage risk, noting that AI-generated health insights may raise consumer law issues if presented with unwarranted certainty.
Where a business acts as the sponsor of a FemTech medical device, it’s important to factor in the operational and compliance requirements that apply following launch.
The regulatory landscape for FemTech in the US
Unlike the EU and Australia, the US doesn’t have a single, comprehensive federal privacy law governing the collection and use of personal health data outside the healthcare provider and insurer context. Instead, FemTech companies operating in the US have to navigate a patchwork of state-level privacy statutes and an increasingly active enforcement environment. The result is a regulatory landscape that’s fragmented but evolving rapidly, with particular attention now being paid to sensitive health and reproductive data.
Sensitive data under state comprehensive privacy laws
A growing number of US states have enacted comprehensive consumer privacy laws that impose heightened obligations on the processing of sensitive data, including health data.
The California Consumer Privacy Act, as amended by the California Privacy Rights Act (collectively, CCPA/CPRA), treats health data as sensitive personal information and requires businesses to limit its use, provide opt-out rights and implement appropriate safeguards. Similarly, the comprehensive privacy laws now in force in several other states classify health data – and in many cases reproductive or sexual health data specifically – as sensitive, requiring consent before processing. The sale of sensitive data triggers additional requirements which vary by state; most states require providing consumers with a right to opt out of the sale of their sensitive data, and Maryland outright bans such sales.
For FemTech companies, this means that data relating to menstrual cycles, fertility status, pregnancy, hormonal conditions and reproductive health decisions will frequently trigger the most protective tier of obligations under these state laws. The practical burden is compounded by the fact that each state statute differs in scope, defined terms, consent mechanisms and enforcement provisions, requiring careful, jurisdiction-by-jurisdiction compliance analysis.
Consumer health data laws
Beyond state comprehensive privacy laws, a distinct category of consumer health data legislation has emerged that’s directly relevant to FemTech. The Washington My Health My Data Act (MHMDA), which took effect in 2024, is the most significant example.
The MHMDA significantly expands privacy protections for consumer health data and includes a broad private right of action. It applies to consumer health information that isn't covered by laws such as the Health Insurance Portability and Accountability Act (HIPAA), reaches data extending far beyond traditional “medical” information, and imposes considerably more onerous requirements.
The MHMDA defines consumer health data expansively to include information that identifies or is reasonably linkable to a consumer and that identifies the consumer’s past, present or future physical or mental health status. This expressly encompasses reproductive and sexual health information, gender-affirming care data, biometric data and information derived or inferred from non-health data.
Regulated entities under the MHMDA have to maintain a “consumer health data privacy policy” that’s prominently linked from the entity’s website homepage and which contains certain required elements.
Regulated entities are prohibited from collecting or sharing any consumer health data without the consumer's consent, unless the collection or sharing is necessary to provide a product or service the consumer has requested. To obtain valid consent, the consumer has to be informed of:
- the categories of consumer health data being collected or shared;
- the purpose of the collection or sharing of the consumer health data;
- the ways in which consumer health data will be used;
- the categories of entities with whom the consumer health data is shared; and
- how the consumer can withdraw consent.
Further, consent to share consumer health data must be separate and distinct from the consent to collect. A more prescriptive form of consent, a “valid authorization,” is required as a precondition for the sale of consumer health data.
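One way to operationalise the separate-consent requirement is to model each consent as its own record carrying the disclosure elements listed above. This is a hypothetical sketch, not an official schema; the class and field names are our own and simply track the MHMDA disclosure list described in this article.

```python
from dataclasses import dataclass

@dataclass
class HealthDataConsent:
    """One consent record, capturing the disclosures the MHMDA requires."""
    purpose: str                      # purpose of the collection or sharing
    data_categories: list             # categories of consumer health data
    uses: list                        # ways the data will be used
    recipient_categories: list        # categories of entities shared with
    withdrawal_method: str            # how the consumer can withdraw consent
    granted: bool = False

@dataclass
class ConsumerConsents:
    """Collection and sharing consents are separate and distinct;
    a sale additionally requires a 'valid authorization'."""
    collect: HealthDataConsent
    share: HealthDataConsent

    def may_share(self) -> bool:
        # Sharing requires both consents; one cannot imply the other.
        return self.collect.granted and self.share.granted

consents = ConsumerConsents(
    collect=HealthDataConsent("cycle prediction", ["menstrual data"],
                              ["predict ovulation"], ["none"],
                              "in-app settings", granted=True),
    share=HealthDataConsent("analytics", ["menstrual data"],
                            ["trend analysis"], ["analytics vendors"],
                            "in-app settings", granted=False),
)
print(consents.may_share())  # False: collection consent alone is not enough
```

The design point the sketch makes is structural: bundling collection and sharing into a single checkbox cannot satisfy a regime that requires the two consents to be separate and distinct.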
Nevada has enacted a similar consumer health data privacy law, and other states are considering comparable legislation. For FemTech companies, these statutes represent a significant compliance obligation because they capture health-related data processing that falls outside the scope of HIPAA and may apply regardless of whether the company has a physical presence in the enacting state.
Cybersecurity and breach notification for health and sensitive data
Cyber risk is acute in FemTech given the sensitivity of reproductive and health data. Covered entities and business associates must meet the HIPAA Security Rule’s safeguards. For many consumer health apps outside HIPAA, the Federal Trade Commission’s (FTC’s) Health Breach Notification Rule (discussed below) can apply to breaches of unsecured, individually identifiable health information. Independently, Section 5 of the FTC Act has been used to police unreasonable data security practices and deceptive privacy or security representations. State privacy statutes and general data breach laws add obligations around reasonable security, incident response, and notifications to individuals and state authorities.
Practical controls expected by regulators include security‑by‑design, access controls and authentication, data minimisation, encryption in transit and at rest appropriate to risk, vulnerability and patch management, secure development practices for mobile and cloud, third‑party risk management, and data segregation for sensitive features (eg reproductive health modules).
Regulatory focus on health and reproductive data
US regulators have shown increasing interest in the collection, use and sharing of health and reproductive data, particularly following the US Supreme Court’s decision in Dobbs v Jackson Women’s Health Organization in 2022. The FTC has brought several high-profile enforcement actions against health and wellness apps for deceptive or unfair practices involving consumer health data.
The FTC has used its enforcement authority under the Health Breach Notification Rule (HBNR) in some such cases. The HBNR requires that a “vendor of personal health records” and certain related entities notify affected individuals following discovery of a breach of security of the individuals’ unsecured “PHR identifiable health information.” The HBNR also requires notice to the FTC and the media (in some cases) if a breach occurs. The FTC has made clear that a breach of security isn’t limited to cybersecurity intrusions and includes incidents of unauthorised access, including sharing covered information with third parties without an individual’s authorisation (such as by transmitting data through third-party tracking cookies or pixels without affirmative express consent).
The FTC has signalled that it views the mishandling of reproductive and health data as a priority enforcement area, issuing public guidance warning companies against over-collection of sensitive health information and deceptive privacy practices. State attorneys general have also increased enforcement activity in this space, reflecting a broader regulatory consensus that reproductive and sexual health data warrants heightened protection. For FemTech companies, this regulatory environment underscores the importance of ensuring that privacy representations are accurate, that data-sharing practices with third parties are transparent and limited, and that consent mechanisms are meaningful rather than perfunctory.
The use and risks of AI in FemTech
Many FemTech products increasingly rely on AI and machine learning to deliver core functionality, from predictive fertility analytics and symptom assessment to personalised health recommendations. In the US, the use of AI in consumer health products raises distinct regulatory risks. The FTC has stated that it will use its existing authority under Section 5 of the FTC Act to challenge unfair or deceptive AI-driven practices, including the use of biased algorithms, opaque decision-making processes and unsubstantiated health-related claims generated by AI.
Navigating the US landscape
The US regulatory environment for FemTech is characterised by a decentralised but increasingly aggressive enforcement posture, particularly as related to the collection, processing, and sharing of sensitive or health-related data. In the absence of a single federal privacy framework, FemTech companies have to contend with a growing web of state privacy laws, consumer health data statutes, FTC enforcement priorities and emerging AI governance requirements.
In light of the heightened regulatory sensitivity to reproductive and health data, building a compliance strategy cannot be deferred. Companies entering or operating in the US market should treat privacy and consumer protection as integrated, foundational elements of product design and commercial strategy.
This article is part of the FemTech Now series.