On June 17, 2025, the European Commission released a significant update to MDCG 2019-11, the foundational guidance on the Qualification and Classification of Software under the EU Medical Device Regulation (MDR 2017/745) and In Vitro Diagnostic Regulation (IVDR 2017/746). This new version, Rev. 1, provides much-needed clarity on how software, especially AI-driven tools, should be assessed under the evolving regulatory framework.
As software continues to play a crucial role in healthcare, this update brings sharper guidance for manufacturers, software developers, and Notified Bodies. Whether you’re building a diagnostic app, an embedded algorithm in a device, or a machine learning tool for patient triage, this revision may directly impact how your product is classified and regulated in the EU.
Purpose of the Update
The revised guidance addresses key regulatory questions facing today’s MedTech industry:
- When does software qualify as a medical device under MDR or IVDR?
- How should AI/ML tools be treated during regulatory evaluation?
- What classification rules apply to different types of medical software?
In addition to refining existing criteria, MDCG 2019-11 Rev. 1 also introduces new tools and examples to help stakeholders apply these rules consistently as the EU shifts toward a more sophisticated oversight of Medical Device Artificial Intelligence (MDAI).
Key Concepts Clarified
Qualification: Is Your Software a Medical Device?
Qualification refers to determining whether a software product falls within the scope of MDR or IVDR. The central consideration is intended purpose: does the software serve a medical function such as diagnosis, prevention, or monitoring?
The guidance offers new examples to distinguish between:
- Medical software (e.g., a cardiac risk prediction app)
- Non-medical software (e.g., general wellness apps without a clinical claim)
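The qualification test above can be sketched as a simple check against the medical purposes listed in the device definition of MDR Article 2(1). This is an illustrative sketch only; the function name, the keyword set, and the idea of reducing an intended-purpose statement to keywords are our own simplifications, and real qualification requires assessing the full intended-purpose statement, not keyword matching.

```python
# Medical purposes drawn from the device definition in MDR Article 2(1).
MEDICAL_PURPOSES = {
    "diagnosis", "prevention", "monitoring", "prediction",
    "prognosis", "treatment", "alleviation",
}

def qualifies_as_mdsw(claimed_purposes: set[str]) -> bool:
    """Illustrative check: does any claimed purpose match a medical purpose?

    A cardiac risk prediction app claims "prediction" and would qualify;
    a step-counting wellness app with no clinical claim would not.
    """
    return bool(claimed_purposes & MEDICAL_PURPOSES)
```

For example, `qualifies_as_mdsw({"prediction"})` returns `True`, while `qualifies_as_mdsw({"fitness tracking"})` returns `False`, mirroring the two bullet examples above.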
MDAI: Acknowledging the Rise of Medical AI
Rev. 1 introduces the term Medical Device Artificial Intelligence (MDAI) to refer to software that uses AI/ML for medical purposes. These tools are now explicitly included in the qualification framework.
Manufacturers must evaluate:
- Whether the AI’s outputs influence clinical decisions
- How to demonstrate the algorithm’s reliability and transparency
- What performance monitoring is in place post-market
SaMD vs. Embedded Software
The update distinguishes between:
- Software as a Medical Device (SaMD): Standalone applications with a medical function.
- Embedded Software: Software that is integral to the function of a physical medical device.
This distinction affects not only classification, but also post-market obligations and cybersecurity planning.
Notable Changes in the June 2025 Revision
Expanded AI/ML Examples
The updated guidance includes practical use cases of AI in healthcare, from radiological image analysis to decision support. It also highlights the importance of ongoing monitoring of algorithm performance in real-world settings.
Updated Decision Trees
To simplify the evaluation process, the qualification and classification flowcharts have been redesigned for greater usability. These tools help guide regulatory decisions step-by-step and ensure consistency across submissions.
Emphasis on Intended Purpose
The updated guidance reiterates that intended use is the cornerstone of both qualification and classification. The clearer and more specific this statement, the easier it is to justify the regulatory pathway.
Cybersecurity and Real-World Performance
The new guidance encourages manufacturers to better integrate cybersecurity measures and post-market performance tracking, especially for AI-based tools. As digital health continues to evolve, regulators expect continuous risk management, not just static design files.
Practical Guidance for Manufacturers
Here are five actions manufacturers should take based on the updated MDCG 2019-11 Rev. 1:
- Define intended use precisely, especially when AI or predictive models are involved
- Apply MDR Rule 11 for MDSW classification with traceable rationale
- Align classification logic with updated MDCG decision trees
- Ensure algorithm transparency by documenting logic, explainability, and updates
- Prepare for reclassification when significant changes are made to the software
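The Rule 11 step in the list above can be sketched as a small decision helper. This is a simplified plain reading of MDR Annex VIII Rule 11 for illustration; the function and parameter names are our own, and it is no substitute for the MDCG flowcharts or a documented classification rationale.

```python
from enum import Enum

class Impact(Enum):
    """Worst-case impact of a clinical decision informed by the software."""
    DEATH_OR_IRREVERSIBLE = "death or irreversible deterioration"
    SERIOUS_OR_SURGICAL = "serious deterioration or surgical intervention"
    OTHER = "other"

def classify_rule11(informs_diagnosis_or_therapy: bool,
                    impact: Impact = Impact.OTHER,
                    monitors_physiological: bool = False,
                    vital_params_immediate_danger: bool = False) -> str:
    """Return the MDR class ('I', 'IIa', 'IIb', 'III') per a plain reading of Rule 11."""
    if informs_diagnosis_or_therapy:
        # Software providing information used for diagnostic or therapeutic
        # decisions is class IIa, escalated by the decision's potential impact.
        if impact is Impact.DEATH_OR_IRREVERSIBLE:
            return "III"
        if impact is Impact.SERIOUS_OR_SURGICAL:
            return "IIb"
        return "IIa"
    if monitors_physiological:
        # Monitoring physiological processes is IIa, or IIb for vital
        # parameters whose variations could put the patient in immediate danger.
        return "IIb" if vital_params_immediate_danger else "IIa"
    return "I"  # all other software
```

For instance, a triage tool informing therapy decisions whose errors could cause serious deterioration would land in class IIb under this reading: `classify_rule11(True, Impact.SERIOUS_OR_SURGICAL)` returns `"IIb"`. The traceable rationale behind each argument is what Notified Bodies will expect to see.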
Related Documents and Tools
- MDCG 2019-11 Rev. 1: Updated software guidance
- MDCG 2023-5: AI in Medical Devices
- MDCG 2021-24: Classification under IVDR
- EU AI Act (2024): Implications for medical software and high-risk systems
Implications for Stakeholders
| Stakeholder | Key Implications |
| --- | --- |
| Manufacturers | Must reassess software portfolios, especially AI tools, and update documentation and risk justifications. |
| Notified Bodies | Will increase scrutiny of software logic, intended use, and AI validation evidence. |
| Software Developers | Need to integrate MDR/IVDR compliance into development cycles; traceability and post-market plans are essential. |
Conclusion
MDCG 2019-11 Rev. 1 is more than an administrative update: it reflects the EU's evolving understanding of digital health innovation. As AI/ML tools become more advanced and integrated into clinical workflows, regulatory clarity is essential to ensure patient safety, trust, and market success.
For MedTech manufacturers and developers, this is an opportunity to build smarter, safer, and more compliant solutions, and to stay ahead of shifting global expectations.
Author: Taylor Esser