The article highlights the key aspects of applications for provisional admission for testing.

The Federal Institute for Drugs and Medical Devices (BfArM), the German regulatory authority in the sphere of healthcare products, has published a guidance document dedicated to digital health applications (DiGA).

The document provides an overview of the applicable regulatory requirements, as well as additional clarifications and recommendations to be taken into consideration by medical device manufacturers (software developers) and other parties involved in order to ensure compliance. At the same time, the provisions of the guidance are non-binding in their legal nature and are not intended to introduce new rules or impose new obligations.

The authority also reserves the right to make changes to the guidance and recommendations provided therein, should such changes be reasonably necessary to reflect corresponding amendments to the underlying legislation.

Justification for Improving Care and Systematic Data Evaluation

According to the guidance, DiGA manufacturers must demonstrate the positive healthcare effects (positive Versorgungseffekte, pVE) of their product for a specific patient group through a comprehensive data evaluation process.

This involves a thorough literature review and the integration of data collected via the DiGA.

The aim is to provide preliminary insights to support the study planned for the test phase.

The evaluation should focus on intervention effects, including considerations like case numbers, recruitment methods, and measurement tools.

  • Character
    The data evaluation should clearly demonstrate the likelihood of the study successfully addressing the research question.
    It should hint at the clinical relevance of the anticipated study results, particularly if a medical benefit (medizinischer Nutzen, mN) is claimed.
    The evaluation should be submitted as a separate document, offering a sufficient justification for care improvement.
  • Embedding in the Evaluation Concept
    The data evaluation summary is integral to the evaluation concept, linking the literature review, DiGA data, and study endpoints.
  • Relation to the Planned Study
    The evaluation should focus on the same patient population and endpoints as the planned study. Differences must be justified to ensure transferability.
  • Scope
    The extent of the data evaluation depends on various factors, including the clinical picture and care realities.
    The authority additionally emphasizes that robustness and plausibility are essential, especially in cases of high placebo responses.
  • Robustness
    Clinical relevance may be deemed plausible on the basis of clinically relevant effects even in the absence of statistical significance. Results should nonetheless be robust, showing a predefined statistical trend.
  • Dropouts
    It is crucial to transparently report dropouts and their reasons, possibly through a flow chart. High dropout rates must be contextualized with the clinical picture.
  • Language Version
    If the evaluation used a non-German version of DiGA, similarities with the German version must be demonstrated.
  • Data from Abroad
    If data collection occurred outside Germany, the comparability of care situations must be shown.
    As mentioned in previous articles dedicated to the matter, the authority pays special attention to ensuring the proper operation of DiGA within the existing clinical environment.
  • Observation Period and Prescription Period
    Justifications are needed if the observation period in the data evaluation is shorter than the planned prescription period.

The number of patients in the study depends on the clinical picture and the robustness of the results. High dropout rates could cast doubt on the validity of the claimed care improvement, regardless of patient numbers.

Evaluation Concept

In addition to systematic data evaluation, manufacturers must submit an evaluation concept adhering to recognized scientific standards.

This includes a detailed study protocol with a statistical analysis plan (SAP), addressing such elements as randomization, sample size, endpoint definitions, and handling of missing data.

The concept should be developed by a manufacturer-independent scientific institute without conflicts of interest.

The methodology should clearly demonstrate the pVE. Guidelines from various scientific bodies can be referenced in developing the evaluation concept.

Extension of the Testing Period

The testing period of up to twelve months can be extended by another twelve months. Manufacturers must justify the need for an extension, showing why the required evidence couldn’t be produced within the initial period and how it will be achieved with the extension.

Extensions are granted only once and require early application.

The BfArM may reject an extension request if the need for an extension is not properly justified or if the manufacturer fails to credibly show that the required evidence will be produced within the extended period.

Conclusion

In summary, the present guidance issued by BfArM outlines the key aspects of DiGA testing applications, systematic data evaluation, and the criteria for extending the testing period.

It reflects the complexity and stringent requirements set by regulatory bodies in evaluating digital health applications.

How Can RegDesk Help?

RegDesk is a holistic Regulatory Information Management System that provides medical device and pharma companies with regulatory intelligence for over 120 markets worldwide. It can help you prepare and publish global applications, manage standards, run change assessments, and obtain real-time alerts on regulatory changes through a centralized platform. Our clients also have access to our network of over 4000 compliance experts worldwide to obtain verification on critical questions. Global expansion has never been this simple.