Monitoring progress of digital health use and uptake in Europe

Approaches, challenges and recommendations

Date of creation or latest update: 3 September 2024
Lead authors: Maike Hentges, Lucas Deimel, Anja Hirche, empirica Communication and Technology Research

In recent years, there has been considerable progress in assessing the state-of-play in digital health use and uptake for national and cross-border scenarios. However, developing and capturing standardised metrics across diverse country settings remains challenging. Adaptable and iterative monitoring frameworks, early stakeholder engagement, and robust data validation are key contributors to successful national or cross-country monitoring efforts. Institutionalising monitoring procedures and ensuring inclusivity towards disadvantaged populations are crucial to facilitate equity in digital health accessibility and uptake. Sustained political commitment, investment, and (international) collaboration for digital health monitoring are necessary for long-term success, particularly in light of the European Health Data Space (EHDS).

Monitoring the use and uptake of digital health solutions and services by end-users is critical to realising their still-untapped potential to enhance healthcare provision, improve patient outcomes, strengthen health systems and support a successful digital transformation. Mechanisms to identify gaps or weaknesses and to track progress in implementation and in organisational or system readiness allow policymakers to develop appropriate strategies to overcome challenges and strengthen facilitators of uptake. Such mechanisms also support directing resources (human, financial, technological) to the identified gaps and to target areas of high expected impact.

DigitalHealthUptake (DHU) provides the European DHU Radar (DigitalHealthUptake. (2024). European Digital Health Uptake Radar. [online] Available at: Link) as an open instrument for collecting, monitoring and analysing the uptake of digital health practices (digital health solutions, services, strategies, methods and tools for facilitating uptake, etc.) across Europe. Radar users can provide information on the scale of uptake and on available evidence supporting their digital health practice, for instance in terms of health outcomes or economic value to patients or healthcare systems. The Radar monitoring framework and repository were jointly developed by DHU consortium partners and feed into another key activity of the project, namely the collection of tools and methods for supporting uptake, including instruments for measuring the extent of uptake. By analysing the digital health practices submitted to the Radar, DHU aims to consolidate the results into an overview of available practices, including indicator tools and toolkits for measuring solutions’ uptake or maturity.

Based on these activities as well as partners’ experiences in European digital health monitoring and benchmarking initiatives, this policy brief highlights some of the challenges and key considerations in monitoring progress in digital health use and uptake.

In its 2022 report on “Monitoring the Implementation of Digital Health”, the World Health Organization (WHO) Regional Office for Europe identified a lack of systematic monitoring mechanisms at national level for the implementation and use of digital health interventions, despite the importance of collecting such data for informing digital health policy-making (World Health Organization. (2022). Monitoring the implementation of digital health: an overview of selected national and international methodologies. Copenhagen: World Health Organization Regional Office for Europe. [pdf] Available at: Link). The WHO concluded that there was a critical need for monitoring activities with comparable indicators and terminologies within and across countries, for increased efforts towards knowledge and information sharing, and for making digital health progress more visible in national statistics (ibid.).

In recent years, several European countries as well as the European Commission and international organisations launched monitoring initiatives and benchmarks at national and EU levels in the context of healthcare digitalisation. Some of these initiatives include:

  • Nordic eHealth survey (2023) (Eriksen et al. (2023). A Nordic survey to monitor citizens use and experiences with eHealth. [online] Available at: Link);
  • Monitoring the use of Italy’s Electronic Health Record (EHR) and related digital health services (2024) (Department for Digital Transformation. (2024). The Electronic Health Record 2.0. [online] Available at: Link);
  • Dutch eHealth monitor (2021-2023) (RIVM. (2022). E-health monitor 2021-2023. [online] Available at: Link);
  • European Commission Digital Decade eHealth Indicator study (2023 & 2024) (European Commission et al. (2023). Digital decade e-Health indicators development – Final report. Publications Office of the European Union. [online] Available at: Link);
  • OECD “Health at a glance” digital health indicators (2023) (OECD. (2023). Health at a glance. [online] Available at: Link);
  • Global Digital Health Monitor (2023) (Global Digital Health Monitor. (2023). The state of digital health 2023. [pdf] Available at: Link).

Commonly used dimensions for monitoring digital health developments at national or European levels include but are not limited to:

  • Strategy and governance – e.g., availability of national digital health strategy / action plan and dedicated resources for implementation;
  • Stakeholders and policies – e.g., involved organisations, role of veto players, mandates and responsibilities;
  • Interoperability – e.g., policies to encourage use of common technical and semantic standards;
  • Extent of uptake – e.g., percentage of primary, secondary, tertiary care providers in public / private sector using EHRs, percentage of citizens that have accessed their personal health information online;
  • Electronic health data use and exchange – e.g., types of electronic health data, such as medical reports, available and exchanged via electronic access services;
  • Awareness – e.g., percentage of citizens that know where to find their personal health information online;
  • Inclusion – e.g., policies or regulations to ensure compliance of digital health services with web content accessibility requirements;
  • User experience – e.g., level of satisfaction by patients / healthcare professionals with national / regional digital health services.

The specific aspects each monitoring framework covers also depend on the level of digital health maturity in the respective national context. Previous monitoring efforts have shown that capturing the use and uptake of digital health often involves complex metrics that are difficult to standardise and compare. Multiple methods and frameworks exist that can be applied for different assessment purposes, and each faces its own challenges, bottlenecks and considerations during development and implementation.
In the rapidly changing digital health landscape, monitoring mechanisms and frameworks need to be adaptable to ongoing advancements. Regular reviews, fine-tuning and updates are therefore critical to ensure that monitoring frameworks and metrics remain “fit for purpose” and accurately reflect the latest developments. This also requires a good understanding of the evolving digital health market and of related changes in legislative and regulatory frameworks at national and EU levels.

Following an iterative approach to developing a monitoring framework can avoid ambiguity and help ensure that its elements are well defined and relevant to its purpose. Monitoring initiatives at international level in particular may face differences in semantics (e.g., Electronic Health Record / Electronic Medical Record / Electronic Patient Record / Personal Health Record) that need to be identified early in the process to avoid later misunderstandings among implementers and recipients as to what is being asked. As a starting point, building a common understanding of the definitions and concepts underlying the monitoring framework is essential. From there, concrete indicators (and sub-indicators) can be formulated and operationalised, for instance in survey questionnaires. Sanity checks of the monitoring framework, methodology and mechanism with internal and external stakeholders further improve quality and validity.
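To make this more concrete, the sketch below shows one possible way to keep an agreed definition and its operationalisation as a survey item together in a machine-readable form, with a simple sanity check before fieldwork starts. It is purely illustrative: the field names, the example indicator and the question wording are hypothetical and are not taken from the DHU Radar or any specific national framework.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One monitoring indicator, tied to an agreed definition and a concrete operationalisation."""
    dimension: str        # e.g., "Extent of uptake"
    name: str             # short indicator label
    definition: str       # shared definition agreed with stakeholders (avoids semantic ambiguity)
    survey_question: str  # operationalisation, e.g., a questionnaire item
    unit: str = "% of respondents"
    sub_indicators: list["Indicator"] = field(default_factory=list)

# Hypothetical example entry; wording and naming are illustrative only.
ehr_access = Indicator(
    dimension="Extent of uptake",
    name="citizen_ehr_access",
    definition="Share of citizens who accessed their personal health information "
               "online via a national or regional electronic access service in the last 12 months.",
    survey_question="In the last 12 months, have you accessed your personal health "
                    "information online (e.g., test results, medical reports)?",
)

def sanity_check(indicators: list[Indicator]) -> list[str]:
    """Flag indicators that lack an agreed definition or an operationalisation."""
    issues = []
    for ind in indicators:
        if not ind.definition.strip():
            issues.append(f"{ind.name}: missing agreed definition")
        if not ind.survey_question.strip():
            issues.append(f"{ind.name}: not operationalised as a survey item")
    return issues

print(sanity_check([ehr_access]))  # -> [] when the entry is complete
```

Keeping the agreed definition next to the survey item in one structure makes it easier to spot semantic drift when the framework is reviewed and updated in later iterations.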

For the engagement of both implementers and recipients of surveys / questionnaires, it is pivotal to strike a balance between the level of detail and the feasibility of the monitoring exercise. In terms of feasibility, this is especially true for monitoring activities with a higher data collection burden, i.e., those relying on multiple data sources from various national or regional bodies / agencies to represent an accurate national picture. Since engaging stakeholders with different priorities and resources is challenging, priming recipients early for upcoming monitoring and reserving enough time for data acquisition from several sources can facilitate a smoother data collection and validation process. Allocating time and resources to thorough data validation is necessary to ensure the completeness, accuracy and quality of the data. In cross-country comparisons, it is important to recognise that data collection methodologies and the resulting availability of quantitative or qualitative information to feed monitoring indicators can vary significantly between countries (e.g., surveys, log data, national repositories).
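As an illustration of what such validation can involve, the following minimal sketch checks collected indicator values for completeness and plausibility across several fictitious country submissions. The record structure, country codes and thresholds are assumptions made for the example only, not part of any existing monitoring framework.

```python
# Illustrative only: a minimal completeness and plausibility check for collected
# indicator values, assuming one record per country, indicator and data source.
records = [
    {"country": "AA", "indicator": "citizen_ehr_access", "value": 62.0, "source": "national survey"},
    {"country": "BB", "indicator": "citizen_ehr_access", "value": None, "source": "log data"},
    {"country": "CC", "indicator": "citizen_ehr_access", "value": 140.0, "source": "national repository"},
]

def validate(records, expected_countries):
    """Return human-readable issues: missing countries, missing values, out-of-range percentages."""
    issues = []
    reported = {r["country"] for r in records}
    for missing in sorted(expected_countries - reported):
        issues.append(f"{missing}: no data submitted")
    for r in records:
        if r["value"] is None:
            issues.append(f"{r['country']}: value missing ({r['source']})")
        elif not 0 <= r["value"] <= 100:
            issues.append(f"{r['country']}: implausible percentage {r['value']} ({r['source']})")
    return issues

for issue in validate(records, expected_countries={"AA", "BB", "CC", "DD"}):
    print(issue)
```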

Regarding the level of detail, cross-country analyses need to be designed so that the intricacies of healthcare systems (e.g., level of decentralisation; Bismarck vs Beveridge model) are adequately reflected while still allowing meaningful cross-country comparisons. This can mean, for instance, that monitoring progress on the uptake and use of digital health solutions and services by healthcare providers clearly distinguishes between public and private sectors, considering that a given healthcare provider group can be partly public and partly private in one country but entirely public in another.
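The hypothetical sketch below illustrates this design choice: uptake figures are stored disaggregated by provider type and sector, so that a country where a provider group is entirely public is not averaged against a country where the same group is split between public and private providers. All figures and country codes are invented for illustration.

```python
# Hypothetical illustration: keeping uptake figures disaggregated by provider sector.
uptake = {
    # country -> {(provider_type, sector): share of providers using EHRs, in %}
    "AA": {("primary care", "public"): 85.0, ("primary care", "private"): 40.0},
    "BB": {("primary care", "public"): 78.0},  # primary care entirely public in BB
}

def comparable_rows(uptake):
    """Flatten to rows that preserve the sector split instead of averaging it away."""
    for country, values in uptake.items():
        for (provider_type, sector), share in values.items():
            yield country, provider_type, sector, share

for row in comparable_rows(uptake):
    print(row)
```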

Institutionalising monitoring procedures as part of digital health policy-making and planning equips decision-makers to anticipate new circumstances and adapt pre-emptively. In this regard, potentially negative developments need to be monitored closely and addressed proactively, for instance widening digital divides that hinder equitable use and uptake of digital health solutions and services. Monitoring frameworks that aim to identify disparities in access and use need to be inclusive of disadvantaged populations. Identifying factors that affect end-users’ uptake of digital health solutions and services, such as socioeconomic barriers or cultural aspects, can be very informative as a follow-up to general progress monitoring.

Monitoring progress repeatedly over time entails a long-term, continuous effort. To ensure sustainability and to fill the previously identified gaps in systematic and comparable national monitoring mechanisms, political commitment, greater investment, dedicated resources and international collaboration are needed. This will become even more relevant in light of the upcoming European Health Data Space (EHDS) Regulation, under which a stronger evidence base on countries’ state of play, together with a mapping of good practices for facilitating primary and secondary use of health data within and across borders, will be important for shaping digital health developments in Europe.

Recommendations
  • Adapt: Regularly review and refine the monitoring framework to stay current with developments.
  • Iterate: Develop frameworks and methodologies iteratively, starting with common definitions and concepts.
  • Engage: Involve stakeholders early for smoother data collection and validation.
  • Cross-Country Analysis: Design analyses to account for healthcare system differences and enable meaningful comparisons.
  • Inclusivity: Address disparities in digital health access, focusing on disadvantaged populations.
  • Commit: Ensure long-term progress monitoring through more political commitment, investments, and collaboration.

DISCLAIMER
Views and opinions expressed are those of the author(s) only and do not necessarily reflect those of DG CONNECT, European Commission. Neither the European Union nor the granting authority can be held responsible for them.