Submitted:
02 June 2025
Posted:
03 June 2025
Abstract
Keywords:
1. Introduction
2. Summary of the Guidelines
Key assumptions
- i. Similarity of studies with respect to effect modifiers;
- ii. Homogeneity of relative treatment effects across trials comparing the same interventions; and
- iii. Consistency between direct and indirect comparisons within an evidence network.
Direct Comparisons
Indirect treatment comparisons
- 1. Anchored indirect comparisons
- 2. Population-adjusted indirect comparisons
- 3. Unanchored comparisons and non-randomized evidence
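The anchored approach listed above is typically operationalized with the Bucher method (Bucher et al., 1997): two trials sharing a common comparator B yield an indirect estimate of A versus C by differencing the relative effects on the log scale, with variances adding. A minimal sketch follows; the function name and the example effect sizes are illustrative, not taken from the guidelines.

```python
import math

def bucher_itc(log_eff_ab, se_ab, log_eff_cb, se_cb):
    """Anchored (Bucher) indirect comparison of A vs C via common comparator B.

    Inputs are relative treatment effects on the log scale (e.g., log hazard
    ratios) from two trials sharing comparator B: A vs B and C vs B.
    Returns the indirect effect of A vs C on the natural scale with a 95% CI.
    """
    d_ac = log_eff_ab - log_eff_cb            # indirect log effect, A vs C
    se_ac = math.sqrt(se_ab**2 + se_cb**2)    # variances add across trials
    lo = d_ac - 1.96 * se_ac
    hi = d_ac + 1.96 * se_ac
    return math.exp(d_ac), (math.exp(lo), math.exp(hi))

# Illustrative numbers: HR(A vs B) = 0.70 (SE 0.10), HR(C vs B) = 0.90 (SE 0.12)
hr, ci = bucher_itc(math.log(0.70), 0.10, math.log(0.90), 0.12)
# → HR ≈ 0.78 (95% CI ≈ 0.57 to 1.06)
```

Because the comparison is anchored on B, within-trial randomization is preserved; the price is the wider confidence interval from summing the two variances.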
3. Critical Review
- 1. Ambiguity in the role of indirect evidence alongside direct comparisons
- 2. Overly restrictive interpretation of assumption violations
- 3. Population-adjusted ITCs: overcautious appraisal
- 4. Meta-regression vs. subgroup analyses: misplaced prioritization
- 5. Underappreciation of Bayesian methods
- 6. Unanchored comparisons and non-randomized evidence: an overly restrictive stance
- 7. Overlooked considerations in living evidence synthesis
- 8. Support for advanced time-to-event (TTE) methods
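Point 3 above concerns population-adjusted ITCs such as MAIC, which reweight individual patient data (IPD) from one trial so that the weighted means of effect modifiers match the aggregate baseline characteristics reported for the comparator trial (Signorovitch et al., 2012). A minimal method-of-moments sketch is shown below, assuming SciPy is available; it is not the guideline's prescribed implementation, and the function name and simulated data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def maic_weights(X_ipd, target_means):
    """Method-of-moments MAIC weights.

    Finds weights w_i = exp(x_i' a) so that the weighted means of the effect
    modifiers in the IPD trial equal the aggregate means reported for the
    comparator trial. Centering covariates at the target means reduces the
    problem to minimizing sum_i exp(x_i' a), a convex objective.
    Returns the weights and the effective sample size (ESS) after weighting.
    """
    Xc = X_ipd - target_means                 # center at target-population means

    def objective(a):
        return np.exp(Xc @ a).sum()

    def grad(a):
        return Xc.T @ np.exp(Xc @ a)          # analytic gradient of the sum

    res = minimize(objective, np.zeros(Xc.shape[1]), jac=grad, method="BFGS")
    w = np.exp(Xc @ res.x)
    ess = w.sum() ** 2 / (w ** 2).sum()       # Kish effective sample size
    return w, ess

# Illustrative use: one effect modifier, target trial reports a mean of 0.3
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(400, 1))
w, ess = maic_weights(X, np.array([0.3]))
```

The ESS diagnostic matters for the appraisal debate: when trial populations overlap poorly, a few patients receive extreme weights, the ESS collapses, and the adjusted comparison becomes unstable — a legitimate reason for caution that is distinct from rejecting the method outright.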
4. Implications for JCA Submissions
5. Conclusions
- Provide clearer support for the integration of direct and indirect evidence (i.e. “mixed” treatment comparisons);
- Allow for appropriately adjusted analyses in the presence of assumption violations;
- Guide, rather than restrict, the cautious use of unanchored comparisons when no alternatives exist, including more guidance on the use of real-world evidence and ECAs;
- Enhance support for Bayesian methods: the HTACG may have prioritized frequentist methods on the assumption that assessors are more familiar with them, yet these guidelines are in fact an opportunity to help assessors appreciate the value of Bayesian approaches;
- Incorporate operational guidance that reflects the complexity and diversity of real-world submissions; and
- Propose a framework for grading the strength of indirect evidence.
- Text box 1: Key limitations identified in the EU HTA Guidelines on quantitative evidence synthesis
- Overly conservative stance on assumption violations, risking blanket rejection of indirect treatment comparisons (ITCs)
- Ambiguous guidance on combining direct and indirect evidence, with no clear support for mixed treatment comparisons
- Restrictive preference for subgroup analyses over meta-regression, despite lower statistical power and greater risk of false positives
- Limited endorsement of population-adjusted methods (e.g., MAIC, STC), despite their relevance when effect modifiers are imbalanced
- Minimal consideration of Bayesian approaches, despite their strengths, particularly in sparse or rare-event settings
- Rigid dismissal of unanchored comparisons, overlooking recent advances in causal inference frameworks and evolving methods to quantify biases
- No indication that recommendations are grounded in empirical validation or simulation studies
- Text box 2: Suggested improvements for the EU HTA guidelines on indirect treatment comparisons
- Reconsider the guideline’s advice against using indirect evidence when direct comparisons exist, as this restriction lacks empirical or methodological justification
- Consider the wider use of Bayesian methods, which are well suited to quantitative evidence synthesis; provide readers with the knowledge required to engage with Bayesian approaches; reassess the appropriateness of combining Bayesian and frequentist analyses within the same dossier (currently allowed by the guidelines)
- Offer balanced recommendations on population-adjusted methods (e.g., MAIC, STC), acknowledging their relevance as primary analyses in many situations
- Explicitly acknowledge that all evidence has limitations, and propose a framework for grading the certainty of indirect evidence (e.g., in the manner of GRADE) to support decisions under imperfect evidence, potentially incorporating quantitative bias analysis
- Expand guidance on leveraging real-world evidence and external control arms, including the adoption of emerging methods such as target trial emulation
- Consider the practical feasibility of the guidelines, recognizing the limited time and resources available to both health technology developers and national assessors
- Ground methodological recommendations in a systematic review of empirical validation studies and simulation research, ensuring that guidance reflects best available evidence
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| CI | Confidence Interval |
| DSU | Decision Support Unit (of NICE) |
| Δ (Delta) | Symbol used for a threshold in hypothesis testing |
| EBM | Evidence-Based Medicine |
| ECA | External Control Arm |
| EMA | European Medicines Agency |
| EU | European Union |
| FDA | United States Food and Drug Administration |
| GRADE | Grading of Recommendations, Assessment, Development and Evaluation |
| H0 | Null Hypothesis |
| HAS | Haute Autorité de Santé (France) |
| HCP | Health Care Professional |
| HTA | Health Technology Assessment |
| HTACG | Health Technology Assessment Coordination Group (of the EU) |
| HTD | Health Technology Developer |
| IPD | Individual Patient Data |
| IQWiG | Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen (Germany) |
| ISPOR | International Society for Pharmacoeconomics and Outcomes Research |
| ITC | Indirect Treatment Comparison |
| JCA | Joint Clinical Assessment |
| MAIC | Matching-Adjusted Indirect Comparison |
| ML-NMR | Multilevel Network Meta-Regression |
| MS | Member States (of the EU) |
| NICE | National Institute for Health and Care Excellence |
| NMA | Network Meta-Analysis |
| PAIC | Population-Adjusted Indirect Comparison |
| PH | Proportional Hazards |
| PICO | Population, Intervention, Comparator, Outcome |
| RCT | Randomized Controlled Trial |
| RMST | Restricted Mean Survival Time |
| SAT | Single-Arm Trial |
| SLR | Systematic Literature Review |
| STC | Simulated Treatment Comparison |
| TSD | Technical Support Document (from NICE DSU) |
| TTE | Time-To-Event |
References
- European Commission. Regulation (EU) 2021/2282 of the European Parliament and of the Council of 15 December 2021 on health technology assessment and amending Directive 2011/24/EU. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32021R2282 (accessed on 18 December 2024).
- The Member State Coordination Group on Health Technology Assessment (HTACG). Guidance on the validity of clinical studies for joint clinical assessments. V1.0. Available online: https://health.ec.europa.eu/document/download/9f9dbfe4-078b-4959-9a07-df9167258772_en?filename=hta_clinical-studies-validity_guidance_en.pdf (accessed on 30 December 2024).
- Hollard, D., G. Roberts, I. Taylor, J. Gibson, and O. Darlington. HTA77 PICO Consolidation in European HTA Scoping: Examining PICO Variations in Oncology Drugs in the Context of the European Joint Clinical Assessment. Value Health 2024, 27, S258.
- Young, K. and I. Staatz. HTA111 Population, Intervention, Comparator, Outcomes (PICO) of ATMPs and Potential Impact on the Upcoming EU Regulation on HTA. Value Health 2023, 26, S340.
- van Engen, A., R. Kruger, J. Ryan, and P. Wagner. HTA97 impact of additive PICOs in a European joint health technology assessment. A hypothetical case study in lung cancer. Value Health 2022, 25, S315.
- The Member State Coordination Group on Health Technology Assessment (HTACG). Guidance on the scoping process. Available online: https://health.ec.europa.eu/document/download/7be11d76-9a78-426c-8e32-79d30a115a64_en?filename=hta_jca_scoping-process_en.pdf (accessed on 2 May 2025).
- van Engen, A., R. Krüger, A. Parnaby, M. Rotaru, J. Ryan, D. Samaha, and D. Tzelis. The impact of additive population(s), intervention, comparator(s), and outcomes in a European joint clinical health technology assessment. Value Health 2024, 27, 1722-1731.
- The Member State Coordination Group on Health Technology Assessment (HTACG). PICO exercises. Available online: https://health.ec.europa.eu/publications/pico-exercises_en (accessed on 26 February 2025).
- Dias, S., A.J. Sutton, A.E. Ades, and N.J. Welton. Evidence Synthesis for Decision Making 2. Med. Decis. Making 2013, 33, 607-617.
- Hoaglin, D.C., N. Hawkins, J.P. Jansen, D.A. Scott, R. Itzler, J.C. Cappelleri, C. Boersma, D. Thompson, K.M. Larholt, M. Diaz, and A. Barrett. Conducting Indirect-Treatment-Comparison and Network-Meta-Analysis Studies: Report of the ISPOR Task Force on Indirect Treatment Comparisons Good Research Practices: Part 2. Value Health 2011, 14, 429-437.
- The Member State Coordination Group on Health Technology Assessment (HTACG). Methodological Guideline for Quantitative Evidence Synthesis: Direct and Indirect Comparisons. Available online: https://health.ec.europa.eu/document/download/4ec8288e-6d15-49c5-a490-d8ad7748578f_en?filename=hta_methodological-guideline_direct-indirect-comparisons_en.pdf (accessed on 8 March 2025).
- The Member State Coordination Group on Health Technology Assessment (HTACG). Practical Guideline for Quantitative Evidence Synthesis: Direct and Indirect Comparisons. Available online: https://health.ec.europa.eu/document/download/1f6b8a70-5ce0-404e-9066-120dc9a8df75_en?filename=hta_practical-guideline_direct-and-indirect-comparisons_en.pdf (accessed on 8 March 2025).
- Macabeo, B., A. Quenéchdu, S. Aballéa, C. François, L. Boyer, and P. Laramée. Methods for indirect treatment comparison: results from a systematic literature review. Journal of market access & health policy 2024, 12, 58-80.
- Ahn, E. and H. Kang. Concepts and emerging issues of network meta-analysis. Korean J. Anesthesiol. 2021, 74, 371-382.
- Institute of Medicine (US) Committee on Standards for Systematic Reviews of Comparative Effectiveness Research. Finding What Works in Health Care: Standards for Systematic Reviews. Washington (DC): National Academies Press (US); 2011. 4, Standards for Synthesizing the Body of Evidence. Available online: https://www.ncbi.nlm.nih.gov/books/NBK209522/ (accessed on 8 May 2025).
- Guo, J.D., A. Gehchan, and A. Hartzema. Selection of indirect treatment comparisons for health technology assessments: a practical guide for health economics and outcomes research scientists and clinicians. BMJ open 2025, 15, e091961.
- Song, F., Y.K. Loke, T. Walsh, A.-M. Glenny, A.J. Eastwood, and D.G. Altman. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews. BMJ 2009, 338.
- Cochrane Training. Chapter 10: Analysing data and undertaking meta-analyses. Available online: https://training.cochrane.org/handbook/current/chapter-10 (accessed on 12 May 2025).
- Bucher, H.C., G.H. Guyatt, L.E. Griffith, and S.D. Walter. The results of direct and indirect treatment comparisons in meta-analysis of randomized controlled trials. J. Clin. Epidemiol. 1997, 50, 683-691.
- Yu-Kang, T. Node-Splitting Generalized Linear Mixed Models for Evaluation of Inconsistency in Network Meta-Analysis. Value Health 2016, 19, 957-963.
- Veroniki, A.A., S. Tsokani, I.R. White, G. Schwarzer, G. Rücker, D. Mavridis, J.P. Higgins, and G. Salanti. Prevalence of evidence of inconsistency and its association with network structural characteristics in 201 published networks of interventions. BMC Med. Res. Methodol. 2021, 21, 1-10.
- European Federation of Statisticians in the Pharmaceutical Industry (EFSPI). Unanchored indirect treatment comparison methods and unmeasured confounding. Available online: https://psiweb.org/docs/default-source/default-document-library/psi-hta-sig-itc-kr_final.pdf?sfvrsn=481bacdb_0 (accessed on 13 May 2025).
- Chaimani, A., D.M. Caldwell, T. Li, J.P. Higgins, and G. Salanti. Undertaking network meta-analyses. Cochrane handbook for systematic reviews of interventions 2019, 285-320.
- Phillippo, D., T. Ades, S. Dias, S. Palmer, K.R. Abrams, and N. Welton. NICE DSU technical support document 18: methods for population-adjusted indirect comparisons in submissions to NICE. 2016.
- Sutton, A.J., K.R. Abrams, D.R. Jones, T.A. Sheldon, and F. Song, Methods for meta-analysis in medical research. Vol. 348. 2000: Wiley Chichester.
- Dias, S., A.J. Sutton, N.J. Welton, and A. Ades. Heterogeneity: subgroups, meta-regression, bias and bias-adjustment. 2016.
- Mbuagbaw, L., B. Rochwerg, R. Jaeschke, D. Heels-Andsell, W. Alhazzani, L. Thabane, and G.H. Guyatt. Approaches to interpreting and choosing the best treatments in network meta-analyses. Systematic reviews 2017, 6, 1-5.
- Higgins, J.P., S.G. Thompson, and D.J. Spiegelhalter. A re-evaluation of random-effects meta-analysis. Journal of the Royal Statistical Society Series A: Statistics in Society 2009, 172, 137-159.
- Signorovitch, J.E., V. Sikirica, M.H. Erder, J. Xie, M. Lu, P.S. Hodgkins, K.A. Betts, and E.Q. Wu. Matching-adjusted indirect comparisons: a new tool for timely comparative effectiveness research. Value Health 2012, 15, 940-947.
- Sackett, D.L., W.M. Rosenberg, J.M. Gray, R.B. Haynes, and W.S. Richardson, Evidence based medicine: what it is and what it isn’t. 1996, British Medical Journal Publishing Group. p. 71-72.
- Guyatt, G.H., A.D. Oxman, G.E. Vist, R. Kunz, Y. Falck-Ytter, P. Alonso-Coello, and H.J. Schünemann. GRADE: an emerging consensus on rating quality of evidence and strength of recommendations. BMJ 2008, 336, 924-926.
- Macabeo, B., T. Rotrou, A. Millier, C. François, and P. Laramée. The acceptance of indirect treatment comparison methods in oncology by health technology assessment agencies in England, France, Germany, Italy, and Spain. PharmacoEconomics-open 2024, 8, 5-18.
- Es-Skali, I.J. and J. Spoors. Analysis of indirect treatment comparisons in national health technology assessments and requirements for industry submissions. Journal of Comparative Effectiveness Research 2018, 7, 397-409.
- The independent Institute for Quality and Efficiency in Health Care (IQWiG). General Methods version 8.0. Available online: https://www.iqwig.de/methoden/allgemeine-methoden_entwurf-fuer-version-8-0.pdf (accessed on 26 February 2025).
- Haute Autorité de Santé (HAS). Indirect comparisons: Methods and validity. Available online: https://www.has-sante.fr/upload/docs/application/pdf/2011-02/summary_report__indirect_comparisons_methods_and_validity_january_2011_2.pdf (accessed on 12 May 2025).
- The Belgian Health Care Knowledge Centre (KCE). Guidelines for pharmacoeconomic evaluations in Belgium. Available online: https://kce.fgov.be/sites/default/files/2021-12/d20081027327.pdf (accessed on 12 May 2025).
- Chung, W.T. and K.C. Chung. The use of the E-value for sensitivity analysis. J. Clin. Epidemiol. 2023, 163, 92-94.
- Thompson, S.G. and J.P. Higgins. How should meta-regression analyses be undertaken and interpreted? Stat. Med. 2002, 21, 1559-1573.
- Jansen, K. and H. Holling. Rare events meta-analysis using the Bayesian beta-binomial model. Research Synthesis Methods 2023, 14, 853-873.
- The Member State Coordination Group on Health Technology Assessment (HTACG). Guidance on reporting requirements for multiplicity issues and subgroup, sensitivity and post hoc analyses in joint clinical assessments. Available online: https://health.ec.europa.eu/document/download/f2f00444-2427-4db9-8370-d984b7148653_en?filename=hta_multiplicity_jca_guidance_en.pdf (accessed on 8 January 2025).
- Turner, R.M., D. Jackson, Y. Wei, S.G. Thompson, and J.P. Higgins. Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis. Stat. Med. 2015, 34, 984-998.
- Sutton, A.J. and K.R. Abrams. Bayesian methods in meta-analysis and evidence synthesis. Stat. Methods Med. Res. 2001, 10, 277-303.
- Ren, S., S. Ren, N.J. Welton, and M. Strong. Advancing unanchored simulated treatment comparisons: A novel implementation and simulation study. Res Synth Methods 2024, 15, 657-670.
- Nierengarten, M.B. Single-arm trials for US Food and Drug Administration cancer drug approvals: Although there are some challenges in using single-arm studies for accelerated drug approvals, it can make a difference in getting drugs previously approved for other uses to patients. Cancer 2023, 129, 1626.
- European Medicines Agency (EMA). Establishing efficacy based on single-arm trials submitted as pivotal evidence in a marketing authorisation. Published online 21 April 2023. Available online: https://www.ema.europa.eu/en/establishing-efficacy-based-single-arm-trials-submitted-pivotal-evidence-marketing-authorisation (accessed on 7 May 2025).
- Hernán, M.A. and J.M. Robins. Using big data to emulate a target trial when a randomized trial is not available. Am. J. Epidemiol. 2016, 183, 758-764.
- Bucher, H.C. and F. Chammartin. Strengthening health technology assessment for cancer treatments in Europe by integrating causal inference and target trial emulation. The Lancet Regional Health–Europe 2025, 52.
- Franklin, J.M., R.J. Glynn, D. Martin, and S. Schneeweiss. Evaluating the use of nonrandomized real-world data analyses for regulatory decision making. Clin. Pharmacol. Ther. 2019, 105, 867-877.
- Krüger, R., C. Cantoni, and A. Van Engen. OP49 Are Propensity-Score-Based Adjusted Indirect Comparisons Feasible For All European Joint Clinical Assessments Based On Non-Randomized Data? Int. J. Technol. Assess. Health Care 2024, 40, S23-S23.
- Elliott, J.H., T. Turner, O. Clavisi, J. Thomas, J.P. Higgins, C. Mavergames, and R.L. Gruen. Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap. PLoS Med. 2014, 11, e1001603.
- Stensrud, M.J. and M.A. Hernán. Why test for proportional hazards? JAMA 2020, 323, 1401-1402.
- Lin, L. Use of Prediction Intervals in Network Meta-analysis. JAMA Netw Open 2019, 2, e199735.
- European Medicines Agency (EMA). Reflection paper on establishing efficacy based on single-arm trials submitted as pivotal evidence in a marketing authorisation application. Considerations on evidence from single-arm trials. Published online 9 September 2024. Available online: https://www.ema.europa.eu/en/documents/scientific-guideline/reflection-paper-establishing-efficacy-based-single-arm-trials-submitted-pivotal-evidence-marketing-authorisation-application_en.pdf (accessed on 7 May 2025).
- Khachatryan, A., S.H. Read, and T. Madison. External control arms for rare diseases: building a body of supporting evidence. J. Pharmacokinet. Pharmacodyn. 2023, 50, 501-506.
- Wang, M., H. Ma, Y. Shi, H. Ni, C. Qin, and C. Ji. Single-arm clinical trials: design, ethics, principles. BMJ Support Palliat Care 2024, 15, 46-54.
- European Commission. Joint Clinical Assessment for Medicinal Products. Available online: https://health.ec.europa.eu/document/download/ced91156-ffe1-472d-85eb-aa6a91dd707e_en?filename=hta_htar_factsheet-jca_en.pdf (accessed on 12 May 2025).
- Guyatt, G.H., A.D. Oxman, R. Kunz, D. Atkins, J. Brozek, G. Vist, P. Alderson, P. Glasziou, Y. Falck-Ytter, and H.J. Schünemann. GRADE guidelines: 2. Framing the question and deciding on important outcomes. J. Clin. Epidemiol. 2011, 64, 395-400.
- Werner, S., L. Lechterbeck, A. Rasch, S. Merkesdal, and J. Ruof. Untersuchung der Akzeptanz und der Ablehnungsgründe indirekter Vergleiche in IQWiG-Nutzenbewertungen. Gesundheitsökonomie & Qualitätsmanagement 2020, 25, 24-36.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
