Guise JM, Butler M, Chang C, Viswanathan M, Pigott T, Tugwell P, Complex Interventions Workgroup. AHRQ Series on Complex Intervention Systematic Reviews - Paper 7: PRISMA-CI Elaboration & Explanation. J.Clin.Epidemiol. Epub 2017 Jun 30. PMID: 28720513.

BACKGROUND:
Complex interventions are widely used in health care, public health, education, criminology, social work, business, and welfare. They have increasingly become the subject of systematic reviews and are challenging to effectively report. The Complex Interventions Methods Workgroup developed an extension to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Complex Interventions (PRISMA-CI).
RATIONALE:
Following the EQUATOR Network guidance for Preferred Reporting Items for Systematic Reviews and Meta-Analysis extensions, this Explanation and Elaboration (EE) document accompanies the PRISMA-CI checklist to promote consistency in reporting of systematic reviews of complex interventions.
DISCUSSION:
The EE document explains the meaning and rationale for each unique PRISMA-CI checklist item and provides examples to assist systematic review authors in operationalizing PRISMA-CI guidance. The Complex Interventions Workgroup developed PRISMA-CI as an important start toward increased consistency in reporting of systematic reviews of complex interventions. Because the field is rapidly expanding, the Complex Interventions Methods Workgroup plans to re-evaluate PRISMA-CI periodically, adding specificity and examples as the field matures.
Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

FREE FULL TEXT: http://www.jclinepi.com/article/S0895-4356(17)30642-X/pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.017.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720513.

Kelly MP, Noyes J, Kane RL, Chang C, Uhl S, Robinson KA, Springs S, Butler ME, Guise JM. AHRQ Series on Complex Intervention Systematic Reviews - Paper 2: Defining Complexity, Formulating Scope and Questions. J.Clin.Epidemiol. Epub 2017 Jun 29. PMID: 28720514.

BACKGROUND:
The early stages of a systematic review set the scope and expectations. This can be particularly challenging for complex interventions given their multi-dimensional and dynamic nature.
RATIONALE:
This paper builds on concepts introduced in Paper 1 of this series. It describes the methodological, practical, and philosophical challenges of formulating the questions and scope of systematic reviews of complex interventions, along with potential approaches to these challenges. Further, it discusses the use of theory to help organize reviews of complex interventions.
DISCUSSION:
Many interventions in medicine, public health, education, social services, behavioral health, and community programs are complex, and they may not fit neatly within the established paradigm for reviews of straightforward interventions. This paper provides conceptual and operational guidance for these early stages of scope formulation to assist authors of systematic reviews of complex interventions.
Copyright © 2017. Published by Elsevier Inc.

DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.012.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720514.

Viswanathan M, McPheeters ML, Murad MH, Butler ME, Devine EB, Dyson MP, Guise JM, Kahwati LC, Miles JNV, Morton SC. AHRQ Series on Complex Intervention Systematic Reviews - Paper 4: Selecting Analytic Approaches. J.Clin.Epidemiol. Epub 2017 Jun 29. PMID: 28720515.

BACKGROUND:
Systematic reviews of complex interventions can vary widely in purpose, data availability and heterogeneity, and stakeholder expectations.
RATIONALE:
This article addresses the uncertainty that systematic reviewers face in selecting methods for reviews of complex interventions. Specifically, it lays out parameters for systematic reviewers to consider when selecting analytic approaches that best answer the questions at hand and suggests analytic techniques that may be appropriate in different circumstances.
DISCUSSION:
Systematic reviews of complex interventions comprising multiple questions may use multiple analytic approaches. Parameters to consider when choosing analytic methods for complex interventions include nature and timing of the decision (clinical practice guideline, policy, or other); purpose of the review; extent of existing evidence; logistic factors such as the timeline, process, and resources for deciding the scope of the review; and value of information to be obtained from choosing specific systematic review methods. Reviewers may elect to revise their analytic approach based on new or changing considerations during the course of the review but should guard against bias through transparency of reporting.
Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

FREE FULL TEXT: http://www.jclinepi.com/article/S0895-4356(17)30639-X/pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.014.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720515.

Guise JM, Butler ME, Chang C, Viswanathan M, Pigott T, Tugwell P, Complex Interventions Workgroup. AHRQ Series on Complex Intervention Systematic Reviews - Paper 6: PRISMA-CI Extension Statement & Checklist. J.Clin.Epidemiol. Epub 2017 Jun 28. PMID: 28720516.

BACKGROUND:
Complex interventions are widely used in health systems, public health, education, and communities and are increasingly the subject of systematic reviews. Oversimplification and inconsistencies in reporting about complex interventions can limit the usability of review findings.
RATIONALE:
Although guidance exists to ensure that reports of individual studies and systematic reviews adhere to accepted scientific standards, its design-specific focus leaves important reporting gaps relative to complex interventions in health care. This paper provides a stand-alone extension to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) reporting tool for complex interventions, PRISMA-CI, to help authors, publishers, and readers understand it and apply it to systematic reviews of complex interventions.
DISCUSSION:
PRISMA-CI development followed the Enhancing the QUAlity and Transparency Of health Research Network guidance for extensions and focused on adding or modifying only essential items that are truly unique to complex interventions and are not covered by broader interpretation of current PRISMA guidance. PRISMA-CI provides an important structure and guidance for systematic reviews and meta-analyses for the highly prevalent and dynamic field of complex interventions.
Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

FREE FULL TEXT: http://www.jclinepi.com/article/S0895-4356(17)30641-8/pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.016.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720516.

Butler M, Epstein RA, Totten A, Whitlock EP, Ansari MT, Damschroder LJ, Balk E, Bass EB, Berkman ND, Hempel S, et al. AHRQ Series on Complex Intervention Systematic Reviews - Paper 3: Adapting frameworks to develop protocols. J.Clin.Epidemiol. Epub 2017 Jun 27. PMID: 28720510.

BACKGROUND:
Once a proposed topic has been identified for a systematic review and has undergone a question formulation stage, a protocol must be developed that specifies the scope and research questions in detail and outlines the methodology for conducting the systematic review.
RATIONALE:
Framework modifications are often needed to accommodate increased complexity. We describe and give examples of adaptations as well as alternatives to traditional analytic frameworks.
DISCUSSION:
This paper identifies and describes elements of frameworks and how they can be adapted to inform the protocol and conduct of systematic reviews of complex interventions. Modifications may be needed to adapt the PICO framework normally used in protocol development in order to successfully describe complex interventions; in some instances, alternative frameworks may be better suited. Possible approaches to analytic frameworks for complex interventions are outlined, including frameworks that illustrate the causal and associative linkages, and the time elements, that systematic reviews of complex interventions may need to address. The need for and specifics of these accommodations vary with the details of a specific systematic review, which in turn helps determine whether traditional frameworks are sufficient, can be refined, or must be replaced by alternative frameworks.
Copyright © 2017 Elsevier Inc. All rights reserved.

FREE FULL TEXT: http://www.jclinepi.com/article/S0895-4356(17)30632-7/pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.013.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720510.

Guise JM, Chang C, Butler ME, Viswanathan M, Tugwell P. AHRQ Series on Complex Intervention Systematic Reviews - Paper 1: An introduction to a series of papers that provide guidance and tools for reviews of complex interventions. J.Clin.Epidemiol. Epub 2017 Jun 27. PMID: 28720511.

[First paragraph, reference html links removed]

Issues of complexity are taking primacy as research increasingly reflects the complexity of the world around us. Although advances in science have resulted in dramatic improvements in health and longevity worldwide, there is increasing recognition that the effectiveness even of apparently simple interventions is often influenced by complex interplays of individual characteristics, social determinants, the health care delivery system, and the interventions themselves. Systematic reviews of topics, such as slum upgrading [1,2], behavioral interventions for autism [3,4], smoking cessation in pregnancy [5], and the integration of mental health in primary care [6,7], illustrate that the boundaries of traditional reviews and review methods are being expanded and that reviewers are in need of guidance and tools to address this new approach.

FREE FULL TEXT: http://www.jclinepi.com/article/S0895-4356(17)30630-3/pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.011.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720511.

Pigott T, Noyes J, Umscheid CA, Myers E, Morton SC, Fu R, Sanders-Schmidler GD, Devine EEB, Murad MH, Kelly MP, et al. AHRQ Series on Complex Intervention Systematic Reviews - Paper 5: Advanced Analytic Methods. J.Clin.Epidemiol. Epub 2017 Jun 27. PMID: 28720512.

BACKGROUND:
Advanced analytic methods for synthesizing evidence about complex interventions continue to be developed. In this paper, we emphasize that the specific research question posed in the review should be used as a guide for choosing the appropriate analytic method.
RATIONALE:
We present advanced analytic approaches that address four common questions that guide reviews of complex interventions: 1) How effective is the intervention?; 2) For whom does the intervention work and in what contexts?; 3) What happens when the intervention is implemented?; and 4) What decisions are possible given the results of the synthesis?
DISCUSSION:
The analytic approaches presented in this paper are particularly useful when each primary study differs in components, mechanisms of action, context, implementation, timing, and many other domains.
Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

FREE FULL TEXT: http://www.jclinepi.com/article/S0895-4356(17)30640-6/pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.06.015.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28720512.

Rutter H, Savona N, Glonti K, Bibby J, Cummins S, Finegood DT, Greaves F, Harper L, Hawe P, Moore L, et al. The need for a complex systems model of evidence for public health. Lancet. Epub 2017 Jun 13. PMID: 28622953.

[First paragraph, reference html links removed]

Despite major investment in both research and policy, many pressing contemporary public health challenges remain. To date, the evidence underpinning responses to these challenges has largely been generated by tools and methods that were developed to answer questions about the effectiveness of clinical interventions, and as such are grounded in linear models of cause and effect. Identification, implementation, and evaluation of effective responses to major public health challenges require a wider set of approaches1,2 and a focus on complex systems.

DOI: http://dx.doi.org/10.1016/S0140-6736(17)31267-9.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/28622953.

Meinecke AK, Welsing P, Kafatos G, Burke D, Trelle S, Kubin M, Nachbaur G, Egger M, Zuidgeest M, work package 3 of the GetReal consortium. Series: Pragmatic trials and real world evidence: Paper 8. Data collection and management. J.Clin.Epidemiol. Epub 2017 Jul 14. PMID: 28716504.

Pragmatic trials can improve our understanding of how treatments will perform in routine practice. In a series of eight papers, the GetReal Consortium has evaluated the challenges in designing and conducting pragmatic trials and their specific methodological, operational, regulatory, and ethical implications. This final paper of the series discusses the operational and methodological challenges of data collection in pragmatic trials. A more pragmatic approach to data collection needs to balance the delivery of highly accurate and complete data against minimizing the interference that data entry and verification impose on clinical practice. Further, it should allow for the involvement of a representative sample of the practices, physicians, and patients who prescribe or receive treatment in routine care. This paper discusses challenges related to the different methods of data collection and presents potential solutions where possible. No one-size-fits-all recommendation can be given for the collection of data in pragmatic trials, although in general the application of existing, routinely used data collection systems and processes seems to best suit the pragmatic approach. However, data access and privacy, the time points of data collection, the level of detail in the data, and the lack of a clear understanding of the data collection process were identified as the main challenges for the use of routinely collected data in pragmatic trials. A first step should be to determine to what extent existing healthcare databases provide the necessary study data and can accommodate data collection and management. When more elaborate or detailed data collection or more structured follow-up is required, data collection in a pragmatic trial will have to be tailor-made, often using a hybrid approach with a dedicated electronic case report form (eCRF). In this case, the eCRF should be kept as simple as possible to reduce the burden on practitioners and minimize influence on routine clinical practice.

FREE FULL TEXT: http://ac.els-cdn.com/S089543561730776X/1-s2.0-S089543561730776X-main.pdf
DOI: http://dx.doi.org/10.1016/j.jclinepi.2017.07.003.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28716504.

Goldstein KM, Vogt D, Hamilton A, Frayne SM, Gierisch J, Blakeney J, Sadler A, Bean-Mayberry BM, Carney D, DiLeone B, et al. Practice-based research networks add value to evidence-based quality improvement. Healthc.(Amst). Epub 2017 Jul 12. PMID: 28711505.

[First paragraph, reference html links removed]

Taking research findings from scientific publications to the bedside can be a slow process. One barrier to uptake of research evidence is that findings from efficacy trials conducted in controlled settings may not adapt easily to real-world situations. This process often requires multiple steps1 that are complicated by tension between the need to adapt to constraints in local care delivery, and the need to maintain fidelity to the proven intervention. Evidence-Based Quality Improvement (EBQI) offers a structured process to address this tension.

DOI: http://dx.doi.org/10.1016/j.hjdsi.2017.06.008.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28711505.

Localio AR, Stack CB, Griswold ME. Sensitivity Analysis for Unmeasured Confounding: E-Values for Observational Studies. Ann.Intern.Med. Epub 2017 Jul 11. PMID: 28693037.

[First paragraph, reference html links removed]

In their current article in Annals, VanderWeele and Ding (1) introduce the “E-value” as a simple measure of the potential for bias arising from unmeasured confounders in observational studies. Bias often poses a greater threat to the validity of reported findings than does the random variability reflected by P values and confidence bounds (2). Although the potential for bias is widely known, reports of observational data often lack sensitivity analyses exploring the possible influence of bias from unobserved factors, perhaps because authors face challenges in specifying the elements and degree of possible confounding in a manner that readers can understand.

DOI: http://dx.doi.org/10.7326/M17-1485.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28693037.

VanderWeele TJ, Ding P. Sensitivity Analysis in Observational Research: Introducing the E-Value. Ann.Intern.Med. Epub 2017 Jul 11. PMID: 28693043.

Sensitivity analysis is useful in assessing how robust an association is to potential unmeasured or uncontrolled confounding. This article introduces a new measure called the "E-value," which is related to the evidence for causality in observational studies that are potentially subject to confounding. The E-value is defined as the minimum strength of association, on the risk ratio scale, that an unmeasured confounder would need to have with both the treatment and the outcome to fully explain away a specific treatment-outcome association, conditional on the measured covariates. A large E-value implies that considerable unmeasured confounding would be needed to explain away an effect estimate. A small E-value implies little unmeasured confounding would be needed to explain away an effect estimate. The authors propose that in all observational studies intended to produce evidence for causality, the E-value be reported or some other sensitivity analysis be used. They suggest calculating the E-value for both the observed association estimate (after adjustments for measured confounders) and the limit of the confidence interval closest to the null. If this were to become standard practice, the ability of the scientific community to assess evidence from observational studies would improve considerably, and ultimately, science would be strengthened.

DOI: http://dx.doi.org/10.7326/M16-2607.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28693043.
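On the risk-ratio scale, the E-value VanderWeele and Ding propose has a simple closed form, E = RR + sqrt(RR × (RR − 1)), with protective estimates (RR < 1) inverted first and an E-value of 1 whenever the confidence interval crosses the null. The sketch below uses illustrative numbers, not any figures from the article:

```python
from math import sqrt

def e_value(rr):
    """E-value for a risk ratio (VanderWeele & Ding, Ann Intern Med 2017).

    The minimum strength of association, on the risk-ratio scale, that an
    unmeasured confounder would need with both treatment and outcome to
    fully explain away the observed association.
    """
    if rr < 1:  # protective effects: invert onto the >1 scale first
        rr = 1.0 / rr
    return rr + sqrt(rr * (rr - 1.0))

# Illustrative numbers only: observed RR = 2.0, 95% CI lower limit = 1.5.
print(round(e_value(2.0), 2))  # 3.41 -> E-value for the point estimate
print(round(e_value(1.5), 2))  # 2.37 -> E-value for the limit closest to the null
```

As the authors recommend, applying the same formula to the confidence limit closest to the null shows how much weaker a confounder would need to be to move the interval to include no effect.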

Bluhmki T, Bramlage P, Volk M, Kaltheuner M, Danne T, Rathmann W, Beyersmann J. Time-to-event methodology improved statistical evaluation in register-based health services research. J.Clin.Epidemiol. 2017 Feb;82:103-11. PMID: 27845180.

OBJECTIVES:
Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and aim to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies.
STUDY DESIGN AND SETTING:
For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone.
RESULTS:
Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches.
CONCLUSION:
Patient registers provide rich data sources for health services research. Analyses involve a trade-off between data availability, clinical plausibility, and statistical feasibility. The Cox proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information.
Copyright © 2016 Elsevier Inc. All rights reserved.

DOI: http://dx.doi.org/10.1016/j.jclinepi.2016.11.001.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=27845180.

Goodman SN, Schneeweiss S. Posing Causal Questions When Analyzing Observational Data-Reply. JAMA. 2017 Jul 11;318(2):201-2. PMID: 28697249.

[First paragraph]

We agree with Dr Bhupathiraju and colleagues that debates over analytic approach are often surrogates for debating which scientific questions are of greatest interest. That said, they take issue with a sentence in our Editorial that characterized the difference between the original NHS analyses1,2 and the 2008 reanalysis3 as due to a design bias rather than a difference in causal question. They question whether the differences observed might be due to effects of age and estrogen type rather than the distinctions we drew between analyses of new users vs current users.

DOI: http://dx.doi.org/10.1001/jama.2017.6235.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28697249.

Henry D, Tolan P, Gorman-Smith D, Schoeny M. Alternatives to Randomized Control Trial Designs for Community-Based Prevention Evaluation. Prev.Sci. 2017 Aug;18(6):671-80. PMID: 27600286.

Multiple factors may complicate evaluation of preventive interventions, particularly in situations where the randomized controlled trial (RCT) is impractical, culturally unacceptable, or ethically questionable, as can occur with community-based efforts focused on inner-city neighborhoods or rural American Indian/Alaska Native communities. This paper is based on the premise that all research designs, including RCTs, are constrained by the extent to which they can refute the counterfactual and meet the challenge of proving the absence of effects due to the intervention, that is, showing what is prevented. Yet these requirements also provide benchmarks for valuing alternatives to RCTs: designs that have shown the ability to estimate preventive effects and refute the counterfactual with limited bias while acting in congruence with community values about implementation. In this paper, we describe a number of such research designs with accompanying examples, including regression discontinuity, interrupted time series, and roll-out randomization designs. We also set forth procedures and practices that can enhance their utility. Alternative designs, when combined with such design strengths, can provide valid evaluations of community-based interventions as viable alternatives to the RCT.

DOI: http://dx.doi.org/10.1007/s11121-016-0706-8.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=27600286.
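To illustrate the regression discontinuity idea named in the abstract above, the sketch below simulates deterministic treatment assignment at a cutoff and recovers the effect as the jump in outcomes at that cutoff. Everything here is hypothetical simulated data, and the naive difference-in-means within a bandwidth stands in for the local-linear fits used in real analyses:

```python
import random

random.seed(0)

def simulate(n=20000, cutoff=0.0, effect=2.0):
    """Simulate an RD setting: units at or above the cutoff are treated."""
    data = []
    for _ in range(n):
        x = random.uniform(-1, 1)          # assignment score
        treated = x >= cutoff              # deterministic assignment rule
        # outcome depends smoothly on x, plus a jump of `effect` at the cutoff
        y = 1.5 * x + (effect if treated else 0.0) + random.gauss(0, 0.5)
        data.append((x, treated, y))
    return data

def rd_estimate(data, cutoff=0.0, bandwidth=0.1):
    """Naive RD estimate: mean outcome just above minus just below the cutoff."""
    above = [y for x, _t, y in data if cutoff <= x < cutoff + bandwidth]
    below = [y for x, _t, y in data if cutoff - bandwidth <= x < cutoff]
    return sum(above) / len(above) - sum(below) / len(below)

data = simulate()
effect_hat = rd_estimate(data)  # close to the true effect of 2.0
```

The estimate carries a small upward bias from the outcome's slope within the bandwidth, which is why applied work fits local regressions on each side of the cutoff rather than comparing raw means.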

Makady A, de Boer A, Hillege H, Klungel O, Goettsch W, (on behalf of GetReal Work Package 1). What Is Real-World Data? A Review of Definitions Based on Literature and Stakeholder Interviews. Value Health. 2017 Jul-Aug;20(7):858-65. PMID: 28712614.

BACKGROUND:
Despite increasing recognition of the value of real-world data (RWD), consensus on the definition of RWD is lacking.
OBJECTIVES:
To review definitions publicly available for RWD to shed light on similarities and differences between them.
METHODS:
A literature review and stakeholder interviews were used to compile data from eight groups of stakeholders. Data from documents and interviews were subjected to coding analysis. Definitions identified were classified into four categories: 1) data collected in a non-randomized controlled trial setting, 2) data collected in a non-interventional/non-controlled setting, 3) data collected in a non-experimental setting, and 4) others (i.e., data that do not fit into the other three categories). The frequency of definitions identified per category was recorded.
RESULTS:
Fifty-three documents and 20 interviews were assessed. Thirty-eight definitions were identified: 20 out of 38 definitions (53%) were category 1 definitions, 9 (24%) were category 2 definitions, 5 (13%) were category 3 definitions, and 4 (11%) were category 4 definitions. Differences were identified between, and within, definition categories. For example, opinions differed on the aspects of intervention with which non-interventional/non-controlled settings should abide. No definitions were provided in two interviews or identified in 33 documents.
CONCLUSIONS:
Most of the definitions defined RWD as data collected in a non-randomized controlled trial setting. A considerable number of definitions, however, diverged from this concept. Moreover, a significant number of authors and stakeholders did not have an official, institutional definition for RWD. Persisting variability in stakeholder definitions of RWD may lead to disparities among different stakeholders when discussing RWD use in decision making.
Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

DOI: http://dx.doi.org/10.1016/j.jval.2017.03.008.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28712614.

Visvanathan K, Levit LA, Raghavan D, Hudis CA, Wong S, Dueck A, Lyman GH. Untapped Potential of Observational Research to Inform Clinical Decision Making: American Society of Clinical Oncology Research Statement. J.Clin.Oncol. 2017 Jun 1;35(16):1845-54. PMID: 28358653.

ASCO believes that high-quality observational studies can advance evidence-based practice for cancer care and are complementary to randomized controlled trials (RCTs). Observational studies can generate hypotheses by evaluating novel exposures or biomarkers and by revealing patterns of care and relationships that might not otherwise be discovered. Researchers can then test these hypotheses in RCTs. Observational studies can also answer or inform questions that either have not been or cannot be answered by RCTs. In addition, observational studies can be used for postmarketing surveillance of new cancer treatments, particularly in vulnerable populations. The incorporation of observational research as part of clinical decision making is consistent with the position of many leading institutions. ASCO identified five overarching recommendations to enhance the role of observational research in clinical decision making: (1) improve the quality of electronic health data available for research, (2) improve interoperability and the exchange of electronic health information, (3) ensure the use of rigorous observational research methodologies, (4) promote transparent reporting of observational research studies, and (5) protect patient privacy.

DOI: http://dx.doi.org/10.1200/JCO.2017.72.6414.
PubMed: https://www.ncbi.nlm.nih.gov/pubmed/?term=28358653.