Santa Monica, CA, United States

Ovretveit J.C., Karolinska Institutet | Shekelle P.G., Southern California Evidence-based Practice Center | Shekelle P.G., RAND Corporation | Dy S.M., Johns Hopkins University | And 7 more authors.
BMJ Quality and Safety | Year: 2011

Background: Logic and experience suggest that it is easier in some situations than in others to change behaviour and organisation to improve patient safety. Knowing which 'context factors' help and hinder implementation of different changes would help implementers, as well as managers, policy makers, regulators and purchasers of healthcare. It could help in judging the likely success of possible improvements under the conditions at hand, and in deciding which of those conditions could be modified to make implementation more effective. Methods: The study presented in this paper examined research to discover any evidence reported about whether or how context factors influence the effectiveness of five patient safety interventions. Results: The review found that, for these five diverse interventions, there was little strong evidence of the influence of different context factors. However, the research was not designed to investigate context influence. Conclusions: The paper suggests that significant gaps in research exist and makes proposals for future research to better inform decision-making.


Moher D., Ottawa Hospital Research Institute | Moher D., University of Ottawa | Shamseer L., Ottawa Hospital Research Institute | Shamseer L., University of Ottawa | And 32 more authors.
Systematic Reviews | Year: 2015

Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium. © 2015 Moher et al.
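
As a toy illustration of how a reporting checklist like PRISMA-P might be applied mechanically when screening a protocol draft (the sketch is not part of PRISMA-P itself, and the dictionary structure and the small item subset shown are assumptions for demonstration, not the full 17-item checklist), a minimal Python sketch could flag checklist items that a draft protocol has not yet reported:

    # Illustrative sketch only: a handful of PRISMA-P-style checklist items mapped
    # to whether a hypothetical draft protocol reports them. The published checklist
    # has 17 items; this subset and the True/False values are made up for demonstration.
    draft_protocol_items = {
        "Rationale": True,
        "Objectives": True,
        "Eligibility criteria": False,
        "Information sources": True,
        "Risk of bias in individual studies": False,
    }

    # Collect and report any items the draft does not yet cover.
    missing = [item for item, reported in draft_protocol_items.items() if not reported]
    if missing:
        print("Checklist items still to be reported:", "; ".join(missing))
    else:
        print("All tracked checklist items are reported.")

A funder or editor would of course apply the published checklist in full and with judgement; the point here is only that a checklist maps naturally onto a simple item-to-status structure.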


Shamseer L., Ottawa Hospital Research Institute | Moher D., Ottawa Hospital Research Institute | Clarke M., Queen's University Belfast | Ghersi D., National Health and Medical Research Council | And 4 more authors.
BMJ (Online) | Year: 2015

Protocols of systematic reviews and meta-analyses allow for planning and documentation of review methods, act as a guard against arbitrary decision making during review conduct, enable readers to assess for the presence of selective reporting against completed reviews and, when made publicly available, reduce duplication of efforts and potentially prompt collaboration. Evidence documenting the existence of selective reporting and excessive duplication of reviews on the same or similar topics is accumulating, and many calls have been made in support of the documentation and public availability of review protocols. Several efforts have emerged in recent years to rectify these problems, including the development of an international register for prospective reviews (PROSPERO) and the launch of the first open access journal dedicated to the exclusive publication of systematic review products, including protocols (BioMed Central's Systematic Reviews). Furthering these efforts and building on the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) guidelines, an international group of experts has created a guideline to improve the transparency, accuracy, completeness, and frequency of documented systematic review and meta-analysis protocols: PRISMA-P (for protocols) 2015. The PRISMA-P checklist contains 17 items considered to be essential and minimum components of a systematic review or meta-analysis protocol. This PRISMA-P 2015 Explanation and Elaboration paper provides readers with a full understanding of, and evidence about, the necessity of each item, as well as a model example from an existing published protocol. This paper should be read together with the PRISMA-P 2015 statement. Systematic review authors and assessors are strongly encouraged to make use of PRISMA-P when drafting and appraising review protocols.


Robinson K.A., Johns Hopkins University | Whitlock E.P., Kaiser Permanente | O'Neil M.E., Scientific Resource Center for the Effective Health Care Program | Anderson J.K., Scientific Resource Center for the Effective Health Care Program | And 8 more authors.
Systematic Reviews | Year: 2014

Background: An exponential increase in the number of systematic reviews published, together with constrained resources for new reviews, means that there is an urgent need for guidance on explicitly and transparently integrating existing reviews into new systematic reviews. The objectives of this paper are: 1) to identify areas where existing guidance may be adopted or adapted, and 2) to suggest areas for future guidance development. Methods: We searched documents and websites from healthcare-focused systematic review organizations to identify and, where available, to summarize relevant guidance on the use of existing systematic reviews. We conducted informational interviews with members of Evidence-based Practice Centers (EPCs) to gather experiences in integrating existing systematic reviews, including common issues and challenges, as well as potential solutions. Results: There was consensus among systematic review organizations and the EPCs about some aspects of incorporating existing systematic reviews into new reviews. Current guidance may be used in assessing the relevance of prior reviews and in scanning references of prior reviews to identify studies for a new review. However, areas of challenge remain. Areas in need of guidance include how to synthesize, grade the strength of, and present bodies of evidence composed of primary studies and existing systematic reviews. For instance, empirical evidence is needed regarding how to quality-check data abstraction and when and how to use study-level risk of bias assessments from prior reviews. Conclusions: There remain areas of uncertainty about how to integrate existing systematic reviews into new reviews. Methods research and consensus processes among systematic review organizations are needed to develop guidance to address these challenges. © 2014 Robinson et al.; licensee BioMed Central Ltd.


Chung M., Institute for Clinical Research and Health Policy Studies | Newberry S.J., Southern California Evidence-based Practice Center | Ansari M.T., Ottawa Hospital Research Institute | Yu W.W., Institute for Clinical Research and Health Policy Studies | And 9 more authors.
Journal of Clinical Epidemiology | Year: 2012

Objective: Apply and compare two methods that identify signals for the need to update systematic reviews, using three Evidence-based Practice Center reports on omega-3 fatty acids as test cases. Study Design and Setting: We applied the RAND method, which uses domain (subject matter) expert guidance, and a modified Ottawa method, which uses quantitative and qualitative signals. For both methods, we conducted focused electronic literature searches of recent studies using the key terms from the original reports. We assessed the agreement between the methods and qualitatively assessed the merits of each system. Results: Agreement between the two methods was "substantial" or better (kappa > 0.62) in three of the four systematic reviews. Overall agreement between the methods was "substantial" (kappa = 0.64, 95% confidence interval [CI] 0.45-0.83). Conclusion: The RAND and modified Ottawa methods appear to provide similar signals for the possible need to update systematic reviews in this pilot study. Future evaluation with a broader range of clinical topics and eventual comparisons between signals to update reports and the results of full evidence review updates will be needed. We propose a hybrid approach combining the best features of both methods, which should allow efficient review and assessment of the need to update. © 2012 Elsevier Inc. All rights reserved.
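
The agreement figures above are Cohen's kappa statistics. As a rough, self-contained illustration of how such a statistic can be computed (the function, the binary "update needed" data, and the simple large-sample confidence-interval approximation below are all assumptions for demonstration and are not taken from the Chung et al. study), a minimal Python sketch might look like this:

    # Minimal sketch: Cohen's kappa for two methods' binary "update needed" signals,
    # with a simple large-sample 95% confidence interval (Cohen's 1960 approximation).
    from collections import Counter
    import math

    def cohen_kappa(ratings_a, ratings_b):
        n = len(ratings_a)
        categories = set(ratings_a) | set(ratings_b)

        # Observed agreement: proportion of items on which the two methods agree.
        p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

        # Expected chance agreement from each method's marginal frequencies.
        freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
        p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

        kappa = (p_o - p_e) / (1 - p_e)
        se = math.sqrt(p_o * (1 - p_o) / (n * (1 - p_e) ** 2))
        return kappa, (kappa - 1.96 * se, kappa + 1.96 * se)

    # Hypothetical signals: 1 = "update needed", 0 = "no update needed".
    rand_signal = [1, 1, 0, 0, 1, 0, 1, 0, 1, 1]
    ottawa_signal = [1, 1, 0, 1, 1, 0, 1, 0, 0, 1]
    k, (lo, hi) = cohen_kappa(rand_signal, ottawa_signal)
    print(f"kappa = {k:.2f}, 95% CI {lo:.2f} to {hi:.2f}")

Published analyses typically use a more exact variance formula or a statistical package rather than this approximation, but the structure of the calculation is the same.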
