To rigorously measure the implementation of evidence-based interventions, implementation science requires measures that have evidence of reliability and validity across different contexts and populations. Measures that can detect change over time and impact on outcomes of interest are most useful to implementers. Moreover, measures that fit the practical needs of implementers could be used to guide implementation outside of the research context.
Background: Organizational culture, organizational climate, and implementation climate are key organizational constructs that influence the implementation of evidence-based practices. However, there has been little systematic investigation of the availability of psychometrically strong measures that can be used to assess these constructs in behavioral health. This systematic review identified and assessed the psychometric properties of measures of organizational culture, organizational climate, implementation climate, and related subconstructs as defined by the Consolidated Framework for Implementation Research (CFIR) and Ehrhart and colleagues.
Following publication of the original article [1], the authors reported that an important acknowledgement was mistakenly omitted from the 'Acknowledgements' section. The full acknowledgement is included in this Correction article.
The use of reliable, valid measures in implementation practice will remain limited without pragmatic measures. Previous research identified the need for pragmatic measures, though the characteristics were identified using only expert opinion and literature review. Our team completed four studies to develop stakeholder-driven pragmatic rating criteria for implementation measures.
Existing measures of attitudes toward evidence-based practices (EBPs) assess attitudes toward manualized or research-based treatments. Youth behavioral health providers (N = 282) completed the Valued Practices Inventory (VPI), a new measure of provider attitudes toward specific practices for youth that avoids mention of EBPs by listing specific therapies, some of which are drawn from EBPs (e.g.
There is strong enthusiasm for applying implementation science to the implementation of evidence-based programs in children's community mental health, but there remains work to be done to improve the process. Despite the proliferation of implementation frameworks, there is limited literature providing case examples of overcoming implementation barriers. This article examines whether the use of three implementation strategies (a structured training and coaching program, professional development portfolios for coaching, and a progress monitoring data system) helps to overcome barriers to implementation by facilitating four implementation drivers at a community mental health agency.
Context: Implementation science measures are rarely used by stakeholders to inform and enhance clinical program change. Little is known about what makes implementation measures pragmatic (i.e.
Background: Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct.
Numerous trials demonstrate that monitoring client progress and using feedback for clinical decision-making enhance treatment outcomes, but available data suggest these practices are rare in clinical settings, and no psychometrically validated measures exist for assessing attitudinal barriers to these practices. This national survey of 504 clinicians collected data on attitudes toward, and use of, monitoring and feedback. Two new measures were developed and subjected to factor analysis: the Monitoring and Feedback Attitudes Scale (MFA), measuring general attitudes toward monitoring and feedback, and the Attitudes toward Standardized Assessment Scales-Monitoring and Feedback (ASA-MF), measuring attitudes toward standardized progress tools.
Background: Identification of psychometrically strong instruments for the field of implementation science is a high priority, underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project addresses these limitations by applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopting a systematic search process (using standard search strings), and engaging an international team of experts to assess the full range of psychometric criteria (reliability, construct validity, and criterion validity).