Publications by authors named "Cameo F Stanick"

Article Synopsis
  • A study focused on improving community mental health services assessed the impact of a psychosocial intervention called MATCH when implemented with less structured support than in previous trials, testing if positive outcomes could still be achieved.
  • A total of 59 clinicians were trained to deliver MATCH to 166 young clients, using specified process management tools instead of extensive guidance from research teams.
  • Results showed that the youths' symptom improvement matched or exceeded outcomes from earlier research trials, indicating that structured tools can work effectively in real-world settings.

To rigorously measure the implementation of evidence-based interventions, implementation science requires measures that have evidence of reliability and validity across different contexts and populations. Measures that can detect change over time and impact on outcomes of interest are most useful to implementers. Moreover, measures that fit the practical needs of implementers could be used to guide implementation outside of the research context.
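
As a rough illustration of the reliability evidence described above, the sketch below computes Cronbach's alpha, a common internal-consistency statistic, for a small set of made-up survey responses; the item data and scale are hypothetical and are not drawn from any of the measures discussed here.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Internal-consistency reliability for a respondents-by-items matrix."""
        items = np.asarray(items, dtype=float)
        n_items = items.shape[1]
        sum_item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (n_items / (n_items - 1)) * (1 - sum_item_var / total_var)

    # Hypothetical 1-5 Likert responses from six clinicians to a four-item scale.
    responses = np.array([
        [4, 5, 4, 4],
        [2, 2, 3, 2],
        [5, 4, 5, 5],
        [3, 3, 3, 4],
        [4, 4, 5, 4],
        [1, 2, 2, 1],
    ])
    print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")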

Background: Organizational culture, organizational climate, and implementation climate are key organizational constructs that influence the implementation of evidence-based practices. However, there has been little systematic investigation of the availability of psychometrically strong measures that can be used to assess these constructs in behavioral health. This systematic review identified and assessed the psychometric properties of measures of organizational culture, organizational climate, implementation climate, and related subconstructs as defined by the Consolidated Framework for Implementation Research (CFIR) and Ehrhart and colleagues.

Following publication of the original article [1], the authors reported that an important acknowledgement had been mistakenly omitted from the 'Acknowledgements' section. The full acknowledgement is included in this Correction article.

The use of reliable, valid measures in implementation practice will remain limited without pragmatic measures. Previous research identified the need for pragmatic measures, though the identification of pragmatic characteristics relied only on expert opinion and literature review. Our team completed four studies to develop stakeholder-driven pragmatic rating criteria for implementation measures.

Existing measures of attitudes toward evidence-based practices (EBPs) assess attitudes toward manualized or research-based treatments. Providers of youth behavioral health (N = 282) completed the Valued Practices Inventory (VPI), a new measure of provider attitudes toward specific practices for youth that avoids mention of EBPs by listing specific therapies, some of which are drawn from EBPs (e.g.

There is strong enthusiasm for using implementation science in the implementation of evidence-based programs in children's community mental health, but work remains to improve the process. Despite the proliferation of implementation frameworks, the literature provides few case examples of overcoming implementation barriers. This article examines whether the use of three implementation strategies (a structured training and coaching program, professional development portfolios for coaching, and a progress monitoring data system) helps to overcome barriers to implementation by facilitating four implementation drivers at a community mental health agency.

Context: Implementation science measures are rarely used by stakeholders to inform and enhance clinical program change. Little is known about what makes implementation measures pragmatic (i.e.

Background: Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct.

Numerous trials demonstrate that monitoring client progress and using feedback for clinical decision-making enhance treatment outcomes, but available data suggest that these practices are rare in clinical settings and that no psychometrically validated measures exist for assessing attitudinal barriers to them. This national survey of 504 clinicians collected data on attitudes toward and use of monitoring and feedback. Two new measures were developed and subjected to factor analysis: the Monitoring and Feedback Attitudes Scale (MFA), measuring general attitudes toward monitoring and feedback, and the Attitudes toward Standardized Assessment Scales-Monitoring and Feedback (ASA-MF), measuring attitudes toward standardized progress tools.
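
As a minimal sketch of the kind of factor analysis referred to here (not the published analysis), the following simulates Likert-type item responses with two underlying factors and fits a two-factor model; the data, factor structure, and variable names are assumptions made for illustration.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)

    # Simulate 200 clinicians answering 8 attitude items on a 1-5 scale, where
    # items 1-4 and items 5-8 are driven by two separate latent factors.
    latent = rng.normal(size=(200, 2))
    true_loadings = np.array([[1.0, 0.0]] * 4 + [[0.0, 1.0]] * 4)
    raw = latent @ true_loadings.T + rng.normal(scale=0.5, size=(200, 8))
    items = np.clip(np.round(raw + 3), 1, 5)

    # Fit a two-factor model and inspect which items load on which factor.
    fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
    print(np.round(fa.components_.T, 2))  # rows = items, columns = factors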

Background: Identification of psychometrically strong instruments for the field of implementation science is a high priority, underscored at a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project addresses these limitations by applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopting a systematic search process (using standard search strings), and engaging an international team of experts to assess the full range of psychometric criteria (reliability, construct validity, and criterion validity).
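
For a concrete, hypothetical example of the criterion-validity evidence such a review looks for, the sketch below correlates a made-up instrument's total scores with an outcome it would be expected to predict; the instrument, criterion, and data are all invented for illustration.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)

    # Hypothetical total scores on an implementation-climate instrument for 50
    # clinics, plus a criterion: the share of clients who received an EBP.
    instrument_score = rng.normal(loc=30, scale=5, size=50)
    ebp_reach = 0.02 * instrument_score + rng.normal(scale=0.1, size=50)

    # Criterion validity is commonly summarized as the correlation between the
    # instrument and the outcome it is meant to predict.
    r, p = pearsonr(instrument_score, ebp_reach)
    print(f"criterion validity: r = {r:.2f}, p = {p:.3f}")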
