Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria.

dc.contributor.author

Lewis, Cara C

dc.contributor.author

Fischer, Sarah

dc.contributor.author

Weiner, Bryan J

dc.contributor.author

Stanick, Cameo

dc.contributor.author

Kim, Mimi

dc.contributor.author

Martinez, Ruben G

dc.date.accessioned

2024-04-01T14:56:40Z

dc.date.available

2024-04-01T14:56:40Z

dc.date.issued

2015-11

dc.description.abstract

Background

High-quality measurement is critical to advancing knowledge in any field. New fields, such as implementation science, are often beset with measurement gaps and poor-quality instruments, a weakness that can be more easily addressed in light of systematic review findings. Although several reviews of quantitative instruments used in implementation science have been published, no studies have focused on instruments that measure implementation outcomes. Proctor and colleagues established a core set of implementation outcomes: acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability (Adm Policy Ment Health Ment Health Serv Res 36:24-34, 2009). The Society for Implementation Research Collaboration (SIRC) Instrument Review Project employed an enhanced systematic review methodology (Implement Sci 2: 2015) to identify quantitative instruments of implementation outcomes relevant to mental or behavioral health settings.

Methods

Full details of the enhanced systematic review methodology are available (Implement Sci 2: 2015). To increase the feasibility of the review, and consistent with the scope of SIRC, only instruments that were applicable to mental or behavioral health were included. The review, synthesis, and evaluation included the following: (1) a search protocol for the literature review of constructs; (2) the literature review of instruments using Web of Science and PsycINFO; and (3) data extraction and instrument quality ratings to inform knowledge synthesis. Our evidence-based assessment rating criteria quantified fundamental psychometric properties as well as a crude measure of usability. Two independent raters applied the evidence-based assessment rating criteria to each instrument to generate a quality profile.

Results

We identified 104 instruments across eight constructs, with nearly half (n = 50) assessing acceptability and 19 identified for adoption; all other implementation outcomes yielded fewer than 10 instruments each. Only one instrument demonstrated at least minimal evidence for psychometric strength on all six of the evidence-based assessment criteria. The majority of instruments had no information regarding responsiveness or predictive validity.

Conclusions

Implementation outcomes instrumentation is underdeveloped with respect to both the sheer number of available instruments and the psychometric quality of existing instruments. Until psychometric strength is established, the field will struggle to identify which implementation strategies work best, for which organizations, and under what conditions.
dc.identifier

10.1186/s13012-015-0342-x

dc.identifier.issn

1748-5908

dc.identifier.uri

https://hdl.handle.net/10161/30423

dc.language

eng

dc.publisher

Springer Science and Business Media LLC

dc.relation.ispartof

Implementation science : IS

dc.relation.isversionof

10.1186/s13012-015-0342-x

dc.rights.uri

https://creativecommons.org/licenses/by-nc/4.0

dc.subject

Humans

dc.subject

Reproducibility of Results

dc.subject

Program Evaluation

dc.subject

Mental Health Services

dc.subject

Psychometrics

dc.subject

Diffusion of Innovation

dc.subject

Evidence-Based Practice

dc.title

Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria.

dc.type

Journal article

duke.contributor.orcid

Kim, Mimi|0000-0002-1352-9670|0000-0003-1100-1298|0000-0003-2381-3453

pubs.begin-page

155

pubs.issue

1

pubs.organisational-group

Duke

pubs.organisational-group

School of Medicine

pubs.organisational-group

Clinical Science Departments

pubs.organisational-group

Family Medicine and Community Health

pubs.organisational-group

Family Medicine and Community Health, Community Health

pubs.publication-status

Published

pubs.volume

10