Browsing by Author "Kim, Mimi"
Now showing 1 - 7 of 7
Item Open Access
Conceptualizing trust in community-academic research partnerships using concept mapping approach: A multi-CTSA study. (Evaluation and program planning, 2018-02)
Dave, Gaurav; Frerichs, Leah; Jones, Jennifer; Kim, Mimi; Schaal, Jennifer; Vassar, Stefanie; Varma, Deepthi; Striley, Catherine; Ruktanonchai, Corrine; Black, Adina; Hankins, Jennifer; Lovelady, Nakita; Cene, Crystal; Green, Melissa; Young, Tiffany; Tiwari, Shristi; Cheney, Ann; Cottler, Linda; Sullivan, Greer; Brown, Arleen; Burke, Jessica; Corbie-Smith, Giselle
Objectives
Collaborations between communities, healthcare practices, and academic institutions are a strategy to address health disparities. Trust is critical to the development and maintenance of effective collaborations. The aim of this pilot study was to engage stakeholders in defining determinants of trust in community-academic research partnerships and to develop a framework for measuring trust.
Methods
The study was conducted by five collaborating National Institutes of Health Clinical and Translational Science Awardees. We used concept mapping to engage three stakeholder groups: community members, healthcare providers, and academicians. We conducted hierarchical cluster analysis to assess the determinants of trust in community-academic research partnerships.
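The abstract summarizes the analysis only at a high level. As a purely illustrative sketch (not the study's data or code), the Python snippet below shows how statement-sorting data from a concept mapping exercise can be turned into a co-occurrence matrix and grouped with hierarchical cluster analysis; the simulated sort matrix, number of piles, and five-cluster cut are all hypothetical.

```python
# Illustrative sketch only: hierarchical clustering of simulated concept-mapping sort data.
# The sort matrix, pile counts, and cluster cut below are hypothetical, not the study's data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_items, n_sorters = 125, 30                            # e.g., 125 consolidated statements
sorts = rng.integers(0, 8, size=(n_sorters, n_items))   # each sorter assigns items to piles

# Similarity = fraction of sorters who placed a pair of statements in the same pile.
similarity = np.zeros((n_items, n_items))
for piles in sorts:
    similarity += (piles[:, None] == piles[None, :])
similarity /= n_sorters

# Convert to a condensed distance vector (upper triangle) for SciPy.
distance = 1.0 - similarity
condensed = distance[np.triu_indices(n_items, k=1)]

# Agglomerative clustering, cut into a fixed number of clusters.
tree = linkage(condensed, method="average")
labels = fcluster(tree, t=5, criterion="maxclust")
print(np.bincount(labels)[1:])                          # statements per cluster
```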
Results
A total of 186 participants provided input, generating 2,172 items that were consolidated into 125 unique items. A five-cluster solution was defined: authentic, effective, and transparent communication; mutually respectful and reciprocal relationships; sustainability; committed partnerships; and communication, credibility, and methodology to anticipate and resolve problems.
Conclusion
Results from this study contribute to a growing empirical body of work aimed at better understanding and improving the underlying factors that contribute to building and sustaining trust in community-academic research partnerships.

Item Open Access
Correction to: The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. (Implementation science : IS, 2020-01)
Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

Following publication of the original article [1], the authors reported that an important acknowledgement had been mistakenly omitted from the 'Acknowledgements' section. The full acknowledgement is included in this Correction article.

Item Open Access
Correlates of Successful Aging in Racial and Ethnic Minority Women Age 80 Years and Older: Findings from the Women’s Health Initiative (The Journals of Gerontology Series A: Biological Sciences and Medical Sciences, 2016-03)
Cené, Crystal W; Dilworth-Anderson, Peggye; Leng, Iris; Garcia, Lorena; Benavente, Viola; Rosal, Milagros; Vaughan, Leslie; Coker, Laura H; Corbie-Smith, Giselle; Kim, Mimi; Bell, Christina L; Robinson, Jennifer G; Manson, JoAnn E; Cochrane, Barbara

Item Open Access
Design of the North Carolina Prostate Cancer Comparative Effectiveness and Survivorship Study (NC ProCESS). (Journal of comparative effectiveness research, 2015-01)
Chen, Ronald C; Carpenter, William R; Kim, Mimi; Hendrix, Laura H; Agans, Robert P; Meyer, Anne-Marie; Hoffmeyer, Anna; Reeve, Bryce B; Nielsen, Matthew E; Usinger, Deborah S; Strigo, Tara S; Jackman, Anne M; Anderson, Mary; Godley, Paul A

The North Carolina Prostate Cancer Comparative Effectiveness & Survivorship Study (NC ProCESS) was designed in collaboration with stakeholders to compare the effectiveness of different treatment options for localized prostate cancer. Using the Rapid Case Ascertainment system of the North Carolina Central Cancer Registry, 1,419 patients (57% of those eligible) with newly diagnosed localized prostate cancer were enrolled from January 2011 to June 2013, on average 5 weeks after diagnosis. All participants were enrolled prior to treatment, and this population-based cohort is sociodemographically diverse. Prospective follow-up continues to collect data on treatments received, disease control, survival, and patient-reported outcomes. This study highlights several important considerations regarding stakeholder involvement, study design, and generalizability for comparative effectiveness research in prostate cancer.

Item Open Access
Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. (Implementation science : IS, 2015-11)
Lewis, Cara C; Fischer, Sarah; Weiner, Bryan J; Stanick, Cameo; Kim, Mimi; Martinez, Ruben G
Background
High-quality measurement is critical to advancing knowledge in any field. New fields, such as implementation science, are often beset with measurement gaps and poor-quality instruments, a weakness that can be more easily addressed in light of systematic review findings. Although several reviews of quantitative instruments used in implementation science have been published, no studies have focused on instruments that measure implementation outcomes. Proctor and colleagues established a core set of implementation outcomes, including acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability (Adm Policy Ment Health Ment Health Serv Res 36:24-34, 2009). The Society for Implementation Research Collaboration (SIRC) Instrument Review Project employed an enhanced systematic review methodology (Implement Sci 2: 2015) to identify quantitative instruments measuring implementation outcomes relevant to mental or behavioral health settings.
Methods
Full details of the enhanced systematic review methodology are available (Implement Sci 2: 2015). To increase the feasibility of the review, and consistent with the scope of SIRC, only instruments that were applicable to mental or behavioral health were included. The review, synthesis, and evaluation included the following: (1) a search protocol for the literature review of constructs; (2) the literature review of instruments using Web of Science and PsycINFO; and (3) data extraction and instrument quality ratings to inform knowledge synthesis. Our evidence-based assessment rating criteria quantified fundamental psychometric properties as well as a crude measure of usability. Two independent raters applied the evidence-based assessment rating criteria to each instrument to generate a quality profile.
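The abstract does not include the raters' scores or analysis code. Purely as a hypothetical illustration, the sketch below shows how two independent raters' ordinal criterion scores for a single instrument might be checked for agreement and combined into a quality profile; the criterion labels, rating scale, and scores are invented for the example and are not the project's actual criteria.

```python
# Illustrative sketch only: combining two hypothetical raters' criterion scores for one
# instrument into a quality profile and checking their agreement. All values are invented.
import numpy as np
from sklearn.metrics import cohen_kappa_score

criteria = ["internal consistency", "convergent validity", "discriminant validity",
            "known-groups validity", "predictive validity", "responsiveness", "usability"]

# Hypothetical ordinal ratings (0 = no evidence ... 4 = excellent).
rater_a = np.array([3, 2, 1, 2, 0, 0, 3])
rater_b = np.array([3, 2, 2, 2, 0, 1, 3])

kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")  # inter-rater agreement
profile = np.rint((rater_a + rater_b) / 2).astype(int)         # simple consensus profile

print(f"weighted kappa = {kappa:.2f}")
for name, score in zip(criteria, profile):
    print(f"{name:>22}: {score}")
```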
Results
We identified 104 instruments across eight constructs: nearly half (n = 50) assessed acceptability, 19 assessed adoption, and every other implementation outcome had fewer than 10 instruments. Only one instrument demonstrated at least minimal evidence for psychometric strength on all six of the evidence-based assessment criteria. The majority of instruments had no information regarding responsiveness or predictive validity.
Conclusions
Implementation outcomes instrumentation is underdeveloped with respect to both the sheer number of available instruments and the psychometric quality of existing instruments. Until psychometric strength is established, the field will struggle to identify which implementation strategies work best, for which organizations, and under what conditions.

Item Open Access
Stakeholder Perspectives on Creating and Maintaining Trust in Community-Academic Research Partnerships. (Health education & behavior : the official publication of the Society for Public Health Education, 2017-02)
Frerichs, Leah; Kim, Mimi; Dave, Gaurav; Cheney, Ann; Hassmiller Lich, Kristen; Jones, Jennifer; Young, Tiffany L; Cene, Crystal W; Varma, Deepthi S; Schaal, Jennifer; Black, Adina; Striley, Catherine W; Vassar, Stefanie; Sullivan, Greer; Cottler, Linda B; Brown, Arleen; Burke, Jessica G; Corbie-Smith, Giselle

Community-academic research partnerships aim to build stakeholder trust in order to improve the reach and translation of health research, but there is limited empirical research regarding effective ways to build trust. This multisite study was launched to identify similarities and differences among stakeholders' perspectives on antecedents to trust in research partnerships. In 2013-2014, we conducted a mixed-methods concept mapping study with participants from three major stakeholder groups, who identified and rated the importance of different antecedents of trust on a 5-point Likert-type scale. Study participants were community members (n = 66), health care providers (n = 38), and academic researchers (n = 44). All stakeholder groups rated "authentic communication" and "reciprocal relationships" the highest in importance. Community members rated "communication/methodology to resolve problems" (M = 4.23, SD = 0.58) significantly higher than academic researchers (M = 3.87, SD = 0.67) and health care providers (M = 3.89, SD = 0.62; p < .01) and had different perspectives regarding the importance of issues related to "sustainability." The high value that all stakeholders placed on communication and relationships points to the importance of colearning processes that involve the exchange of knowledge and skills. The differences uncovered suggest specific areas where attention and skill building may be needed to improve trust within partnerships. More research on how partnerships can improve communication specific to problem solving and sustainability is merited.
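For orientation only, the Python sketch below illustrates the kind of between-group comparison reported in the abstract above: mean importance ratings on a 5-point Likert-type scale compared across the three stakeholder groups. The ratings are simulated to roughly match the reported group sizes and means; they are not the study's data, and the study's actual analytic procedure may have differed.

```python
# Illustrative sketch only: comparing mean importance ratings (1-5 Likert-type scale)
# across three stakeholder groups. The simulated ratings are not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
community = np.clip(rng.normal(4.2, 0.6, 66), 1, 5)   # n = 66 community members
providers = np.clip(rng.normal(3.9, 0.6, 38), 1, 5)   # n = 38 health care providers
academics = np.clip(rng.normal(3.9, 0.7, 44), 1, 5)   # n = 44 academic researchers

f_stat, p_overall = stats.f_oneway(community, providers, academics)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_overall:.4f}")

# Pairwise follow-up comparisons (Welch's t-test, unequal variances assumed).
for name, group in [("providers", providers), ("academics", academics)]:
    t, p = stats.ttest_ind(community, group, equal_var=False)
    print(f"community vs {name}: t = {t:.2f}, p = {p:.4f}")
```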
Item Open Access
The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. (Implementation science : IS, 2015-01)
Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

Background
Identification of psychometrically strong instruments for the field of implementation science is a high priority, underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project addresses these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct validity, and criterion validity). Although this work focuses on the implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized, online, open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes.
Methods
The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review; (2) identifying frameworks to organize and complete the review; (3) generating a search protocol for the literature review of constructs; (4) the literature review of specific instruments; (5) development of evidence-based assessment rating criteria; (6) data extraction and rating of instrument quality by a task force of implementation experts to inform knowledge synthesis; and (7) the creation of a website repository.
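Stage 7's website repository depicts graphical head-to-head comparisons of instruments' psychometric properties, as described in the Background above. The snippet below is a hypothetical sketch of that kind of output, pivoting per-criterion ratings into a one-row-per-instrument comparison table; the instrument names and scores are invented, and only the criterion labels (reliability, construct validity, criterion validity) come from the abstract.

```python
# Illustrative sketch only: a head-to-head comparison table of instrument quality
# profiles. Instrument names and scores are hypothetical.
import pandas as pd

ratings = pd.DataFrame(
    [
        {"instrument": "Instrument A", "criterion": "reliability",        "score": 3},
        {"instrument": "Instrument A", "criterion": "construct validity", "score": 2},
        {"instrument": "Instrument A", "criterion": "criterion validity", "score": 0},
        {"instrument": "Instrument B", "criterion": "reliability",        "score": 2},
        {"instrument": "Instrument B", "criterion": "construct validity", "score": 3},
        {"instrument": "Instrument B", "criterion": "criterion validity", "score": 1},
    ]
)

# Pivot to one row per instrument, one column per psychometric criterion.
comparison = ratings.pivot(index="instrument", columns="criterion", values="score")
comparison["total"] = comparison.sum(axis=1)
print(comparison.sort_values("total", ascending=False))
```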
Results
To date, this multifaceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (48 in total, including subconstructs) that are relevant to implementation science. Although numerous constructs have more than 20 available instruments, which might imply saturation, preliminary results suggest that few instruments stem from gold-standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied.
Conclusions
The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.