A process evaluation accompanying an attempted randomized controlled trial of an evidence service for health system policymakers

Bibliographic Details
Main Authors: Wilson, Michael G, Grimshaw, Jeremy M, Haynes, R Brian, Hanna, Steven E, Raina, Parminder, Gruen, Russell, Ouimet, Mathieu, Lavis, John N
Other Authors: Lee Kong Chian School of Medicine (LKCMedicine)
Format: Article
Language: English
Published: 2016
Subjects:
Online Access:https://hdl.handle.net/10356/81665
http://hdl.handle.net/10220/39560
Institution: Nanyang Technological University
Description
Summary:
Background: We developed an evidence service that draws inputs from Health Systems Evidence (HSE), a comprehensive database of research evidence about governance, financial and delivery arrangements within health systems and about implementation strategies relevant to health systems. Our goal was to evaluate whether, how and why a ‘full-serve’ evidence service increases the use of synthesized research evidence by policy analysts and advisors in the Ontario Ministry of Health and Long-Term Care as compared to a ‘self-serve’ evidence service.

Methods: We attempted to conduct a two-arm, 10-month randomized controlled trial (RCT), along with a follow-up qualitative process evaluation, but we terminated the RCT when we failed to reach our recruitment target. For the qualitative process evaluation, we modified the original interview guide to explore (1) the factors influencing participation in the trial; (2) usage of HSE, factors explaining usage patterns, and strategies to increase usage; (3) participation in training workshops and use of other supports; and (4) views about and experiences with key HSE features.

Results: We terminated the RCT given our 15% recruitment rate. Those who had agreed to participate in the trial identified six factors that encouraged their participation: relevance of the study to their own work; familiarity with the researchers; a personal view of the importance of using research evidence in policymaking; academic background; support from supervisors; and participation of colleagues. Most reported that they never, infrequently or inconsistently used HSE and suggested strategies to increase its use, including regular email reminders and employee training. However, only two participants indicated that employee training, in the form of a workshop about finding and using research evidence, had influenced their use of HSE. Most participants found HSE features intuitive and helpful, although registration/sign-in and some page formats (particularly the advanced search page and the detailed search results page) discouraged their use or did not optimize the user experience.

Conclusions: The qualitative findings informed a re-design of HSE, which allows users to more efficiently find and use research evidence about how to strengthen or reform health systems and about how to get cost-effective programs, services and drugs to those who need them. Our experience with RCT recruitment suggests the need to consider changing the unit of allocation to divisions instead of individuals within divisions, among other lessons.