TY - JOUR
T1 - The practice of ‘doing’ evaluation: lessons learned from nine complex intervention trials in action
AU - Reynolds, Joanna
AU - DiLiberto, Deborah
AU - Mangham-Jefferies, Lindsay
AU - Ansah, Evelyn K.
AU - Lal, Sham
AU - Mbakilwa, Hilda
AU - Bruxvoort, Katia
AU - Webster, Jayne
AU - Vestergaard, Lasse S.
AU - Yeung, Shunmay
AU - Leslie, Toby
AU - Hutchinson, Eleanor
AU - Reyburn, Hugh
AU - Lalloo, David
AU - Schellenberg, David
AU - Cundill, Bonnie
AU - Staedke, Sarah
AU - Wiseman, Virginia
AU - Goodman, Catherine
AU - Chandler, Clare I.R.
PY - 2014/6/17
Y1 - 2014/6/17
AB - Background: There is increasing recognition among trialists of the challenges in understanding how particular ‘real-life’ contexts influence the delivery and receipt of complex health interventions. Evaluations of interventions to change health worker and/or patient behaviours in health service settings exemplify these challenges. When interpreting evaluation data, deviation from intended intervention implementation is accounted for through process evaluations of fidelity, reach, and intensity. However, no such systematic approach has been proposed to account for the way evaluation activities may deviate in practice from assumptions made when data are interpreted. Methods: A collective case study was conducted to explore experiences of undertaking evaluation activities in the real-life contexts of nine complex intervention trials seeking to improve appropriate diagnosis and treatment of malaria in varied health service settings. Multiple sources of data were used, including in-depth interviews with investigators, participant-observation of studies, and rounds of discussion and reflection. Results and discussion: From our experiences of the realities of conducting these evaluations, we identified six key ‘lessons learned’ about ways to become aware of and manage aspects of the fabric of trials involving the interface of researchers, fieldworkers, participants and data collection tools that may affect the intended production of data and interpretation of findings. These lessons included: foster a shared understanding across the study team of how individual practices contribute to the study goals; promote and facilitate within-team communications for ongoing reflection on the progress of the evaluation; establish processes for ongoing collaboration and dialogue between sub-study teams; recognise the importance of a field research coordinator in bridging everyday project management with scientific oversight; collect and review reflective field notes on the progress of the evaluation to aid interpretation of outcomes; and use these approaches to identify and reflect on possible overlaps between the evaluation and the intervention. Conclusion: The lessons we have drawn point to the principle of reflexivity that, we argue, needs to become part of standard practice in the conduct of evaluations of complex interventions, to promote more meaningful interpretations of the effects of an intervention and to better inform future implementation and decision-making.
KW - Behavioural interventions
KW - Complex interventions
KW - Evaluation
KW - Health service
KW - Low-income setting
KW - Reflection
KW - Trials
U2 - 10.1186/1748-5908-9-75
DO - 10.1186/1748-5908-9-75
M3 - Article
SN - 1748-5908
VL - 9
SP - 75
JO - Implementation Science
JF - Implementation Science
IS - 1
M1 - 75
ER -