Deakin University

File(s) under permanent embargo

Applying systems theory to the evaluation of a whole school approach to violence prevention

journal contribution
posted on 2016-12-05, 00:00 authored by S Kearney, L Leung, A Joyce, Debbie OllisDebbie Ollis, C Green
Issue addressed: Our Watch led a complex 12-month evaluation of a whole school approach to Respectful Relationships Education (RRE) implemented in 19 schools. RRE is an emerging field aimed at preventing gender-based violence. This paper illustrates how, from an implementation science perspective, the evaluation was a critical element in the change process at both school and policy levels.
Methods: Using several conceptual approaches from systems science, the evaluation sought to examine how the multiple systems layers – student, teacher, school, community and government – interacted and influenced each other. A distinguishing feature of the evaluation was the use of ‘feedback loops’; that is, evaluation data was provided to participants as it became available. Evaluation tools included a combination of standardised surveys (with pre- and post-intervention data provided to schools via individualised reports), reflection tools, regular reflection interviews and summative focus groups.
Results: Data was shared during implementation with project staff, department staff and schools to support continuous improvement across these multiple systems levels. In complex settings, implementation can vary according to context, and the impact of the evaluation processes, tools and findings differed across the schools. Interviews and focus groups conducted at the end of the project illustrated which of these methods were instrumental in motivating change and engaging stakeholders at both school and departmental levels, and why.
Conclusion: The evaluation methods were a critical component of the pilot’s approach, helping to shape implementation through data feedback loops and reflective practice for ongoing, responsive and continuous improvement. Future health promotion research on complex interventions needs to examine how the evaluation itself influences implementation.
So what? The pilot demonstrated that the evaluation, including feedback loops to inform project activity, was an asset to implementation. This has implications for other health promotion activities, where evaluation tools could be used to enhance, rather than simply measure, an intervention. The findings are relevant to a range of health promotion research activities because they demonstrate the importance of meta-evaluation techniques that seek to understand how the evaluation itself influences implementation and outcomes.

History

Journal

Health Promotion Journal of Australia

Volume

27

Issue

3

Pagination

230 - 235

Publisher

CSIRO Publishing

Location

Melbourne, Vic.

ISSN

1036-1073

Language

eng

Publication classification

C Journal article; C1.1 Refereed article in a scholarly journal

Copyright notice

2016, CSIRO Publishing