General discussion

negative experiences of prison staff could be traced back to this design. For example, because of the fixed series of ten sessions, some participants ended up feeling that MCD was compulsory. As researchers, we sometimes struggled with (and still reflect on) the balance between an ideally responsive approach to implementing MCD and the conditions needed for a (quantitative) impact study.

'Life must be lived forward but can only be understood backward.' – Søren Kierkegaard (fragment from his diary, 1843)

For impact studies, constant adjustments during the process make measuring effects hard or even impossible, because they introduce too much variation into the intervention. In the long run, however, such adjustments might improve the actual impact of an intervention on practice. I therefore think that the true impact of an intervention cannot be achieved with a fully regulated design. Our search for a balance between the regulated and the responsive approach seems consistent with what Verkerk & Leerssen (2005, p. 80) argue: that you need to work both with and without a plan. You need to simultaneously 'hold on' and 'let go'. It is impossible to prevent all challenges when researching (the impact of) interventions. A research design needs to be able, to a certain degree, to let such changes happen.

Tensions in our role as researchers

Holding on while at the same time letting go: this challenge was also present in our role as researchers. For a quantitative impact study, you want the conditions of the sessions to be the same; hence, as researchers, we helped create such 'stable' conditions for MCD and then collected our data. We were therefore quite strict about the conditions and guided the organizations in how to implement MCD. After setting the conditions, you might need to 'let the process go' and simply measure its impact. However, adaptations were constantly needed to create the most stable conditions for MCD and for our research.
For example, the planning of MCD constantly had to be adjusted to better accommodate the teams. Moreover, following the responsive approach, we did not just collect and analyze data; we took an active role. We interacted with all stakeholders and tried to help foster improvements along the way. For example, through our position and role in the steering committees, we shared our knowledge of MCD and, when needed, advised on how to implement it further. Hence, as researchers, we also influenced the MCD practice at DCIA. While trying to measure the impact of MCD, we also tried to contribute to creating a positive impact of MCD. That is a challenging context for a quantitative impact study; however, we felt this was needed to do justice