Guest Blog – Kevin Harris, Southampton Solent University

The following is the first in an occasional series of blogs written by colleagues working with realist methods outside the University of Exeter. Mark and I are very grateful to Kevin Harris for his contribution. Kevin is a Senior Lecturer at the Faculty of Business, Sport and Enterprise, Southampton Solent University, and his post is a closer look at the evaluation of The Coaching Innovation Programme… over to you, Kevin.

Hi there,

My name is Kevin Harris and I am a senior lecturer and course leader in sport development and sport policy at Southampton Solent University.

When deciding to take on my PhD, I was keen to do something that kept me connected with the industry I used to work in (sport for social change) and created a closer bridge between academia and industry.

Around this time I had just created the Coaching Innovation Programme (CIP), which in essence mobilises student-led sport, physical activity and coaching projects for residents in the Southampton community to address community needs. For example, my students have been involved with researching the needs of a community and delivering their own projects – things like combining physical literacy with maths and science in the curriculum to promote learning, all the way through to delivering sports-based sessions to offer resources for homeless people. The students work with industry practitioners to address niche areas and respond to those needs.

The Coaching Innovation Programme now mobilises around 30-40 of these projects every year, so this is a massive contribution to the sport development and physical activity landscape. One of the things which required, and still requires, addressing is the set of unanswered questions surrounding evidence from the projects. The students could point to satisfying experiences for themselves, their participants and the practitioners, yet they would struggle to evidence the impact of their project and, even more importantly, how and why their project achieved its outcomes.

This is where I saw a fantastic opportunity to apply my PhD to the CIP and come up with a monitoring and evaluation framework which would enable the students and practitioners to make sense of what they learnt from their programme. This would also address some of the issues in my field surrounding practitioners engaging with monitoring and evaluation (M and E), which is relatively poorly carried out (Coalter, 2007). In essence I wanted to bring M and E practice closer to the practitioners and embed it more in their work.

So, then… Over the last two years I have spent a considerable amount of time reviewing approaches to evidence and M and E. This has been, and has felt like, a round-the-world trip in itself – there is an ocean of literature and approaches out there. Given the complexity of the interventions my students are implementing, it was no surprise that, like many of us, I found myself exploring the philosophical roots of critical realism and the emphasis on programme theory. This came after tinkering with other aspects of programme theory, such as logic models, whose operational logic failed to really capture what we were trying to do.

The realistic angle of programme theory and evaluation really started to take hold, as it fitted the nature of the interventions – producing multiple outcomes for different people in different contexts, with different mechanisms firing – and its emphasis on conceptual logic. This led me to produce two models focusing on the formation of programme theory and on monitoring and evaluation. The first takes students through the steps of developing their own candidate programme theory, borrowing the principles of Pawson and Tilley’s (1997) realist approach and combining them with other aspects of operational logic by anatomising the programme (Funnell and Rogers, 2011). By this I simply mean outlining and breaking down the programme strategy into its components, e.g. activities, inputs and outputs.

The first model is made up of three stages. Stage 1 maps the field and establishes the context: students carry out a situational analysis, looking at things such as the geography of the area, the needs of the participants and contact with stakeholders. This then informs stage 2, which enables the students, practitioners and additional stakeholders to anatomise their programme (a light-touch logic model) and establish the key outcomes and subsidiary theories which constitute their project.

These usually take the form of ‘if–then’ assumptions, which lead on to stage 3, where the students explain how and why those outcomes may come about. It is this stage, I suppose, that really captures the realistic lens of conjecturing context–mechanism–outcome (CMO) configurations, getting to the heart of explaining how, why and for whom the outcomes might work. By the end of these stages the students and practitioners have a robust and rigorously constructed candidate programme theory.

Of course, the next step is to test the theory through project delivery and M and E. This is where model 2 comes in, taking the students through six key stages of programme evaluation within a light-touch realistic approach. Part 1 consists of reconceptualising and refreshing the programme theory; part 2 consists of developing and framing evaluation questions within the realistic lens (e.g. what works, for whom, in what circumstances and why). These questions are constructed against particular CMO conjectures, but NOT all of them – as Pawson (2012) puts it, steady your fire! Parts 3 and 4 involve establishing methodological competency and agreeing the methods to answer the questions; part 5 covers data analysis and part 6 the reporting of data. These six parts have been produced using a participatory evaluation approach which has involved the students throughout, via cooperative enquiry and training/facilitation from myself (Greenwood and Levin, 2007; Fetterman, 2005). The aim was to train and facilitate the student practitioners to be able to carry out realistic evaluation techniques for their M and E. Thus, I am testing the model.

At this stage I have just (nearly) completed my first pilot of the model by working with six student projects. The workshops have been delivered and supported by action learning sets, engaging in discussion with the students and progressing their M and E. My aim is to reach MPhil transfer this summer by exploring the utility of the model(s) and the extent of students’ engagement and praxis in M and E.

Key Challenges:

 

Firstly, teaching and stimulating interest in the area of M and E is hard, especially for young practitioners! This is particularly so given the language used within realist evaluation, and in many respects students simply do not get it. The academic discourse in which it resides presents a challenge for unlocking its potential for people working on the ground. The nature of the projects themselves, and the time it takes to employ a realistic evaluation, has also been challenging for the students. For example, how do you uncover the generative mechanisms for change in nine-year-old children? All this is in addition to the many other priorities of university workload and life for the students I have been working with.

In addition, the conceptual obstacles that realistic evaluation presents are also a major challenge. Middle-range theory, demi-regularities, conjectures, mechanisms and theory riddle the literature, and this creates major obstacles for practitioners. I don’t think it is the different ‘way of seeing’ that realist approaches advocate (e.g. why things work) that causes the problem; it is more about the language and understanding how to identify mechanisms of change. When developing the model I initially wondered whether the conceptual nature of the realistic approach would be suitable for practitioners. My initial thought was that it should be, given that such an insightful method should not be constrained to academic discourses. Having posted this on the RAMESES mailing list, thankfully the guru herself, Gill Westhorp, stated that it is entirely appropriate for practitioners to be introduced to the approach. Why shouldn’t practitioners engage with such techniques?

The key, Gill said, is to communicate it in a way that does not confuse the language and that meets the contextual needs of practitioners and how they engage. This is something I have tried to follow. By far the hardest thing to grasp is the ever-elusive programme mechanism. Having attended the realist training workshop in Liverpool this March, I found that Justin Jagosh did a great job of explaining ways to identify a mechanism: the programme activities within our candidate theory provide resources and opportunities, and those resources and opportunities produce reasoning in the minds of the programme users. The key is identifying what those resources are, what those opportunities are, and how the participants might respond to them. These are the ingredients which then produce the mechanism for certain people (for whom).
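To make that concrete, here is a very rough sketch – written in Python purely for illustration, not as part of the CIP models, and with every name and value made up – of how one hypothetical conjecture could be jotted down using that resources–opportunities–reasoning breakdown:

    # Illustrative only: a made-up CMO conjecture written out as a small data
    # structure, using the resources / opportunities / reasoning breakdown
    # described above. Nothing here comes from a real CIP project.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CMOConjecture:
        context: str              # for whom, in what circumstances
        resources: List[str]      # what the programme activities provide
        opportunities: List[str]  # what those resources make possible
        reasoning: str            # how participants might respond (the mechanism firing)
        outcome: str              # the change the conjecture expects

    # Hypothetical example, loosely in the spirit of the sports-based sessions
    # mentioned earlier in this post
    example = CMOConjecture(
        context="adults attending a weekly community sports session",
        resources=["a regular coached session", "equipment and a safe venue"],
        opportunities=["structured social contact", "a routine to commit to"],
        reasoning="participants feel welcomed and begin to value the routine",
        outcome="sustained attendance and growing confidence",
    )

    print(f"If {', '.join(example.resources)} are offered to {example.context},")
    print(f"then {example.reasoning}, which may lead to {example.outcome}.")

Writing the ingredients out side by side like this has no analytical magic of its own, but it does force the ‘what resources, what opportunities, what reasoning, for whom’ questions to be answered one at a time.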

I am really keen to benchmark with people on this. I really value the realistic approach, yet promoting it in a simple and pragmatic way for students and practitioners is a key area for discussion.

Links to resources:

  1. The Coaching Innovation Programme  http://www.youtube.com/watch?v=La1vUuMNoMU
  2. Kevin Harris profile  http://www.solent.ac.uk/faculties/fbse/staff-profiles/harris-kevin.aspx

References:

 

Coalter, F. (2007). A Wider Social Role for Sport. Oxon: Routledge.

Funnell, S. and Rogers, P. (2011). Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. San Francisco, CA: Jossey-Bass.

Pawson, R. (2003). Nothing as Practical as a Good Theory. Evaluation, 9: 471.

Pawson, R. and Tilley, N. (1997). Realistic Evaluation. London: Sage.

Greenwood, D. J. and Levin, M. (2007). Introduction to Action Research: Social Research for Social Change (2nd edn). California: Sage.

 

 

 

 

 
