In defence of realist approaches… albeit a modest, middle-range, empirically-rich kind of defence*.

(This is not Apollo my cat.)

(I’d like to start this post by getting something off my chest. I love systematic reviews. I also love RCTs. I like stats (when someone reminds me how to understand them). And I appreciate a good forest plot as much as the next gal. I like being a researcher, and that I get paid to think about stuff and write about it. I’d probably do it for free if there wasn’t a multi-morbid cat in my life who demands I spend the equivalent of a modest All Saints spree at the vet every month instead. All this to say, what follows is not an incitement to rise up against any particular methodology… far from it, quite the opposite: there is more that unites us than divides us, and I seek to continue to bridge those divides.)

Earlier today, I went to lunch with esteemed colleagues and, during our discussions, responded to several points from people who are not overly fond of, or convinced by, realist approaches. I thought it would be helpful to do a blog post on this, as these questions come up recurrently, and these are the kinds of responses I give.

My starting point is that, as a realist who thinks that there is a mind-independent reality that is knowable, but only partially, I might know a lot about realist methods, and I might have some defences, but I am also only partially knowledgeable – there is never a point I will get to when I think “that’s it, I’m done, I know it all”. So what follows is my partial knowledge, ready for others to refute and refine, so that our collective knowledge can proceed. Each of these headings could easily have a blog post of its own, but seeing as I’m up to 250+ words already, I’ll be brief and maybe come back to them again at a later date.

Realist research promises so much, but does it deliver? / They don’t give us clear knowledge of what works and so are not helpful to decision makers.

In common with other methodologies, it depends on the quality of the actual research: from stakeholder engagement through design to data collection, writing and dissemination. What realism suggests is that there are ways of understanding how social programmes, policies or interventions work which are very useful to decision makers because, in one sense, they under-promise: they can help explain, in depth and with reasons, what is going on here, which in turn can help you think about what might happen there. Generalisable middle-range theory is the goal of realist research: portable theory, which can be moved to different circumstances and tried out again. In those attempts to disprove it, to find out where the theory doesn’t hold, we learn, refine the theory and try again. This is the scientific method.

Where I think realist research is different is that there is no claim of finality to the knowledge it produces: all theory is open to further development. Again, this is the scientific method… Brian Cox explaining something incredible about the Universe often starts by saying “Our best theory of this at the moment is…” This is good science, I just think realists are more upfront about it. And so when more research is needed, it is to refine what we already know, not to re-establish if what we already know is what we already know…. An example given in a lecture was of nearly 30 trials which happened after it had been established that audit and feedback tend to work in implementing behaviour change… our lecturer asked what the point of the 30 trials was, and went on to talk of how what they are doing now is comparing intervention with refined intervention to get to the nub of ‘what works’… I couldn’t agree more: this careful, conscious process of starting with what mostly works and refining it is realist research. See here for an interesting essay on the topic: The Realist Foundations of Evidence-Based Medicine: A Review Essay

(Following a useful conversation on Twitter about this last night and this morning (@rjlhardwick), it seems that there are several other issues to take into account when asking whether or not realist research has achieved its promise and is useful for decision makers. (My first reflection on this, had I the time, would be to return to Ray’s original paper on realist synthesis, which started the ‘promise’ thing, and check out what he actually did promise, but I just don’t have time to do that right now… another post for a later date.) My next reaction was a face palm as I recalled last year’s International Conference on Realist Research and Evaluation was called “From Promise to Practice”, and I was on the organising committee and also ran a workshop with the fabulous Lisa Burrows! How could I forget?! At the conference, an explicit requirement to present was to address how the realist work had made a difference in the real world. I don’t think there was a conference report, but I will talk to Gill Westhorp and Emma Williams from Charles Darwin Uni to see if they are going to produce one.

Other points raised by colleagues were: it’s too soon to call; there hasn’t been research done on this yet (although there has been work looking at the growth and development of realist research – see here and here for two examples); and it depends on what we mean by ‘useful’ and ‘used’, how that would be measured, and against what it would be compared – recognising that knowledge mobilisation is a relational process more than a linear one, and that the processes of decision making, and how evidence influences them, are complex and emergent, changing over time and space. Nevertheless, this would make a really interesting project.)

Realist research is qualitative

No it isn’t. Realism is a methodology and all methods are in, from the highest to the lowliest: it is a broad church that takes all comers. Quant data tends to tell us where there are patterns of effectiveness, which leads us to ask ‘why’, for which we need more qual research to understand mechanisms and context. Next.

Realist researchers were (are) patronising and a bit arrogant

This was a really interesting point, and something I had previously wondered about, but being new to academia (circa 6 years), I didn’t have first-hand knowledge of it. It seems that when realist research was first talked of, those who were doing the talking were felt to be rather patronising and probably arrogant about what they were doing and (whether by inference or directly, I don’t know) derogatory about those who weren’t doing what they were doing.

My question in response was “what can I do about that? I’m interested in how we move the field of research methods and methodology forward… must I forever account for the sins of my forefathers?” But it is something to bear in mind, and it is, I reckon, one of the primary reasons why much more experienced and cleverer researchers than me reject realist approaches: the sense that the people who do realist research are on some level a bit up themselves or obnoxious. I could have responded by retorting about the giant in-group mentality of organisations like Cochrane or Campbell, but I would be pointing the finger back at ourselves: for whatever reason, when people get behind something, they really want an out-group to identify themselves as distinctive from. Same in life, same in politics, same in research. Sigh.

My hope is for a future where we recognise and mutually value the benefits of many different methodologies, and have moved beyond academic methodological arrogance. And to be part of that movement, I try to maintain a degree of humility in my words and an open mind when talking to other researchers who don’t share my philosophical underpinnings: I’m not so arrogant as to think that I have the whole truth, and I regard critical reflection on methods, methodology and my practice as an inherent part of being a good academic.

What is the use of two different realist reviews if they find two different answers to the same question? (based on an example given over lunch)

Ah, now I didn’t come up with a response to this at the time, but in the car coming home, I thought to myself that if this were the case, then you’d look at both to see how the arguments are put together and what the theory was, and seek to adjudicate between or refine the programme theories developed. All knowledge of generative mechanisms is partial, so it would not be surprising to find different results from different folks using the same, or indeed different, data: what you would have here is a golden opportunity to test the programme theory in each review against the other and, in doing so, understand better what is going on. This is my thinking on the matter, but I would love the opportunity to try it out.

Plus, to paraphrase Andrew Sayer, the number of times something happens has nothing to do with why that something is happening: the focus of realist research on explanation and causation leads the search to find out why something is happening, not merely to count the instances when it did. If two realist reviews identified the same mechanisms, then that would not be surprising, because as we know, mechanisms exist as powers and liabilities, forces, simple rules or reasoning and resources, whether or not we can ‘see’ them… we observe their effects, and if two reviews observed similar things, and came to similar conclusions about mechanisms, then hurrah. And if they didn’t, then by Jove, get that deerstalker and cape, my pipe and carriage, and the game’s afoot!

Some of them are crap.

Yes. True. So are some RCTs, systematic reviews, ethnographies and so on. And reporting guidelines/publication standards don’t necessarily prevent this, although they are a start. Next.

It’s a bandwagon.

Also yes, true, probably. I came to realist research when doing my public health MSc in 2010. It was not a bandwagon then and no one had really heard about it**. In fact, if you said you were doing realist research, people generally looked a bit baffled. The RAMESES project was just starting. Exeter Uni was one of the first in the country to undertake realist research. Our own Richard Byng was one of the first people to do a realist evaluation for his PhD. Etc.

Since then, interest in realist approaches has grown steadily: the NIHR commissions realist research now, and it seems everyone has at least heard of it, even if they’re not doing it. And a lot of places are doing it. And people are interested in it. The idea of it being a bandwagon sounds derogatory, as if realist research is in fashion now, but will soon be replaced by the new kid on the block. This could be true, but a correct understanding of realism, how it applies to methods, and in particular how Pawson and Tilley write and talk about it, makes me think that it won’t be: the foundations of the methodology are embedded in the writing and thinking of great social scientists whose work influences our work to this day (Sayer, Archer, Bhaskar, Campbell, Merton, and so on). And that bunch are still cited and relevant. So I don’t think it’s going to go out of fashion.

I also wonder why so many PhD students seem to choose it, and I have a few reflections on that. I wonder sometimes if it is because of how friendly and accessible the realist community is? The online JiscMail group for RAMESES is a haven of friendly and kind advice, sometimes from the authors of the RAMESES publication standards themselves! This accessibility is priceless I think, and gives realist approaches an egalitarian feel. And within the realist community, there are differences and debates about definitions, practices and so on: but these are entered into, in my experience so far, with an open mind and heart, recognising the partial knowledge we have and how that knowledge grows through disputation and disagreement. Plus, I don’t think I’ve met a realist that I didn’t like: they’re a fun and modest group in my experience. And finally, realist research is really hard work and makes you think; sometimes I feel like my brain is turning inside out. But I like that. Weird I know.

And so if it is a bandwagon, then I don’t really care. The more the merrier I say. And it’s more fun than walking! Climb aboard!

Nevertheless, I don’t think or claim realist approaches are perfect, I know the way I practise them is in need of refinement, and I don’t necessarily think that the answer to what works, for whom, in what circumstances and why can only be found through realist research: plenty of other methodologies have other ways of approaching these things which can be useful too (like, obvs), and which can be incorporated into a realist frame of reference… and often are (I’m thinking of Engager here). And I think there’s room for us as realists to continue to offer constructive, collegiate criticism of each other’s work, and in doing so to refine and improve our understanding, our methods and ultimately, in my field at least, the experiences of people running and using our healthcare services.

I doubt these answers changed the minds of my colleagues today, but I hope they offered a friendly, reasoned defence of the methodology. There really is no need to get hot under the collar when discussing methodology: we’re all in the business of trying to develop usable knowledge, and there are many ways of achieving that. So in closing, I’d just like to say that these are the kinds of responses I give when faced with these kinds of questions or comments: but I’d be interested to hear yours – so do get in touch: r.j.l.hardwick@exeter.ac.uk

(Some of these musings are based on earlier reflections from a post from 2016 following the London CARES conference, see points 3 and 7 in particular.)

* Ray’s description of the kind of realism he’s interested in. A Pawson Profile
** this is me shamefully making it clear that I have been around a bit, and therefore I am implying that what I think or say is credible…it is virtue signalling, and I’m sorry… not so humble or modest at that point eh? My apologies.

Once you become real, you can’t be unreal again *

Back in October, Lisa Burrows gave a great talk at the Realist Hive about her PhD, and the process she’s been following to do her research.

Lisa’s PhD is looking at how Dementia Cafes work (or rather, what is the effect of memory cafes as an intervention for people with dementia and their carers?). She talked about her personal motivations for looking at Dementia Cafes, her experience in running one and the need to develop an evidence base for dementia cafes. They are an increasingly popular community-based response for people living with dementia and their families and supporters, but little is understood about how they work, for whom, in what circumstances and why.

Lisa talked passionately about why she chose a realist approach (she’s doing a review and an evaluation for her PhD), saying that it resonated strongly with her own experience that nothing works the same in all places, at all times, for everyone, and that it is important to not only understand whether or not something is effective, but also why it is. She took us through her realist review in some detail, explaining the way she went from mapping the territory of dementia cafes, to developing programme theories and “if then” propositions.

It was a useful practice based session where we listened and asked questions, and then had a crack at doing our own programme theory surfacing from a short vignette she shared.

The slides from the presentation will be added below in due course, plus the vignette at the end.

Thanks to Lisa for presenting so clearly on her work and to everyone that came along.

Example for programme theories

*This is a quote from very near the start of The Velveteen Rabbit, which Lisa shared with us at the end of her talk. You can read more here… Velveteen Rabbit


We don’t talk about love / We only want to get drunk. Musings on the 2nd International CARES Conference.

I’m on the train coming back from CARES’s 2nd Annual Conference, and as this Great Western Railway train speeds through the countryside on its 2.5-hour journey back to the Shire, I have been reflecting on the conference, and wanted to share a few observations – for those who attended, and those who didn’t. The tracks are courtesy of my 19-year-old self.

(CARES, in case you don’t know, stands for the Centre for the Advancement of Realist Evaluation and Synthesis. Their website is here. The best thing about their conferences, and actually any conference, is when the ‘greats’ of the method turn up and join in.)

1. London is a very big place Mr Shadrack, a very big place. A man could lose himself in London, lose himself. Lose himself in London.

I lived in London between 1994 and 2000, the heyday years of Britpop, and in my time at Goldsmiths’ College and the few years afterwards, I prided myself on getting to know the city well. I took random buses to different places I’d read about or heard of just to see what they were like. Getting lost was something that country folk did, and I prided myself on my sense of direction. Nigh on twenty years later, we now have the mobile telephone and the google map. Which would make you think that exploring a different part of the city would be easier. I don’t know the streets around the Barbican very well at all, but as I had my trusty app, I thought I’d be ok. How wrong was I? I thought I was going north when I was going south. East when west. And so on. And it was only when I stopped using the map that I started to find my way. Conclusions may be drawn from that at your leisure…

2. It always starts with the mechanism
If you’ve been around the Scientific Realism community for a while, you’ll have come across what feels like an interminable debate about CMOCs. CMOC stands for ‘Context, Mechanism, Outcome configuration’, and it is a socially constructed heuristic to support researchers in the process of developing programme theories. There has been much debate about the utility and necessity of ‘specifying’ CMOCs, as well as ‘how’ to do them. This dates back to the mists of time, when I think Ray was trying to offer a useful tool to help researchers get a handle on ‘what is a programme theory’. The problem, in my view, is that specifying CMOCs has replaced theory building as the dominant activity in some realist research: it is unthinkingly applied and ends up being the focus of the work, rather than a useful device. During another discussion about mechanisms (what’s a mechanism? when is it a context?) Ray muttered to no one in particular ‘It always starts with the mechanism’, and having reflected on this myself, I think it’s fair to say he’s right. At one point, the conversation about mechanisms even talked about love as a mechanism…. not many conferences would do that, and I’m pleased this one did. Although, as it goes, we didn’t really want to talk about love, we only wanted to get drunk.
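
For anyone who has never seen one written down, it can help to see the heuristic as a structured record. Here is a minimal sketch in code, purely to show the shape of the thing; the configuration itself is my own hypothetical example, not one presented at the conference:

```python
from dataclasses import dataclass

@dataclass
class CMOC:
    """One Context-Mechanism-Outcome configuration: a conjecture that,
    in a given context, a mechanism fires and generates an outcome."""
    context: str    # the circumstances the programme lands in
    mechanism: str  # the resource offered and the reasoning/response it triggers
    outcome: str    # the pattern we would expect to observe if the conjecture holds

# A hypothetical configuration, for illustration only:
peer_support = CMOC(
    context="novice researchers with no local realist expertise",
    mechanism="an accessible mailing list offers advice (resource), so novices "
              "feel safe sharing half-formed theories (response)",
    outcome="programme theories get tested and refined rather than abandoned",
)
print(peer_support)
```

Used like that, the heuristic does its proper job: it forces you to state a testable conjecture, rather than becoming an end in itself.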

3. Is it really realist?
I witnessed the glee in a girl’s eye as she recounted to me how someone had presented their work to her and she retorted that it didn’t seem really realist. I smiled, inwardly thinking how mean! Whether or not something is really realist was talked about a lot. And to be honest, I find it frustrating; we need to get over this and just get on with doing good research. If you’re looking at mechanisms and their role in generative causation, it’s realist. End of. I feel continuing to try and exclude others and get one up by saying someone’s work is not really realist is boring. Call it crap research and encourage them to figure out how they could do it differently next time. But do it in a collegiate way. Let’s not turn on each other, as it’s off-putting to researchers new to realism and feeds the in/out group mentality (I also, personally, reflected that at least that person was trying…. who knows how hard they had slaved over it….? And if we take the idea of reality being mind-independent but only partially knowable, then really, none of us can have a handle on whether or not something is really realist…. does that make me a relativist realist?! Good grief… let’s move on….)

4. Standing on the shoulders of giants
If there is one hope I think I have for the future of my work, it is that I will continue to look back, read and learn so that I may proceed better informed. David Byrne and Ray Pawson added gravitas to the conference proceedings for me because of their encyclopedic knowledge of both social science and research methodology. I get the feeling that there are few remaining really tricky research methodology questions to square, if only we knew who had already squared them. Made me even more determined to read more, to understand the heritage of social science, to take the time to be familiar with these authors so that I can stand on their shoulders as I try to understand my research problem and my work. Made me think that the time I’ve spent reading Martyn Hammersley and understanding ethnography over the last few months has been worthwhile but is only the tip of the iceberg. Worlds to conquer, worlds to share, my friends.

5. We’re doing some really fab work in Exeter and Plymouth.
I felt so very very proud to be counted amongst the Exeter and Plymouth contingent. I was proud of the breadth of the work we’re engaged in, the rigour of our methods and approach, our general collegiate way of being with each other, and how darn friendly we were to other people too! We participated well in the discussion sessions, asking questions and making observations, we presented our work, we stimulated thinking and debate, we challenged and we hosted, and we played like we’d all just met up in the year 2000. I think we have so much to offer the research community, and the best way we can do it is to continue producing good quality research. GO US! (And we’ll be showcasing more excellent work from our Universities at the Hive over the coming year…)

6. Coffee
Surprisingly perhaps, the quality of the coffee at the Barbican was adequate. But me and a few others had a lot of fun finding other places for that morning flat white: such as Here and Here. Best was the menu in Fix, where you could have ‘milf’ in your drink. Nice.

7. I am/ am I a believer?

This observation is a bit left field. In the same way that there was a definite desire to be seen as being a believer in realism, there were a handful of sceptics there too this last week. And their healthy scepticism about the methods and methodology of realist approaches is, I think, needed more. In my work, it has been the criticism of my supervisors that has led to learning, not their fawning enthusiasm for my latest effort…. not that I know what that is like. But without the doubting Thomases, I think we are in danger of groupthink.

For my part, I reflected that at one point I was a zealot, impressed with the heritage of scientific realism, and proclaiming its worth to any and all who would listen. I have mellowed somewhat. I would say that I still hold with the basic realist philosophy of science – that there is a mind-independent reality, generative causation, ontological depth (there’s more than meets the eye), and retroductive reasoning. But that’s not because of some fanatical need to be a ‘realist’; it’s because for me that makes sense and is good science. I am part of the realist community because they can help me do my work better by understanding my methods and thinking. And I am healthily sceptical of any methodology, methodologist or method that claims to have the whole and only truth. As in all things, we need our doubters.

8. When willing, splitting the bill at the end of a meal is not difficult.
On the last night, Sarah, Richard, me and a host of others had dinner together at a lovely Mediterranean restaurant. Anyone want sharing plates? The food was great. The wine OK. There were sharing dishes and starters, and mains only; weirdly, no puddings. Or at least I didn’t get a chance to find out! And instead of the usual nightmare that is splitting the bill, everyone worked together and just put in an equal share, regardless of whether or not they had actually eaten that much. The lack of penny-pinching and nitpicking was utterly refreshing. As was the pint of Tribute that followed at the pub across the road!

9. The future of realist research …
In the final talk of the week, Gill Westhorp encouraged us to think much more broadly as to where our research can reach. She took us through the challenges facing humanity that research has a place in meeting, and the way that realist research needs to grow and reflect the changing nature of knowledge. And she finished up by saying that there will be a realist research conference hosted by her home university in Australia next year, where the emphasis will be on the impact of the work we have done, and engagement with research users. I doubt I will be able to go though, as it will likely coincide with submitting my thesis, which is a shame because there will be a pre-conference meeting looking at realist economic evaluation, something we have led the field on at Exeter. [Realism and Resources – Towards more explanatory evaluation]

10. …and the $79 weakling…
Ray gave a cracking talk on how drug effectiveness RCTs are the test of delicate theory building, and drew us again to reflect on a) how the wheel of science works – theory building, theory testing – and b) how developing a mechanisms library – a store of knowledge about how generative mechanisms across policy fields ‘work’, in what circumstances and why – would save everyone a lot of bother in doing realist research. Hear hear, I thought. And so watch this blog for the development of our own mechanism library… Presentations from the conference will be loaded to the conference website in due course, including Ray’s talk, and the link will be emailed round and added to this post when it is.

All in all it was a really good week, made some new connections, and reconnected with old ones. Drank far too much, smoked way too much and laughed a lot. And listened to some great music.

Squeaker spent most of the conference in my bag, and was still completely exhausted by the end of it.

What is real about operational research?

Back in June, Sean Manzi gave a cracking talk about the use of realism in operational research. It was a great session, stimulated good debate, and as promised, here are the slides from his talk, and a paper they presented at conference too!

Sean’s intro for the Hive session:
As someone who is relatively new to the world of operational research, I was confused by how little consideration is given to what underpins the discipline, compared with some other disciplines. Some might say “Well it is the scientific method of course”, and in a loose, question-avoiding way that is right, but my response would be “Which scientific method?”
In this Realist Hive session I will first briefly introduce the discipline of operational research. I will ask some difficult questions of the way operational research considers how it conducts research and why it is done that way. I will open the discussion out to everyone to ask ‘Is operational research a good example of the realist philosophy of science in action?’ We will likely need to go beyond this question to examine ‘Why should we consider philosophies of science at all? Do we need them?’ and ‘How do we decide just which philosophy should underpin our research?’
Operational research, due to its lack of a strong philosophical underpinning, makes a good space to examine these fundamental questions of research. My hope is that this will be a provocative discussion which enables people to reflect on their own practices and beliefs.
No prior understanding of operational research or realism is required to attend this talk! In fact I implore you to attend especially if you do not know much about these topics and even more so if you take issue with any of the questions proposed above.
This will be a lively and welcoming debate; I look forward to seeing you there.

What is real about operational research

Black boxes agency and simulation – revised

Guest Blog from Heather Ohly, PhD student, University of Central Lancashire. Support Networks for Novice Realist Researchers.

When I started my PhD in October 2014, I had barely heard of realism and was much more inclined to do a standard systematic review, having recently worked on two at the University of Exeter. So it was with some trepidation that I started to explore realist synthesis…only to discover that its core principles and approach did seem to suit my research project.

Here goes, I thought. My supervisors were enthusiastic and supportive, but none of them had ever used realist methodology. I felt like I was fumbling around in the dark for quite a while. Was this really a good idea? Can my brain think in this way?

Fortunately, I discovered that there are great support networks out there for novices like me. This blog post aims to encourage others to make the most of such networks and develop their own, to minimise and cope with the confusion that inevitably occurs when you try to learn how to do something that is at once creative, scientific, theory-driven and evidence-based.

I have benefited from three support networks so far in my journey:

RAMESES

A virtual network managed through an email distribution list that anyone can join and contribute to: https://www.jiscmail.ac.uk/RAMESES

Anyone can post a question, either about realist methodology or about their own project, and without fail you will get a thoughtful and considered response (often several) within 24 hours. This often leads to fairly high-level discussions between experts from around the world, and it is fascinating to (try to) follow these and benefit from their wisdom and experience. The diverse range of contributors and questions demonstrates the scope of the realist approach and how it is gaining momentum in different disciplines. It was reassuring for me to discover that so many other people are applying it for the first time and experiencing similar uncertainties.

Frequently, some kind person will take the trouble to explain something in detail for the benefit of others. One example that I found really useful was when Peter O’Halloran from Queen’s University Belfast outlined some of the key concepts in critical realism that have helped to shape the definitions of ‘context’ and ‘mechanism’ used in scientific realism (posted 10th March 2015). As someone without a social science background, this brief introduction to ‘social structure’ and ‘human agency’ enabled me to think about my project through a different lens. Peter also made the point that there may be “a relatively small number of mechanisms at work in relation to agency (because there is a commonality in human nature) but a myriad social structures in a given context”. This resonated with me and made the process of abstraction to middle range theory seem a little less daunting.

CARES

The University of Liverpool’s Centre for Advancement of Realist Evaluation and Synthesis organises regular realist workshops and an annual realist summer school. These events provide an opportunity to meet other realist researchers, from novice to expert, all of whom are keen to share their experiences and learn from each other. The events are reasonably priced and usually located in Liverpool or London.

When I attended the two-day realist workshop in March 2015, I was still very unsure about what I was doing with my realist synthesis and just absorbed as much information as possible. By the time I returned for the three-day realist summer school in June 2015, I had a much better understanding and felt confident enough to share my candidate programme theories. I received individual advice and feedback from three experts: Justin Jagosh, Geoff Wong and Sonia Dalkin, and came away feeling motivated and reassured.

At both events, the groups were well-attended and cross-disciplinary. I think we all found it interesting and challenging working through examples of CMO configurations from projects very different to our own. It helped me to see the logic of the realist approach shining through and also to appreciate its strengths and limitations compared to alternative research methods. A previous blog post by David Blane summarised his experience of the 2014 summer school and I would echo his sentiments about the ‘mechanisms’ of a supporting environment in which to learn – more on that later.

‘Realist club’

At the first CARES event I attended, I connected with two other researchers called Sue Mann (UCL) and Katie Shearn (Sheffield Hallam University) and we decided to stay in touch and share our realist journeys through monthly Skype chats. Our projects have some similarities and many differences, but we are all using realist methods for the first time. We take turns to discuss our own programme theories and CMO configurations, and my experience has been that I benefit just as much from discussing their projects as my own.

One thing we are all finding challenging is defining the scope and limitations of what we can do with the time and resources we have – this is very different for Katie and me as PhD students, compared to Sue, who is managing a large programme of work across several developing countries – each challenging in its own way. It helps to communicate the logic of what you have done and gain fresh perspectives from people who are detached from your project. Tomorrow we are discussing Katie’s project and she has sent us a fascinating collection of slides entitled ‘conceptualising context and time’.
A small group seems to work well because we all get regular individual attention! However, we have been thinking about ways to share our discussions more widely, such as recording our Skype chats and developing a Wiki resource.

Programme theory?!

Building on David Blane’s ideas about context, mechanisms and outcomes for the CARES summer school, I’d like to propose a candidate programme theory (yes, my arm was twisted to do this):

Realist methodology is flexible, iterative and evolving (context). This means it can be confusing, especially the first time it is used (context). Support networks provide opportunities for shared learning and promotion of best practice with regard to the core principles of realist methodology (mechanism: resource). This helps novice researchers to feel confident in their work and stay motivated (mechanism: response), which ultimately means they produce higher quality work and become better realist researchers (hopeful outcomes).

This is probably way too simplistic and I’m sure many people could articulate this better than me, but hey at least I’m confident to try now…so maybe confidence should have been the outcome…??

Heather Ohly, PhD student, University of Central Lancashire
Find me on Linked-In
Follow me on Twitter @heatherohly
Review registered on Prospero, Reference No. CRD42014015050. A realist review to explore how low-income pregnant women use food vouchers from the UK’s Healthy Start programme.

Gave Up

At this time of year, with the Holidays approaching, it is inevitable that thoughts turn to the year slowly trickling the last of its grains through our hands. To weigh up our successes and failures, to see how we scored. Was it a ‘good year’? Or was it one we’d much rather forget? Reviewing our Facebook timeline, do we see happy smiling faces? Summer parties at the beach? Holidays with the kids all smiling (for once)? Do we remember the great, the life-affirming, the life-changing? The conference presentation that hit the spot? Connecting with someone that made all the difference to a problematic project? Remembering all the comfort and solace of hearth and home?

Coming from a minor gothic persuasion, with an inclination for the morose, I find that more often than not, I turn towards and examine closely the failures of the year. The money misspent. The time misspent. The research grant application (rejected twice), the running I started in the summer that strangely didn’t translate into running in the winter. The many times I sat and stared at a blank document on a computer screen thinking ‘right’, but finding words scattered in my head like the slingshot of starlings in the darkening sky.

When faced with failure, what do you do? In my not-so-short-anymore life, I’ve tried to be better at failure. That is, to let it teach me something, anything, even if I know I will fail again. The cycle of failure, the brushing self down, the taking deep breath and the vowing not to fail again is one I think most are familiar with. And despite the best efforts to refocus, renew, re-aim and restart, it can, at times, be a crushingly tiring effort to see things fail and try again.

What gives me hope, though, is that as a realist I am scientifically interested in failure, of programme theory at least. Let me explain: if there’s one thing I think Ray Pawson has been keen on cultivating with or via the realist movement (is it a movement?), it is a deep abiding love of proving the mistakenness, or ‘failures’, of our programme theories – the cultivation of a disputatious community of truth seekers, who through constant judging and pickiness nudge, cajole and beg us to be better scientists. The failure of one programme theory is often because it has failed to take into account some morsel of useful evidence… so undermined, the failed theory becomes the seed-bed of incremental progression… it means that even if we lose, we win. My failure becomes my saving grace.

One of the promises of the scientific realist approach is that it reframes failure as a gate through which we should all be jostling to get through, and then queueing up to jostle through again. If no one disagrees with what I write, or offers an alternative explanation, or picks up a point and runs with it to show its failure, then how will knowledge proceed?

Trent Reznor, lead singer of Nine Inch Nails, a 1990s industrial goth band that I have a very fond affinity with, sang a line about how ‘it took you to make me realize/it took you to make me see the light’. His song was called ‘Gave Up’, because that was the consequence of finding out he’d failed. In common with much NIN work, Gave Up gives us Trent at a bitter point at the end of his rope. Seeing the light, seeing he’d got it wrong and that what he thought was the truth was not, was the mechanism that led to him giving up. Unlike Trent, I don’t (always) give up when I realise I got it wrong – my failure is (sometimes) transformational, because it leads me on to whatever is next. Which I hope will be better programme theory.

So this year, if my mind turns again to the failures and colossal mistakes I’ve made, I hope I will be able to see how, through these things, I am also given a second chance. And that maybe next year, I’ll get it right.
Happy Holidays.

[Disclaimer: In case my PhD supervisors are reading, I do realise that this won’t wash when it comes to the next meeting if I’ve failed to complete the tasks we discussed yesterday.]

Visiting lecture from Dr Gill Westhorp

Dear All,

The Institute of Health Research and the Realist Hive are hosting a two-day visit from one of the most experienced and globally respected practitioners of realist evaluation, Dr Gill Westhorp from Australia.  As part of her visit, Gill will give a lunchtime lecture this week:

‘Realist Evaluation: Practical Implications for Design and Methods’

Where: Veysey Lecture Theatre (Veysey Building 1st Floor, Salmon Pool Lane)

When: Thursday 9th October, 12.30 to 1.30pm.

 

Gill’s biography:

Gill is Director of Community Matters Pty Ltd, a small consultancy based in South Australia which specialises in the conduct and design of evaluations of complex services and policies, and related training/professional development, for a range of service sectors (mainly in education, health, community services, crime prevention and justice-related).  After a career in managing public services in South Australia, Gill gained a PhD in social research methods that was supervised by Prof Nick Tilley (of ‘Pawson and Tilley, 1997’) and along with Ray Pawson, Trish Greenhalgh and Geoff Wong, she is also one of the co-investigators of the influential ESRC-funded RAMESES project – to review practice and establish standards for the conduct and reporting of realist reviews and meta-narrative reviews.  More recently, she has made significant contributions to the methodological debate about how realist methods can be used in conjunction with complexity theory.  Prior to becoming a full-time evaluator, consultant and trainer Gill mainly worked as a manager of service delivery programs for young people, but her pre-consultancy roles have also included: Executive Officer of the Youth Affairs Council of South Australia; Training Development Executive of the Youth Sector Training Council of South Australia; Director of Yarrow Place Rape and Sexual Assault Service, and Manager of Early Intervention in the Crime Prevention Unit of the South Australian Attorney-General’s Department.

 

Apologies for the short notice.  Everyone welcome.  We hope you are able to attend and hear from the person who – with the exception of Ray Pawson – has applied and supported the development of realist approaches to evaluation more than anyone else.

Look forward to seeing you there,

Rob Anderson, Rebecca Hardwick, Mark Pearson

University of Liverpool Realist Methods Summer School – Guest Blog from David Blane, University of Glasgow

I know what you did last summer…..

Or rather, I know what you didn’t do.  (And thinking about what didn’t happen can be a useful feature of realist research, but more on that later!).  I know that you didn’t attend a realist methods summer school in Liverpool… because there wasn’t one.

Thankfully, however, there was this year and there will be next year.  So, for those budding realists that missed out, I’m going to share my thoughts on what it was about the summer school that worked (for whom, etc…), and hopefully convince you of the value of attending next year.

In true realist fashion, I’ll start with a C for Context.  The summer school took place in Liverpool, at another C – the recently formed Centre for Advancement of Realist Evaluation and Synthesis (CARES) at the University of Liverpool.  The venue was well suited to this sort of event, with a main meeting room and several smaller ‘breakout’ rooms, as well as a central courtyard area for lunch and tea breaks.

It’s hard to think of a more appropriate acronym for a research centre headed by Dr Justin Jagosh, a leading figure in the application of realist methodologies to health services research. Justin really does care – not just about the development of realist research generally, but also about the individual projects that were shared, warts and all, by the participants of the summer school.  His unwavering encouragement (not to mention his patience!) was valued by everyone and can definitely be considered a crucial enabling factor for a successful summer school.

The participants were another key Contextual feature of the success of the summer school.  There were about 20 researchers from across the UK and one each from Australia and Holland.  Most were PhD students, but there were also several teams of researchers, some with many years of research experience.  Most were working in health services research, though the field of international development was also represented, and there was a good mix of realist evaluation and synthesis projects.

What about the M for Mechanism?  As Prof Rumona Dickson, the Director of the Liverpool Reviews and Implementation Group (which CARES sits within), put it, “the idea is a simple one – if you bring a bunch of people together working on similar projects, with protected time and space to work, then good things will happen”.  So, the “program theory” or mechanism is something about having protected time in a supportive environment, i.e. without phones ringing and emails pinging.  Sounds good to me.  If you take the Pawson approach to mechanisms as being “resources and the response(s) to them”, the protected time could be considered one “resource”, but there are several possible others – the sharing of experiences by colleagues, and nuggets of wisdom from Justin, for example.

And there were plenty of nuggets.  On the first day, we had an overview of the logic and key ingredients of the realist approach, and were introduced to the concept of “retroduction” (an approach to scientific inquiry described as the “spark of creativity” associated with the realist researcher!).  On day 2, we worked through examples of realist reviews to explore CMO configurations in more detail.  The questions, “what is actually going on here?” and “why did this intervention not work, for these people, in this context?” were posed, as another way of thinking through potential mechanisms.  Days 3 and 4 provided more protected time and space to work on our projects, either individually or in groups, in rooms designated as either ‘quiet’ or ‘chatty’ (I prefer ‘collaborative’).  We were also presented with the idea, frequently reinforced by Justin, that it’s perfectly normal and acceptable to move from confusion to clarity (and back again!), throughout the process of realist research.  Realist research is an evolving methodology with no prescribed set of rules to follow.  For me, the maxim “one size doesn’t fit all” applies as much to the process of realist research itself as it does to the majority of complex interventions that the methodology has been used to evaluate.  This flexibility (and inherent uncertainty) can be at once both comforting and disconcerting, and takes some time to get used to (a variable “response” to these nuggets of “resource”, perhaps?).

Ultimately, the Cs and Ms that are included in any realist research project will be shaped by the O of Outcome(s).  Indeed, many realist researchers recommend starting with your Outcomes and working backwards, when hypothesizing potential Mechanisms and their enabling or constraining Contexts.  The desired Outcome for most of the summer school participants was simply to make progress with their realist projects.  The general consensus was that this had definitely been achieved: for some, there were “breakthrough” moments; while for others, they simply left a little more confident that they were working along the right lines, and a little more comfortable with the uncertainties of the process.  For me, as the proud (if a little exhausted) father to a six week-old daughter, three nights of undisturbed sleep was perhaps my greatest Outcome!

I’d like to finish by thanking my PhD supervisors for encouraging me to attend the summer school, my funder (the Scottish Government’s Chief Scientist Office) for enabling me to attend, and my wonderful wife Miriam for allowing me to attend.  I look forward to meeting up again with many of the summer school participants, along with other realist researchers, at the 1st International Conference on Realist Approaches to Evaluation and Synthesis, to be held at the same venue in Liverpool between 27th and 30th October 2014.  In the meantime, happy retroducing!

Literature searching for realist reviews

Hello and thank you for reading my blog post. My name is Simon Briscoe and I work as an information specialist alongside Mark and Becky at PenTAG. When I tell people that I’m an information specialist I tend to get a blank look. So I usually say either that I work in health research, or that I’m a librarian who works with databases. Put the two together and you get the gist of it, which is that I search databases for literature which is then used by health researchers to write reports.

This year I’ve been working on a realist review with Mark. The review is part of a project that aims to develop a collaborative care intervention for prisoners with mental health problems, near to and after release. This was my first realist review, so I spent some time familiarising myself with the methodology required for this type of work. In doing this, it became apparent that my role as information specialist would be different to other reviews that I’ve worked on. It was fun learning a new method, and Becky thought it would be useful for me to write a blog post to share my experiences.

Most of the time, the role of the information specialist in health research is well-defined. When a research team are put together to write a report, an experienced information specialist will have a clear idea of what’s required of them: there are guidelines detailing each part of the process, from identifying search terms and selecting databases to search, to recording the results of the searches. Most reports that I contribute to are systematic reviews and require a thorough appraisal of all (or almost all) the literature on a topic in order to reach an evidence-based answer to a question.

Realist reviews are not premised on the idea that the right answer can be reached by simply assessing all the available evidence. A bit like Benedict Cumberbatch’s Sherlock Holmes, realism refuses to settle for what first appears to be the case. Becky has written about this in her blog, and in particular, she’s highlighted a couple of points which I think have an impact on the role of the information specialist: “making sense of the evidence” and “finding ‘just enough’ evidence”.

Firstly, in this post, Becky writes:

“[Realism] is not just summarising the evidence, it is making sense of it, maybe within a bigger scheme, in a way that has potential to be more applicable to decision-makers” [my emphasis].

“Summarising the evidence” is sufficient to answer some questions, particularly, where the intervention is simple to administer and the effects can be easily measured. (For example, the use of aspirin for the reduction of stroke). But some interventions are complex and the results are harder to measure. This is where realist researchers argue we need to “make sense” of the evidence, which involves identifying principles (or mechanisms, in realist language) that lie behind and explain the effectiveness of an intervention.

A second key point is that rather than assessing all the available evidence, realism seeks to

 “…achieve theoretical saturation, so that we can be pretty confident that there is ‘no more’ important evidence to capture for the particular theory we are building or testing…” [my emphasis].

Realism does not seek to find all the evidence, but just enough (i.e. “theoretical saturation”) to answer the question. “Theoretical saturation” is reached when there is “no more important evidence”, which in the context of realism means no more mechanisms to uncover in the evidence.
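
To make “theoretical saturation” a little more concrete, here is a toy sketch of the stopping rule it implies. The batches and mechanism names are invented for illustration, and this is not a real part of our workflow:

```python
# Toy illustration of theoretical saturation as a stopping rule: keep
# reviewing batches of evidence until a batch yields no new mechanisms.
batches = [
    {"trust", "peer support"},    # mechanisms identified in batch 1 (invented)
    {"trust", "accountability"},  # batch 2 adds 'accountability'
    {"peer support", "trust"},    # batch 3 adds nothing new -> saturation
]

known: set[str] = set()
for i, mechanisms in enumerate(batches, start=1):
    new = mechanisms - known
    known |= new
    print(f"batch {i}: {len(new)} new mechanism(s)")
    if not new:
        print("no new important evidence - theoretical saturation reached")
        break
```

In practice, of course, the judgement of what counts as a ‘new mechanism’ is the hard part; the loop only shows where the searching stops.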

It was my experience that these two elementary principles of realist reviews affected my role as an information specialist in at least three ways:

First, an information specialist should be aware that relevant mechanisms are potentially identifiable in literature outside the scope of the review.  Mechanisms are to some extent transferable between different population groups and interventions, so it’s a good tip to broaden the scope of the search. For example, the population group for the realist review I worked on was prisoners near to and after release. However, we also searched for studies on social groups with similar vulnerabilities to the prison population (e.g. people who use illicit substances), which we thought might reveal mechanisms that apply to the prison population, too.

Secondly, because the aim is to achieve theoretical saturation rather than comprehensive coverage of evidence, an information specialist should focus on specificity rather than sensitivity. Specificity and sensitivity basically mean accuracy (mostly relevant records retrieved) and breadth of coverage (all the relevant records retrieved), respectively. Ordinarily, the information specialist balances both. But because mechanisms recur in different pieces of research, the researcher is likely to become familiar with all the mechanisms before they have exhausted all the literature.
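
In information retrieval terms, sensitivity corresponds to recall, and specificity (as used here) behaves like precision. A toy sketch, with invented numbers, of how that trade-off is measured:

```python
def search_performance(retrieved: set[str], relevant: set[str]) -> tuple[float, float]:
    """Return (precision, recall) for a search against a known set of relevant
    records. Precision ~ the 'specificity' of the search as used above;
    recall ~ its 'sensitivity'."""
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Invented numbers, for illustration only:
retrieved = {f"rec{i}" for i in range(200)}      # the search returns 200 records
relevant = {f"rec{i}" for i in range(150, 450)}  # 300 relevant records exist
precision, recall = search_performance(retrieved, relevant)
print(f"precision={precision:.2f}, recall={recall:.2f}")  # 0.25, 0.17
```

A highly sensitive systematic-review search pushes recall towards 1.0 at the cost of precision; a realist search can accept lower recall once saturation is reached, so it can afford to favour precision.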

In this respect, a realist review is more straightforward for the information specialist. But a considered approach is still needed: following point one (above), the research team may want to dip into several different areas of research, so it’s important to retain the focus on specificity to stop the amount of literature accumulating too much.

Continuing on this point, Becky has noted that “there will always be further evidence that would be brought to bear, to further build or refine our thinking…” This raises the issue of whether “when we ‘stop’ searching, are we making a judgement which is based more on external factors to the project (time, funding), rather than the internal factors (we’ve found it all)?” I’m not sure what the answer to this question is, except that the process of stopping will be a conversation between the information specialist and the rest of the research team: if the research team say ‘stop’, the information specialist can suggest reasons why it might be worth continuing. For example, there might be a different database that could be utilised, or search terms could be refined. Eventually (hopefully…) an agreement will be reached.

Thirdly, literature searching is likely to take place throughout the review process. Traditionally, an information specialist will aim to identify all the required literature using a single search strategy at the start of the review process. This ensures transparency and enables other researchers to reproduce the same results. It also prevents the research team from biasing the results by targeting pockets of evidence. By contrast, the evidence base for a realist review will develop incrementally as mechanisms are uncovered and links are made with other areas of research (see point 1). An implication of this is that researchers can decide at any point to run additional searches. As such, an information specialist should be prepared for a higher level of involvement than for a traditional systematic review.

 (It’s also important to note that the information specialist should still aim for transparency by recording the searches, as recommended in the RAMESES publication guidelines for realist reviews).

Ray Pawson, perhaps the guru of realist reviews, has written a little about literature searching here. Much of what I’ve written is loosely based on his work, so it’s worth looking at the section titled “Searching for relevant evidence” for further guidance.

Link to PenSR PowerPoint slides http://medicine.exeter.ac.uk/pentag/workstreams/pensr/

Guest Blog – Kevin Harris, Southampton Solent University

The following is the first in an occasional series of blogs written by colleagues working with realist methods outside the University of Exeter.  Mark and I are very grateful to Kevin Harris for his contribution.  Kevin is a Senior Lecturer at the Faculty of Business, Sport and Enterprise, Southampton Solent University, and his post is a closer look at the evaluation of The Coaching Innovation Programme…. over to you, Kevin.

Hi there,

My name is Kevin Harris and I am a senior lecturer and course leader in sport development and sport policy at Southampton Solent University.

When deciding to take my PhD I was keen to do something that kept me connected with the industry I used to work in (sport for social change) and create a closer bridge between academia and industry.

Around this time I had just created the Coaching Innovation Programme, which in essence mobilises student-led sport, physical activity and coaching projects for residents in the community of Southampton, to address community needs. For example, my students have been involved with researching the needs of a community and delivering their own projects – things like combining physical literacy with maths and science in the curriculum to promote learning, all the way through to delivering sports-based sessions to offer resources for homeless people. The students work with industry practitioners to address niche areas and respond to those needs.

The Coaching Innovation Programme now mobilises around 30-40 of these projects every year, so this is a massive contribution to the sport development and physical activity landscape. One of the things which required, and still requires, addressing is the unanswered question of evidence for the projects. The students would be able to refer to satisfying experiences for themselves, their participants and practitioners, yet they would struggle to evidence the impact of their project and, even more importantly, how and why their project achieved its outcomes.

This is where I saw a fantastic opportunity to apply my PhD to the CIP and come up with a monitoring and evaluation framework which would enable the students and practitioners to make sense of what they learnt from their programme. This would also address some of the issues in my field surrounding practitioners’ engagement with monitoring and evaluation (M and E), which is relatively poorly carried out (Coalter, 2007). In essence I wanted to bring M and E practice closer to the practitioners and make it more embedded in their work.

So, then… over the last two years I have spent a considerable amount of time reviewing approaches to evidence and M and E. This has felt like a round-the-world trip in itself: there is an ocean of literature and approaches out there. Given the complexity of the interventions my students are implementing, it was no surprise that, like many of us, I found myself exploring the philosophical roots of critical realism and the emphasis on programme theory. This was after tinkering with other aspects of programme theory, such as logic models, whose operational logic failed to really capture what we were trying to do.

The realist angle on programme theory and evaluation really started to take hold because it fitted the nature of the interventions: multiple outcomes for different people in different contexts, firing different mechanisms, and organised around conceptual rather than purely operational logic. This led me to produce two models, focusing on the formation of programme theory and on monitoring and evaluation. The first takes students through the steps of developing their own candidate programme theory, borrowing the principles of Pawson and Tilley’s (1997) realist approach and combining them with aspects of operational logic by anatomising the programme (Funnell and Rogers, 2011). By this I simply mean outlining and breaking down the programme strategy into its components – e.g. activities, inputs, outputs. Made up of three stages, the first model begins by mapping the field and establishing the context: students carry out a situational analysis, looking at things like the geography of the area and the needs of the participants, and making contact with stakeholders. This then informs stage 2, which enables the students, practitioners and additional stakeholders to anatomise their programme (a light-touch logic model) and establish the key outcomes and subsidiary theories which constitute their project.

These usually take the form of ‘if… then…’ assumptions, which lead to stage 3, where students go on to explain how and why those outcomes may come about. It is this stage, I suppose, that really captures the realist lens of conjecturing CMO (context-mechanism-outcome) configurations, getting to the heart of explaining how, why and for whom the outcomes might work. By the end of these stages the students and practitioners have themselves a robust and rigorously constructed candidate programme theory.
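To make the idea of a CMO configuration concrete, here is a minimal sketch in Python of how one might be written down as a simple data structure. The field names and the worked content are purely illustrative assumptions of mine, not part of the models themselves.

```python
from dataclasses import dataclass

# A hypothetical sketch of a candidate CMO (context-mechanism-outcome)
# configuration; the example content is illustrative only.

@dataclass
class CMOConfiguration:
    context: str    # for whom, in what circumstances
    mechanism: str  # the reasoning triggered by programme resources
    outcome: str    # what we conjecture will happen

# A stage 2 'if... then...' assumption, elaborated at stage 3:
# IF physical literacy is combined with maths in the curriculum,
# THEN engagement with learning improves.
candidate_theory = CMOConfiguration(
    context="primary pupils disengaged from classroom-only maths",
    mechanism="active games make abstract concepts feel achievable, so "
              "pupils come to reason that maths is 'for them'",
    outcome="increased participation and confidence in maths lessons",
)

print(candidate_theory)
```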

Of course the next step is to test the theory through project delivery and M and E. This is where model 2 comes in, which takes the students through 6 key stages of programme evaluation within a light-touch realistic approach. For example, part 1 consists of reconceptualising and refreshing the programme theory, and part 2 consists of developing and framing evaluation questions within the realist lens (e.g. what works, for whom, in what circumstances and why). These questions are constructed against particular CMO conjectures but NOT all of them – as Pawson (2012) states, steady your fire! Parts 3 and 4 involve establishing methodological competency and agreeing the methods to answer the questions, part 5 covers data analysis, and part 6 covers the reporting of data. These 6 parts have been produced using a participatory evaluation approach which has involved the students throughout, via cooperative enquiry and training/facilitation from myself (Greenwood and Levin, 2007; Fetterman, 2005). The aim was to train and facilitate the student practitioners to carry out realist techniques for their M and E. Thus, I am testing the model.

At this stage I have just (nearly) completed my first pilot of the model, working with 6 student projects. The workshops have been delivered and supported by action learning sets, engaging in discussion with the students and progressing their M and E. My aim is to reach MPhil transfer this summer by exploring the utility of the model(s) and the extent of the students’ engagement and praxis in M and E.

Key Challenges:

 

Firstly, teaching and stimulating interest in the area of M and E is hard, especially for young practitioners! This is particularly so given the language used in realist evaluation, and in many respects students simply do not get it. The academic discourse in which it resides presents a challenge for unlocking its potential for people working on the ground. The nature of the projects themselves, and the time it takes to employ a realistic evaluation, has also been challenging for the students. For example, how do you uncover the generative mechanisms for change in 9-year-old children? All this in addition to the many other priorities of university workload and life for the students I have been working with.

In addition, the conceptual obstacles that realistic evaluation presents are also a major challenge. Middle-range theory, demi-regularities, conjectures, mechanisms and theory riddle the literature on this, and this creates major obstacles for practitioners. I don’t think it’s the ‘different way of seeing’ that realist approaches advocate (e.g. asking why things work) that causes the problem; it is more about the language, and understanding how to identify mechanisms of change. I initially wondered, when developing the model, whether the conceptual nature of the realistic approach would be suitable for practitioners. My initial thought was that it should be, given that such an insightful method should not be confined to academic discourses. Having posted this on the RAMESES mailing list, I was reassured when the guru herself, Gill Westhorp, stated that it is entirely appropriate for practitioners to be introduced to the approach. Why shouldn’t practitioners engage with such techniques?

The key, Gill said, is to communicate it in a way that does not confuse the language and that meets the contextual needs of practitioners and how they engage. This is something I have tried to follow. By far the hardest thing to grasp is the ever-elusive programme mechanism. Having attended the realist training workshop in Liverpool this March, I saw Justin Jagosh do a great job of explaining ways to identify a mechanism: the programme activities within our candidate theory provide resources and opportunities, and those resources and opportunities produce reasoning in the minds of the programme users. The key is identifying what those resources are, what those opportunities are, and how the participants might respond to them. These are the ingredients which then produce the mechanism for certain people (the ‘for whom’).
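To illustrate that resources-plus-reasoning way of thinking, here is a minimal sketch in Python. All of the content is made up for illustration – a sketch of the idea, not a worked example from the CIP.

```python
# A hypothetical sketch of the resources-plus-reasoning way of thinking
# about mechanisms described above; all content is illustrative only.

def conjecture_mechanism(resources, opportunities, reasoning, for_whom):
    """Combine the ingredients into a one-line mechanism conjecture."""
    return (f"For {for_whom}: the programme offers {resources} and "
            f"{opportunities}, and participants respond by {reasoning}.")

print(conjecture_mechanism(
    resources="a trained student coach and free equipment",
    opportunities="a weekly, no-cost session close to home",
    reasoning="deciding the session is safe and worth returning to",
    for_whom="homeless adults new to structured sport",
))
```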

I am really keen to benchmark with people on this. I really value the realistic approach, yet promoting it in a simple and pragmatic way for students and practitioners is a key area for discussion.

Links to resources:

  1. The Coaching Innovation Programme  http://www.youtube.com/watch?v=La1vUuMNoMU
  2. Kevin Harris profile  http://www.solent.ac.uk/faculties/fbse/staff-profiles/harris-kevin.aspx

References:

 

Coalter, F. (2007). A Wider Social Role for Sport. Oxon: Routledge.

Funnell, S. and Rogers, P. (2011). Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. San Francisco: Jossey-Bass.

Pawson, R. (2003). Nothing as Practical as a Good Theory. Evaluation, 9: 471.

Pawson, R. and Tilley, N. (1997). Realistic Evaluation. London: Sage.

Greenwood, D.J. and Levin, M. (2007). Introduction to Action Research: Social Research for Social Change (2nd edition). California: Sage.