Ongoing trials and tribulations of applying realist methods in primary research and research synthesis – aiming to promote understanding of the method and give insight into methodological developments.
Back in October, Lisa Burrows gave a great talk at the Realist Hive about her PhD, and the process she’s been following to do her research.
Lisa’s PhD is looking at how dementia cafes work (or rather: what is the effect of memory cafes as an intervention for people with dementia and their carers?). She talked about her personal motivations for looking at dementia cafes, her experience in running one, and the need to develop an evidence base for them. They are an increasingly popular community-based response for people living with dementia and their families and supporters, but little is understood about how they work, for whom, in what circumstances and why.
Lisa talked passionately about why she chose a realist approach (she’s doing a review and an evaluation for her PhD), saying that it resonated strongly with her own experience that nothing works the same in all places, at all times, for everyone, and that it is important to not only understand whether or not something is effective, but also why it is. She took us through her realist review in some detail, explaining the way she went from mapping the territory of dementia cafes, to developing programme theories and “if then” propositions.
It was a useful practice-based session where we listened and asked questions, and then had a crack at doing our own programme theory surfacing from a short vignette she shared.
The slides from the presentation will be added below in due course, plus the vignette at the end.
Thanks to Lisa for presenting so clearly on her work and to everyone that came along.
I’m on the train coming back from CARES’s 2nd Annual Conference, and as this Great Western Railway train speeds through the countryside on its 2.5 hr journey back to the Shire, I have been reflecting on the conference, and wanted to share a few observations – for those who attended, and those who didn’t. The tracks are courtesy of my 19-year-old self.
(CARES, in case you don’t know, stands for the Centre for the Advancement of Realist Evaluation and Synthesis. Their website is here. The best thing about their conferences, and actually any conference, is when the ‘greats’ of the method turn up and join in.)
1. London is a very big place Mr Shadrack, a very big place. A man could lose himself in London, lose himself. Lose himself in London.
I lived in London between 1994 and 2000, the heyday years of Britpop, and in my time at Goldsmiths’ College and the few years afterwards, I prided myself on getting to know the city well. I took random buses to different places I’d read about or heard of just to see what they were like. Getting lost was something that country folk did, and I prided myself on my sense of direction. Nigh on twenty years later, we now have the mobile telephone and the google map. Which would make you think that exploring a different part of the city would be easier. I don’t know the streets around the Barbican very well at all, but as I had my trusty app, I thought I’d be ok. How wrong was I? I thought I was going north when I was going south. East when west. And so on. And it was only when I stopped using the map that I started to find my way. Conclusions may be drawn from that at your leisure…
2. It always starts with the mechanism
If you’ve been around the Scientific Realism community for a while, you’ll have come across what feels like an interminable debate about CMOCs. CMOC stands for ‘Context-Mechanism-Outcome configuration’, and it is a socially constructed heuristic to support researchers in the process of developing programme theories. There has been much debate about the utility and necessity of ‘specifying’ CMOCs, as well as ‘how’ to do them. This dates back to the mists of time, when I think Ray was trying to offer a useful tool to help researchers get a handle on ‘what is a programme theory’. The problem, in my view, is that specifying CMOCs has replaced theory building as the dominant activity in some realist research: it is unthinkingly applied and ends up being the focus of the work, rather than a useful device. During another discussion about mechanisms (what’s a mechanism? when is it a context?) Ray muttered to no one in particular, ‘It always starts with the mechanism’, and having reflected on this myself, I think it’s fair to say he’s right. At one point, the conversation even turned to love as a mechanism… not many conferences would do that, and I’m pleased this one did. Although, as it goes, we didn’t really want to talk about love, we only wanted to get drunk.
3. Is it really realist?
I witnessed the glee in a girl’s eye as she recounted to me how someone had presented their work to her and she had retorted that it didn’t seem really realist. I smiled, inwardly thinking: how mean! Whether or not something is really realist was talked about a lot. And to be honest, I find it frustrating; we need to get over this and just get on with doing good research. If you’re looking at mechanisms and their role in generative causation, it’s realist. End of. Continuing to try and exclude others, and get one up by saying someone’s work is not really realist, is boring. Call it crap research and encourage them to figure out how they could do it differently next time. But do it in a collegiate way. Let’s not turn on each other, as it’s off-putting to researchers new to realism and feeds the in/out group mentality. (I also, personally, reflected that at least that person was trying… who knows how hard they had slaved over it? And if we take the idea of reality being mind-independent but only partially knowable, then really, none of us can have a handle on whether or not something is really realist… does that make me a relativist realist?! Good grief… let’s move on…)
4. Standing on the shoulders of giants
If there is one hope I have for the future of my work, it is that I will continue to look back, read and learn so that I may proceed better informed. David Byrne and Ray Pawson added gravitas to the conference proceedings for me because of their encyclopedic knowledge of social science and its methodologists. I get the feeling that there are few really tricky research methodology questions left to square, if only we knew who had already squared them. It made me even more determined to read more, to understand the heritage of social science, and to take the time to become familiar with these authors so that I can stand on their shoulders as I try to understand my research problem and my work. It made me think that the time I’ve spent reading Martyn Hammersley and understanding ethnography over the last few months has been worthwhile, but is only the tip of the iceberg. Worlds to conquer, worlds to share, my friends.
5. We’re doing some really fab work in Exeter and Plymouth.
I felt so very very proud to be counted amongst the Exeter and Plymouth contingent. I was proud of the breadth of the work we’re engaged in, the rigour of our methods and approach, our general collegiate way of being with each other, and how darn friendly we were to other people too! We participated well in the discussion sessions, asking questions and making observations, we presented our work, we stimulated thinking and debate, we challenged and we hosted, and we played like we’d all just met up in the year 2000. I think we have so much to offer the research community, and the best way we can do it is to continue producing good quality research. GO US! (And we’ll be showcasing more excellent work from our Universities at the Hive over the coming year…)
6. Coffee, and where to find it

Surprisingly perhaps, the quality of the coffee at the Barbican was adequate. But me and a few others had a lot of fun finding other places for that morning flat white: such as Here and Here. Best was the menu at Fix, where you could have milf in your drink. Nice.
7. I am / am I a believer?
This observation is a bit left field. In the same way that there was a definite desire to be seen as a believer in realism, there were a handful of sceptics there too this last week. And their healthy scepticism about the methods and methodology of realist approaches is, I think, needed more. In my work, it has been the criticism of my supervisors that has led to learning, not their fawning enthusiasm for my latest effort… not that I know what that is like. But without the doubting Thomases, I think we are in danger of groupthink.
For my part, I reflected that at one point I was a zealot, impressed with the heritage of scientific realism, and proclaiming its worth to any and all who would listen. I have mellowed somewhat. I would say that I still hold with the basic realist philosophy of science – that there is a mind-independent reality, generative causation, ontological depth (there’s more than meets the eye), and retroductive reasoning. But that’s not because of some fanatical need to be a ‘realist’; it’s because, for me, that makes sense and is good science. I am part of the realist community because they can help me do my work better by understanding my methods and thinking. And I am healthily sceptical of any methodology, methodologist or method that claims to have the whole and only truth. As in all things, we need our doubters.
8. When willing, splitting the bill at the end of a meal is not difficult.
On the last night, Sarah, Richard, me and a host of others had dinner together at a lovely Mediterranean restaurant. The food was great. The wine ok. There were sharing dishes and starters, and mains only – weirdly, no puddings. Or at least I didn’t get a chance to find out! And instead of the usual nightmare that is splitting the bill, everyone worked together and just put in an equal share, regardless of whether or not they had actually eaten that much. The lack of penny pinching and nitpicking was utterly refreshing. As was the pint of Tribute that followed at the pub across the road!
9. The future of realist research …
In the final talk of the week, Gill Westhorp encouraged us to think much more broadly about where our research can reach. She took us through the challenges facing humanity that research has a place in meeting, and the way that realist research needs to grow and reflect the changing nature of knowledge. And she finished up by saying that there will be a realist research conference hosted by her home University in Australia next year, where the emphasis will be on the impact of the work we have done, and engagement with research users. I doubt I will be able to go though, as it will likely coincide with submitting my thesis, which is a shame because there will be a pre-conference meeting looking at realist economic evaluation, something we have led the field on at Exeter. [Realism and Resources – Towards more explanatory evaluation]
10. …and the $79 weakling…
Ray gave a cracking talk on how drug RCTs are really tests of delicate theory building, and drew us again to reflect on a) how the wheel of science works – theory building, theory testing – and b) how developing a mechanisms library – wherein knowledge about how generative mechanisms ‘work’ across policy fields, in what circumstances and why, is collected – would save everyone a lot of bother in doing realist research. Hear hear, I thought. So watch this blog for the development of our own mechanism library… Presentations from the conference will be loaded to the conference website in due course, including Ray’s talk, and the link will be emailed round and added to this post when it is.
All in all it was a really good week, made some new connections, and reconnected with old ones. Drank far too much, smoked way too much and laughed a lot. And listened to some great music.
Squeaker spent most of the conference in my bag, and was still completely exhausted by the end of it.
Back in June, Sean Manzi gave a cracking talk about the use of realism in operational research. It was a great session, stimulated good debate, and as promised, here are the slides from his talk, and a paper they presented at conference too!
Sean’s intro for the Hive session: As someone who is relatively new to the world of operational research, I was confused by how little consideration is given to what underpins the discipline, compared with some other disciplines. Some might say, “Well, it is the scientific method of course”, and in a loose, question-avoiding way that is right, but my response would be, “Which scientific method?”
In this Realist Hive session I will first briefly introduce the discipline of operational research. I will ask some difficult questions of the way operational research considers how it conducts research and why it is done that way. I will open the discussion out to everyone to ask ‘Is operational research a good example of the realist philosophy of science in action?’ We will likely need to go beyond this question to examine ‘Why should we consider philosophies of science at all? Do we need them?’ and ‘How do we decide just which philosophy should underpin our research?’
Operational research, due to its lack of strong philosophical underpinning, makes a good space to examine these fundamental questions of research. My hope is that this will be a provocative discussion which enables people to reflect on their own practices and beliefs.
No prior understanding of operational research or realism is required to attend this talk! In fact I implore you to attend especially if you do not know much about these topics and even more so if you take issue with any of the questions proposed above.
This will be a lively and welcoming debate; I look forward to seeing you there.
When I started my PhD in October 2014, I had barely heard of realism and was much more inclined to do a standard systematic review, having recently worked on two at the University of Exeter. So it was with some trepidation that I started to explore realist synthesis…only to discover that its core principles and approach did seem to suit my research project.
Here goes, I thought. My supervisors were enthusiastic and supportive, but none of them had ever used realist methodology. I felt like I was fumbling around in the dark for quite a while. Was this really a good idea? Can my brain think in this way?
Fortunately, I discovered that there are great support networks out there for novices like me. This blog post aims to encourage others to make the most of such networks and develop their own, to minimise and cope with the confusion that inevitably occurs when you try to learn how to do something that is at once creative, scientific, theory-driven and evidence-based.
I have benefited from three support networks so far in my journey:
Anyone can post a question, either about realist methodology or about your own project, and without fail you will get a thoughtful and considered response (often several) within 24 hours. This often leads to fairly high-level discussions between experts from around the world, and it is fascinating to (try to) follow these and benefit from their wisdom and experience. The diverse range of contributors and questions demonstrates the scope of the realist approach and how it is gaining momentum in different disciplines. It was reassuring for me to discover that so many other people are applying it for the first time and experiencing similar uncertainties.
Frequently, some kind person will take the trouble to explain something in detail for the benefit of others. One example that I found really useful was when Peter O’Halloran from Queen’s University, Belfast outlined some of the key concepts in critical realism that have helped to shape the definitions of ‘context’ and ‘mechanism’ used in scientific realism (posted 10th March 2015). As someone without a social science background, this brief introduction to ‘social structure’ and ‘human agency’ enabled me to think about my project through a different lens. Peter also made the point that there may be “a relatively small number of mechanisms at work in relation to agency (because there is a commonality in human nature) but a myriad social structures in a given context”. This resonated with me and made the process of abstraction to middle range theory seem a little less daunting.
The University of Liverpool’s Centre for Advancement of Realist Evaluation and Synthesis organises regular realist workshops and an annual realist summer school. These events provide an opportunity to meet other realist researchers, from novice to expert, all of whom are keen to share their experiences and learn from each other. The events are reasonably priced and usually located in Liverpool or London.
When I attended the two-day realist workshop in March 2015, I was still very unsure about what I was doing with my realist synthesis and just absorbed as much information as possible. By the time I returned for the three-day realist summer school in June 2015, I had a much better understanding and felt confident enough to share my candidate programme theories. I received individual advice and feedback from three experts: Justin Jagosh, Geoff Wong and Sonia Dalkin, and came away feeling motivated and reassured.
At both events, the groups were well-attended and cross-disciplinary. I think we all found it interesting and challenging working through examples of CMO configurations from projects very different to our own. It helped me to see the logic of the realist approach shining through and also to appreciate its strengths and limitations compared to alternative research methods. A previous blog post by David Blane summarised his experience of the 2014 summer school and I would echo his sentiments about the ‘mechanisms’ of a supporting environment in which to learn – more on that later.
At the first CARES event I attended, I connected with two other researchers, Sue Mann (UCL) and Katie Shearn (Sheffield Hallam University), and we decided to stay in touch and share our realist journeys through monthly Skype chats. Our projects have some similarities and many differences, but we are all using realist methods for the first time. We take turns to discuss our own programme theories and CMO configurations, and my experience has been that I benefit just as much from discussing their projects as my own.
One thing we are all finding challenging is defining the scope and limitations of what we can do with the time and resources we have – this is very different for Katie and me as PhD students, compared to Sue who is managing a large programme of work across several developing countries – each challenging in its own way. It helps to communicate the logic of what you have done and gain fresh perspectives from people who are detached from your project. Tomorrow we are discussing Katie’s project and she has sent us a fascinating collection of slides entitled ‘conceptualising context and time’.
A small group seems to work well because we all get regular individual attention! However, we have been thinking about ways to share our discussions more widely, such as recording our Skype chats and developing a Wiki resource.
Building on David Blane’s ideas about context, mechanisms and outcomes for the CARES summer school, I’d like to propose a candidate programme theory (yes, my arm was twisted to do this):
Realist methodology is flexible, iterative and evolving (context). This means it can be confusing, especially the first time it is used (context). Support networks provide opportunities for shared learning and promotion of best practice with regard to the core principles of realist methodology (mechanism: resource). This helps novice researchers to feel confident in their work and stay motivated (mechanism: response), which ultimately means they produce higher quality work and become better realist researchers (hopeful outcomes).
This is probably way too simplistic and I’m sure many people could articulate this better than me, but hey at least I’m confident to try now…so maybe confidence should have been the outcome…??
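For fellow novices who, like me, sometimes think better in structured form: the labelled components of the theory above could be jotted down as a tiny data structure. Here is a minimal sketch in Python – the class and field names are my own invention, purely illustrative, not any standard realist schema:

```python
# A candidate programme theory captured as one Context-Mechanism-Outcome
# configuration. Field names (mechanism_resource / mechanism_response) follow
# the resource/response split used in the theory above; they are my own labels.
from dataclasses import dataclass


@dataclass
class CMOC:
    """One Context-Mechanism-Outcome configuration."""
    contexts: list           # contextual conditions
    mechanism_resource: str  # the resource the programme introduces
    mechanism_response: str  # the reasoning/response it triggers
    outcomes: list           # (hoped-for) outcomes

    def summary(self) -> str:
        # Render the configuration as an "in context, resource triggers
        # response, leading to outcomes" sentence.
        return (f"In contexts {self.contexts}, the resource "
                f"'{self.mechanism_resource}' triggers the response "
                f"'{self.mechanism_response}', leading to {self.outcomes}.")


novice_support = CMOC(
    contexts=["realist methodology is flexible, iterative and evolving",
              "it can be confusing, especially on first use"],
    mechanism_resource="support networks offering shared learning",
    mechanism_response="novices feel confident and stay motivated",
    outcomes=["higher quality work", "better realist researchers"],
)

print(novice_support.summary())
```

Writing it out like this forces me to keep the resource and the response distinct – which, as the debates above show, is exactly where CMOCs tend to get muddled.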
At this time of year, with the Holidays approaching, it is inevitable that thoughts turn to the year slowly trickling the last of its grains through our hands. To weigh up our successes and failures, to see how we scored. Was it a ‘good year’? Or was it one we’d much rather forget? Reviewing our Facebook timeline, do we see happy smiling faces? Summer parties at the beach? Holidays with the kids all smiling (for once)? Do we remember the great, the life-affirming, the life-changing? The conference presentation that hit the spot? Connecting with someone that made all the difference to a problematic project? Remembering all the comfort and solace of hearth and home?
Coming from a minor gothic persuasion, with an inclination for the morose, I find that more often than not, I turn towards and examine closely the failures of the year. The money misspent. The time misspent. The research grant application (rejected twice), the running I started in the summer that strangely didn’t translate into running in the winter. The many times I sat and stared at a blank document on a computer screen thinking ‘right’, but finding words scattered in my head like the slingshot of starlings in the darkening sky.
When faced with failure, what do you do? In my not-so-short-anymore life, I’ve tried to be better at failure. That is, to let it teach me something, anything, even if I know I will fail again. The cycle of failure, the brushing self down, the taking deep breath and the vowing not to fail again is one I think most are familiar with. And despite the best efforts to refocus, renew, re-aim and restart, it can, at times, be a crushingly tiring effort to see things fail and try again.
What gives me hope, though, is that as a realist I am scientifically interested in failure – of programme theory at least. Let me explain: if there’s one thing I think Ray Pawson has been keen on cultivating with or via the realist movement (is it a movement?), it is a deep abiding love of proving the mistakenness, or ‘failure’, of our programme theories – the cultivation of a disputatious community of truth seekers, who through constant judging and pickiness nudge, cajole and beg us to be better scientists. The failure of one programme theory is often because it has failed to take into account some morsel of useful evidence… so undermined, the failed theory becomes the seed-bed of incremental progression… it means that even if we lose, we win. My failure becomes my saving grace.
One of the promises of the scientific realist approach is that it reframes failure as a gate we should all be jostling to get through, and then queueing up to jostle through again. If no one disagrees with what I write, or offers an alternative explanation, or picks up a point and runs with it to show its failure, then how will knowledge proceed?
Trent Reznor, lead singer of Nine Inch Nails, a 1990s industrial goth band that I have a very fond affinity with, sang a line about how ‘it took you to make me realize / it took you to make me see the light’. The song was called ‘Gave Up’, because that was the consequence of finding out he’d failed. In common with much NIN work, ‘Gave Up’ gives us Trent at a bitter point at the end of his rope. Seeing the light – seeing he’d got it wrong and that what he thought was truth was not – was the mechanism that led to him giving up. Unlike Trent, I don’t (always) give up when I realise I got it wrong – my failure is (sometimes) transformational, because it leads me on to whatever is next. Which I hope will be better programme theory.
So this year, if my mind turns again to the failures and colossal mistakes I’ve made, I hope I will be able to see how, through these things, I am also given a second chance. And that maybe next year, I’ll get it right.
[Disclaimer: In case my PhD supervisors are reading, I do realise that this won’t wash when it comes to the next meeting if I’ve failed to complete the tasks we discussed yesterday.]
The Institute of Health Research and the Realist Hive are hosting a two-day visit from one of the most experienced and globally respected practitioners of realist evaluation, Dr Gill Westhorp from Australia. As part of her visit, Gill will give a lunchtime lecture this week:
‘Realist Evaluation: Practical Implications for Design and Methods’
Where: Veysey Lecture Theatre (Veysey Building 1st Floor, Salmon Pool Lane)
When: Thursday 9th October, 12.30 to 1.30pm.
Gill is Director of Community Matters Pty Ltd, a small consultancy based in South Australia which specialises in the conduct and design of evaluations of complex services and policies, and related training/professional development, for a range of service sectors (mainly in education, health, community services, crime prevention and justice-related). After a career in managing public services in South Australia, Gill gained a PhD in social research methods that was supervised by Prof Nick Tilley (of ‘Pawson and Tilley, 1997’) and along with Ray Pawson, Trish Greenhalgh and Geoff Wong, she is also one of the co-investigators of the influential ESRC-funded RAMESES project – to review practice and establish standards for the conduct and reporting of realist reviews and meta-narrative reviews. More recently, she has made significant contributions to the methodological debate about how realist methods can be used in conjunction with complexity theory. Prior to becoming a full-time evaluator, consultant and trainer Gill mainly worked as a manager of service delivery programs for young people, but her pre-consultancy roles have also included: Executive Officer of the Youth Affairs Council of South Australia; Training Development Executive of the Youth Sector Training Council of South Australia; Director of Yarrow Place Rape and Sexual Assault Service, and Manager of Early Intervention in the Crime Prevention Unit of the South Australian Attorney-General’s Department.
Apologies for the short notice. Everyone welcome. We hope you are able to attend and hear from the person who – with the exception of Ray Pawson – has applied and supported the development of realist approaches to evaluation more than anyone else.
Back in March I emailed colleagues to find out how they would define health services research. I’d posted a link to a recent Hive Blog to my Facebook page, and whilst my friends were understandably incredibly impressed with how eruditely and cleverly I made connections between Taylor Swift songs and realist synthesis, they were also at a bit of a loss about what kind of research I do… this was the second time that someone had asked me to more closely define a) what is HSR, as well as b) realist approaches in a nugget/nutshell, so I thought it was about time I wrote something.
I had some great responses from colleagues (see the word cloud!), and here are some of the highlights – starting in the plainest of English, travelling to the kinds of definitions which I think need further translation and explanation…
Thinking wisely, asking questions and weighing the evidence in order to understand and help decision making by those that work in health and care services.
We look at things like how people access health and what treatments they may receive and how access, treatment and practices could be better in terms of costs, how effective they are, how accessible they are and how to make them actually happen in the real world.
The use of systematic methods to understand the workings of an aspect of health services, how such health services deliver health-related and other outcomes and how such services may be altered to improve the delivery of such outcomes
What is HSR? I’ve always wondered… and for a long time I had absoluuutley no idea and just hoped no-one would ask me… to be honest none of my daily out-of-work acquaintances have the even remotest idea what I do apart from be Betty and Danny’s Mummy and that I drive off to work in that far off land what is The University so something like ‘it’s research about health services kind of thing innit?’ is usually enough to get the glazed look and realise that you’ve said enough! I feel a bit more grown up now (since they put Senior in my title) and like I should have a better answer… but maybe the simple is always better – this is a bit text book-eze. How about – ‘HSR involves researchers from multiple disciplines aiming to find out the best ways to deliver, manage and organise the most effective and cost-effective health services and improve the quality of care.’?
Health Service Research involves “informing the decision on how to deliver the Right intervention, at the Right time to the Right Patient, in the Right location, by the Right provider, with the Right Outcomes for the Right cost “
The integration of epidemiologic, sociological, economic and other analytic sciences in the study of health services. Health services research is usually concerned with relationships between NEED, DEMAND, supply, use, and OUTCOME of health services. The aim of health services research is evaluation: several components of evaluative health services research are distinguished, viz: Evaluation of structure…., of process…., of output…., of outcome….
Health services research is a multidisciplinary field of inquiry, both basic and applied, that examines the use, costs, quality, accessibility, delivery, organisation, financing, and outcomes of health care services to increase knowledge and understanding of the structure, processes, and effects of health services for individuals and populations.
So what about you? If you’re down the pub talking to friends (or strangers….), and somebody asks you what you do, what do you say HSR is?
[Obliquely opaque film reference alert… Scarlett Johansson’s recent role as an alien in Under the Skin is creepy and compelling. I particularly like how she approaches asking people about themselves… so maybe just be wary if someone follows up asking you about HSR by asking you if you live alone…]
Or rather, I know what you didn’t do. (And thinking about what didn’t happen can be a useful feature of realist research, but more on that later!). I know that you didn’t attend a realist methods summer school in Liverpool… because there wasn’t one.
Thankfully, however, there was this year and there will be next year. So, for those budding realists that missed out, I’m going to share my thoughts on what it was about the summer school that worked (for whom, etc…), and hopefully convince you of the value of attending next year.
In true realist fashion, I’ll start with a C for Context. The summer school took place in Liverpool, at another C – the recently formed Centre for Advancement of Realist Evaluation and Synthesis (CARES) at the University of Liverpool. The venue was well suited to this sort of event, with a main meeting room (see photo) and several smaller ‘breakout’ rooms, as well as a central courtyard area for lunch and tea breaks.
It’s hard to think of a more appropriate acronym for a research centre headed by Dr Justin Jagosh, a leading figure in the application of realist methodologies to health services research. Justin really does care – not just about the development of realist research generally, but also about the individual projects that were shared, warts and all, by the participants of the summer school. His unwavering encouragement (not to mention his patience!) was valued by everyone and can definitely be considered a crucial enabling factor for a successful summer school.
The participants were another key Contextual feature of the success of the summer school. There were about 20 researchers from across the UK and one each from Australia and Holland. Most were PhD students, but there were also several teams of researchers, some with many years of research experience. Most were working in health services research, though the field of international development was also represented, and there was a good mix of realist evaluation and synthesis projects.
What about the M for Mechanism? As Prof Rumona Dickson, the Director of the Liverpool Reviews and Implementation Group (which CARES sits within), put it, “the idea is a simple one – if you bring a bunch of people together working on similar projects, with protected time and space to work, then good things will happen”. So, the “program theory” or mechanism is something about having protected time in a supportive environment, i.e. without phones ringing and emails pinging. Sounds good to me. If you take the Pawson approach to mechanisms as being “resources and the response(s) to them”, the protected time could be considered one “resource”, but there are several possible others – the sharing of experiences by colleagues, and nuggets of wisdom from Justin, for example.
And there were plenty of nuggets. On the first day, we had an overview of the logic and key ingredients of the realist approach, and were introduced to the concept of “retroduction” (an approach to scientific inquiry described as the “spark of creativity” associated with the realist researcher!). On day 2, we worked through examples of realist reviews to explore CMO configurations in more detail. The questions, “what is actually going on here?” and “why did this intervention not work, for these people, in this context?” were posed, as another way of thinking through potential mechanisms. Days 3 and 4 provided more protected time and space to work on our projects, either individually or in groups, in rooms designated as either ‘quiet’ or ‘chatty’ (I prefer ‘collaborative’). We were also presented with the idea, frequently reinforced by Justin, that it’s perfectly normal and acceptable to move from confusion to clarity (and back again!), throughout the process of realist research. Realist research is an evolving methodology with no prescribed set of rules to follow. For me, the maxim “one size doesn’t fit all” applies as much to the process of realist research itself as it does to the majority of complex interventions that the methodology has been used to evaluate. This flexibility (and inherent uncertainty) can be at once both comforting and disconcerting, and takes some time to get used to (a variable “response” to these nuggets of “resource”, perhaps?).
Ultimately, the Cs and Ms that are included in any realist research project will be shaped by the O of Outcome(s). Indeed, many realist researchers recommend starting with your Outcomes and working backwards, when hypothesizing potential Mechanisms and their enabling or constraining Contexts. The desired Outcome for most of the summer school participants was simply to make progress with their realist projects. The general consensus was that this had definitely been achieved: for some, there were “breakthrough” moments; while for others, they simply left a little more confident that they were working along the right lines, and a little more comfortable with the uncertainties of the process. For me, as the proud (if a little exhausted) father to a six week-old daughter, three nights of undisturbed sleep was perhaps my greatest Outcome!
I’d like to finish by thanking my PhD supervisors for encouraging me to attend the summer school, my funder (the Scottish Government’s Chief Scientist Office) for enabling me to attend, and my wonderful wife Miriam for allowing me to attend. I look forward to meeting up again with many of the summer school participants, along with other realist researchers, at the 1st International Conference on Realist Approaches to Evaluation and Synthesis, to be held at the same venue in Liverpool between 27th and 30th October 2014. In the meantime, happy retroducing!
Hello and thank you for reading my blog post. My name is Simon Briscoe and I work as an information specialist alongside Mark and Becky at PenTAG. When I tell people that I’m an information specialist I tend to get a blank look. So I usually say either that I work in health research, or that I’m a librarian who works with databases. Put the two together and you get the gist of it, which is that I search databases for literature which is then used by health researchers to write reports.
This year I’ve been working on a realist review with Mark. The review is part of a project that aims to develop a collaborative care intervention for prisoners with mental health problems, near to and after release. This was my first realist review, so I spent some time familiarising myself with the methodology required for this type of work. In doing this, it became apparent that my role as information specialist would be different to other reviews that I’ve worked on. It was fun learning a new method, and Becky thought it would be useful for me to write a blog post to share my experiences.
Most of the time, the role of the information specialist in health research is well-defined. When a research team are put together to write a report, an experienced information specialist will have a clear idea of what’s required of them: there are guidelines detailing each part of the process, from identifying search terms and selecting databases to search, to recording the results of the searches. Most reports that I contribute to are systematic reviews and require a thorough appraisal of all (or almost all) the literature on a topic in order to reach an evidence-based answer to a question.
Realist reviews are not premised on the idea that the right answer can be reached by simply assessing all the available evidence. A bit like Benedict Cumberbatch’s Sherlock Holmes, realism refuses to settle for what first appears to be the case. Becky has written about this in her blog, and in particular, she’s highlighted a couple of points which I think have an impact on the role of the information specialist: “making sense of the evidence” and “finding ‘just enough’ evidence”.
“[Realism] is not just summarising the evidence, it is making sense of it, maybe within a bigger scheme, in a way that has potential to be more applicable to decision-makers” [my emphasis].
“Summarising the evidence” is sufficient to answer some questions, particularly, where the intervention is simple to administer and the effects can be easily measured. (For example, the use of aspirin for the reduction of stroke). But some interventions are complex and the results are harder to measure. This is where realist researchers argue we need to “make sense” of the evidence, which involves identifying principles (or mechanisms, in realist language) that lie behind and explain the effectiveness of an intervention.
A second key point is that rather than assessing all the available evidence, realism seeks to
“…achieve theoretical saturation, so that we can be pretty confident that there is ‘no more’ important evidence to capture for the particular theory we are building or testing…” [my emphasis].
Realism does not seek to find all the evidence, but just enough (i.e. “theoretical saturation”) to answer the question. “Theoretical saturation” is reached when there is “no more important evidence”, which in the context of realism means, no more mechanisms to uncover in the evidence.
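As an illustration only (the study IDs and mechanism labels below are invented, not from the prisoner review), the "stop when no more important evidence appears" logic can be sketched as a simple stopping rule: screen studies in order, and declare saturation once several consecutive studies yield no mechanism you haven't already seen.

```python
# Illustrative sketch of a "theoretical saturation" stopping rule.
# All study IDs and mechanism labels are invented for demonstration.

def screen_until_saturated(studies, patience=3):
    """Screen studies in order, stopping once `patience` consecutive
    studies have yielded no mechanism we haven't already seen."""
    seen = set()
    unproductive_streak = 0
    screened = []
    for study_id, mechanisms in studies:
        screened.append(study_id)
        new = set(mechanisms) - seen
        if new:
            seen |= new
            unproductive_streak = 0
        else:
            unproductive_streak += 1
            if unproductive_streak >= patience:
                break
    return screened, seen

studies = [
    ("S1", {"trust", "continuity"}),
    ("S2", {"continuity"}),
    ("S3", {"peer-support"}),
    ("S4", {"trust"}),
    ("S5", {"continuity"}),
    ("S6", {"trust", "peer-support"}),
    ("S7", {"stigma-reduction"}),  # never screened: saturation is declared first
]

screened, mechanisms = screen_until_saturated(studies)
```

Note what the sketch also makes visible: a stopping rule like this can miss a mechanism that only appears later (S7 here), which is exactly the judgement call Becky raises below about when to stop searching.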
It was my experience that these two elementary principles of realist reviews affected my role as an information specialist in at least three ways:
First, an information specialist should be aware that relevant mechanisms are potentially identifiable in literature outside the scope of the review. Mechanisms are to some extent transferable between different population groups and interventions, so it’s a good tip to broaden the scope of the search. For example, the population group for the realist review I worked on was prisoners near to and after release. However, we also searched for studies on social groups with similar vulnerabilities to the prison population (e.g. people who use illicit substances), which we thought might reveal mechanisms that apply to the prison population, too.
Secondly, because the aim is to achieve theoretical saturation rather than comprehensive coverage of evidence, an information specialist should focus on specificity rather than sensitivity. Specificity and sensitivity basically mean accuracy and breadth of coverage, respectively. Ordinarily, the information specialist balances both. But because mechanisms recur in different pieces of research, the researcher is likely to become familiar with all the mechanisms before they have exhausted all the literature.
In this respect, a realist review is more straightforward for the information specialist. But a considered approach is still needed: following point one (above), the research team may want to dip into several different areas of research, so it’s important to retain the focus on specificity to stop the amount of literature accumulating too much.
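The trade-off between specificity and sensitivity can be put in numbers. In information-retrieval terms, a strategy's specificity (as used here) corresponds to precision – the share of retrieved records that are relevant – and sensitivity to recall – the share of all relevant records that were retrieved. The record counts below are invented purely to illustrate the contrast between a broad and a narrow strategy:

```python
# Illustrative only: the record counts are invented, not figures from any review.
# "Specificity" here is measured as precision (relevant retrieved / total retrieved);
# "sensitivity" as recall (relevant retrieved / total relevant in existence).

def precision(relevant_retrieved, total_retrieved):
    return relevant_retrieved / total_retrieved

def recall(relevant_retrieved, total_relevant):
    return relevant_retrieved / total_relevant

# A broad, sensitivity-maximising strategy (typical for systematic reviews):
# 5000 records retrieved, of which 95 of the 100 relevant studies are captured.
broad_precision = precision(95, 5000)
broad_recall = recall(95, 100)

# A narrower, specificity-focused strategy (closer to realist practice):
# 400 records retrieved, capturing 60 of the 100 relevant studies.
narrow_precision = precision(60, 400)
narrow_recall = recall(60, 100)

print(f"broad:  precision={broad_precision:.3f}, recall={broad_recall:.2f}")
print(f"narrow: precision={narrow_precision:.3f}, recall={narrow_recall:.2f}")
```

The narrow strategy misses more studies, but if mechanisms recur across the literature, the ones it does find may be enough to reach saturation at a fraction of the screening burden.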
Continuing on this point, Becky has noted that “there will always be further evidence that would be brought to bear, to further build or refine our thinking…” This raises the issue of whether “when we ‘stop’ searching, are we making a judgement which is based more on external factors to the project (time, funding), rather than the internal factors (we’ve found it all)?” I’m not sure what the answer to this question is, except that the process of stopping will be a conversation between the information specialist and the rest of the research team: if the research team say ‘stop’, the information specialist can suggest reasons why it might be worth continuing. For example, there might be a different database that could be utilised, or search terms could be refined. Eventually, (hopefully…) an agreement will be reached.
Thirdly, literature searching is likely to take place throughout the review process. Traditionally, an information specialist will aim to identify all the required literature using a single search strategy at the start of the review process. This ensures transparency and enables other researchers to reproduce the same results. It also prevents the research team from biasing the results by targeting pockets of evidence. By contrast, the evidence base for a realist review will develop incrementally as mechanisms are uncovered and links are made with other areas of research (see point 1). An implication of this is that researchers can decide at any point to run additional searches. As such, an information specialist should be prepared for a higher level of involvement than for a traditional systematic review.
(It’s also important to note that the information specialist should still aim for transparency by recording the searches, as recommended in the RAMESES publication guidelines for realist reviews).
Ray Pawson, perhaps the guru of realist reviews, has written a little about literature searching here. Much of what I’ve written is loosely based on his work, so it’s worth looking at the section titled “Searching for relevant evidence” for further guidance.
The following is the first in an occasional series of blogs written by colleagues working with realist methods outside the University of Exeter. Mark and I are very grateful to Kevin Harris for his contribution. Kevin is a Senior Lecturer at the Faculty of Business, Sport and Enterprise, Southampton Solent University, and his post is a closer look at the evaluation of The Coaching Innovation Programme…. over to you, Kevin.
My name is Kevin Harris and I am a senior lecturer and course leader in sport development and sport policy at Southampton Solent University.
When deciding to take my PhD I was keen to do something that kept me connected with the industry I used to work in (sport for social change) and create a closer bridge between academia and industry.
Around this time I had just created the Coaching Innovation Programme, which in essence mobilises student-led sport, physical activity and coaching projects for residents in the community of Southampton to address community needs. For example, my students have been involved with researching the needs of a community and delivering their own projects – things like combining physical literacy with maths and science in the curriculum to promote learning, all the way through to delivering sports-based sessions to offer resources for homeless people. The students work with industry practitioners to address niche areas and respond to those needs.
The Coaching Innovation Programme now mobilises around 30-40 of these projects every year, so this is a massive contribution to the sport development and physical activity landscape. One of the things which required, and still requires, addressing was the unanswered questions surrounding evidence of the projects. The students would be able to refer to satisfying experiences for themselves, their participants and practitioners, yet they would struggle to evidence the impact of their project, and even more importantly how and why their project achieved its outcomes.
This is where I saw a fantastic opportunity to apply my PhD to the CIP and come up with a monitoring and evaluation framework which would enable the students and practitioners to make sense of what they learnt from their programme. This would also address some of the issues in my field surrounding practitioner engagement with monitoring and evaluation (M and E), which is relatively poorly carried out (Coalter, 2007). In essence I wanted to bring M and E practice closer to the practitioners and more embedded in their work.
So, then… Over the last two years I have spent a considerable amount of time reviewing approaches to evidence and M and E. This has felt like a round-the-world trip in itself: there is an ocean of literature and approaches out there. Given the complexity of the interventions my students are implementing, it was no surprise that, like many of us, I found myself exploring the philosophical roots of critical realism and the emphasis on programme theory. This was after tinkering with other aspects of programme theory, such as logic models, whose operational logic failed to really capture what we were trying to do.
The realistic angle on programme theory and evaluation really started to take hold, as it fitted the nature of the interventions: producing multiple outcomes for different people in different contexts, and firing mechanisms around conceptual logic. This led me to produce two models, focusing on the formation of programme theory and on monitoring and evaluation. The first takes students through the steps of developing their own candidate programme theory, borrowing the principles of Pawson and Tilley’s (1997) realist approach and combining them with other aspects of operational logic by anatomising the programme (Funnell and Rogers, 2011). By this I simply mean outlining and breaking down the programme strategy into its components – e.g. activities, inputs, outputs. Made up of three stages, the first model begins by mapping the field and establishing the context: students carry out a situational analysis, looking at things like the geography of the area, the needs of the participants and contact with stakeholders. This then informs stage 2, which enables the students, practitioners and additional stakeholders to anatomise their programme (a light-touch logic model) and establish the key outcomes and subsidiary theories which constitute their project.
These would usually constitute ‘if–then’ assumptions, which lead to stage 3, which goes on to explain how and why those outcomes may come about. It is this stage, I suppose, that really captures the realist lens of conjecturing CMO configurations, getting to the heart of explaining how, why and for whom the outcomes might work. By the end of these stages the students and practitioners have a robust and rigorously constructed candidate programme theory.
Of course the next step is to test the theory through project delivery and M and E. This is where model 2 comes in, which takes the students through six key stages of programme evaluation within a light-touch realistic approach. For example, part 1 consists of reconceptualising and refreshing programme theory, and part 2 consists of developing and framing evaluation questions within the realistic lens (e.g. what works for whom in what circumstances and why). These questions are constructed against particular CMO conjectures but NOT all of them – as Pawson (2012) says, steady your fire! Parts 3 and 4 involve establishing methodological competency and agreeing the methods used to answer the questions, part 5 covers data analysis, and part 6 covers the reporting of data. These six parts have been produced using a participatory evaluation approach which has involved the students throughout, via cooperative enquiry and training/facilitation from myself (Greenwood and Levin, 2007; Fetterman, 2005). The aim was to train and facilitate the student practitioners to be able to carry out realistic techniques for their M and E. Thus, I am testing the model.
At this stage I have just (nearly) completed my first pilot of the model, working with six student projects. The workshops have been delivered and supported by action learning sets, engaging in discussion with the students and progressing their M and E. My aim is to reach MPhil transfer this summer by exploring the utility of the model(s) and the extent of the students’ engagement and praxis in M and E.
Firstly, teaching and stimulating interest in the area of M and E is hard, especially for young practitioners! This is particularly so given the language used in realist evaluation; in many respects students simply do not get it. The academic discourse in which it resides presents a challenge for unlocking its potential for people working on the ground. The nature of the projects themselves, and the time it takes to employ a realistic evaluation, has also been challenging for the students. For example, how do you uncover the generative mechanisms for change in 9-year-old children? All this in addition to the many other priorities of university workload and life for the students I have been working with.
In addition, the conceptual obstacles that realistic evaluation presents are also a major challenge. Mid-range theory, demi-regularities, conjectures, mechanisms and theory riddle the literature, and this creates major obstacles for practitioners. I don’t think that it’s the different ‘way of seeing’ that realist approaches advocate (e.g. asking why things work) that causes the problem. It is more about the language, and understanding how to identify mechanisms of change. When developing the model I initially wondered whether the conceptual nature of the realistic approach would be suitable for practitioners. My initial thought was that it should be, given that such an insightful method should not be confined to academic discourses. Having posted this on the RAMESES mailing list, thankfully the guru herself, Gill Westhorp, stated that it’s entirely appropriate for practitioners to be introduced to the approach. Why shouldn’t practitioners engage with such techniques?
The key, Gill said, is to communicate it in a way that does not confuse the language and can meet the contextual needs of practitioners and how they engage. This is something I have tried to follow. By far the hardest thing to grasp is the ever-elusive programme mechanism. Having attended the realist training workshop in Liverpool this March, I found that Justin Jagosh did a great job of explaining ways to identify a mechanism: the programme activities within our candidate theory provide resources and opportunities, and those resources and opportunities produce reasoning in the minds of the programme users. The key is identifying what those resources and opportunities are, and how the participants might respond to them. These are the ingredients which then produce the mechanism for certain people (for whom).
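One way to make the "resources plus reasoning" framing concrete is to write a CMO configuration down as a small data structure. This is a minimal sketch, and every field value below is an invented example (not a theory from the CIP itself):

```python
# A minimal sketch of a CMO configuration as a data structure, following
# the "resources + reasoning" framing of mechanisms described above.
# All example values are invented for illustration.
from dataclasses import dataclass

@dataclass
class Mechanism:
    resources: list   # what the programme activity offers participants
    reasoning: str    # how participants respond to those resources

@dataclass
class CMOConfiguration:
    context: str      # for whom, in what circumstances
    mechanism: Mechanism
    outcome: str      # what is expected to change, and in what direction

# A hypothetical conjecture a student team might record:
cmo = CMOConfiguration(
    context="children new to structured sport, sessions led by trusted student coaches",
    mechanism=Mechanism(
        resources=["regular contact", "achievable challenges", "public praise"],
        reasoning="participants come to see themselves as capable and keep attending",
    ),
    outcome="sustained weekly participation over the school term",
)
```

Writing conjectures in a fixed shape like this forces the separation between the resource (what the programme puts in) and the reasoning (what fires in the participant's mind) – the distinction students find hardest to hold on to.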
I am really keen to benchmark with people on this. I really value the realistic approach, yet promoting it in a simple and pragmatic way for students and practitioners is a key area for discussion.