Hello and thank you for reading my blog post. My name is Simon Briscoe and I work as an information specialist alongside Mark and Becky at PenTAG. When I tell people that I’m an information specialist I tend to get a blank look. So I usually say either that I work in health research, or that I’m a librarian who works with databases. Put the two together and you get the gist of it, which is that I search databases for literature which is then used by health researchers to write reports.
This year I’ve been working on a realist review with Mark. The review is part of a project that aims to develop a collaborative care intervention for prisoners with mental health problems, near to and after release. This was my first realist review, so I spent some time familiarising myself with the methodology required for this type of work. In doing this, it became apparent that my role as information specialist would be different to other reviews that I’ve worked on. It was fun learning a new method, and Becky thought it would be useful for me to write a blog post to share my experiences.
Most of the time, the role of the information specialist in health research is well-defined. When a research team are put together to write a report, an experienced information specialist will have a clear idea of what’s required of them: there are guidelines detailing each part of the process, from identifying search terms and selecting databases to search, to recording the results of the searches. Most reports that I contribute to are systematic reviews and require a thorough appraisal of all (or almost all) the literature on a topic in order to reach an evidence-based answer to a question.
Realist reviews are not premised on the idea that the right answer can be reached by simply assessing all the available evidence. A bit like Benedict Cumberbatch’s Sherlock Holmes, realism refuses to settle for what first appears to be the case. Becky has written about this in her blog, and in particular, she’s highlighted a couple of points which I think have an impact on the role of the information specialist: “making sense of the evidence” and “finding ‘just enough’ evidence”.
Firstly, in this post, Becky writes:
“[Realism] is not just summarising the evidence, it is making sense of it, maybe within a bigger scheme, in a way that has potential to be more applicable to decision-makers” [my emphasis].
“Summarising the evidence” is sufficient to answer some questions, particularly where the intervention is simple to administer and the effects can be easily measured (for example, the use of aspirin to reduce the risk of stroke). But some interventions are complex and their results are harder to measure. This is where realist researchers argue we need to “make sense” of the evidence, which involves identifying principles (or mechanisms, in realist language) that lie behind and explain the effectiveness of an intervention.
A second key point is that rather than assessing all the available evidence, realism seeks to
“…achieve theoretical saturation, so that we can be pretty confident that there is ‘no more’ important evidence to capture for the particular theory we are building or testing…” [my emphasis].
Realism does not seek to find all the evidence, but just enough (i.e. “theoretical saturation”) to answer the question. “Theoretical saturation” is reached when there is “no more important evidence”, which in the context of realism means no more mechanisms to uncover in the evidence.
In my experience, these two key principles of realist reviews affected my role as an information specialist in at least three ways:
First, an information specialist should be aware that relevant mechanisms are potentially identifiable in literature outside the scope of the review. Mechanisms are to some extent transferable between different population groups and interventions, so it’s a good tip to broaden the scope of the search. For example, the population group for the realist review I worked on was prisoners near to and after release. However, we also searched for studies on social groups with similar vulnerabilities to the prison population (e.g. people who use illicit substances), which we thought might reveal mechanisms that apply to the prison population, too.
Secondly, because the aim is to achieve theoretical saturation rather than comprehensive coverage of the evidence, an information specialist should focus on specificity rather than sensitivity. Specificity is the proportion of the records retrieved that are relevant (the accuracy of a search), while sensitivity is the proportion of all relevant records that a search retrieves (its breadth of coverage). Ordinarily, the information specialist balances both. But because mechanisms recur in different pieces of research, the researcher is likely to become familiar with all the mechanisms before they have exhausted all the literature.
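As a rough illustration of this trade-off (the numbers below are entirely invented, not from the review described here), the two measures can be expressed as simple ratios:

```python
# Hypothetical search: 200 records retrieved, of which 40 are relevant;
# suppose 50 relevant records exist in the databases overall.
retrieved = 200
relevant_retrieved = 40
relevant_total = 50

# Specificity in the searcher's sense (precision): the proportion of
# retrieved records that are actually relevant.
specificity = relevant_retrieved / retrieved

# Sensitivity (recall): the proportion of all relevant records
# that the search managed to retrieve.
sensitivity = relevant_retrieved / relevant_total

print(f"specificity: {specificity:.0%}")  # 20%
print(f"sensitivity: {sensitivity:.0%}")  # 80%
```

A highly sensitive strategy pushes `relevant_retrieved` towards `relevant_total` at the cost of a much larger `retrieved` pile to screen; a highly specific strategy keeps `retrieved` small but risks missing relevant records.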
In this respect, a realist review is more straightforward for the information specialist. But a considered approach is still needed: following point one (above), the research team may want to dip into several different areas of research, so it’s important to retain the focus on specificity to keep the volume of literature manageable.
Continuing on this point, Becky has noted that “there will always be further evidence that could be brought to bear, to further build or refine our thinking…” This raises a question: “when we ‘stop’ searching, are we making a judgement which is based more on external factors to the project (time, funding), rather than the internal factors (we’ve found it all)?” I’m not sure what the answer to this question is, except that the decision to stop will be a conversation between the information specialist and the rest of the research team: if the research team say ‘stop’, the information specialist can suggest reasons why it might be worth continuing. For example, there might be a different database that could be utilised, or search terms could be refined. Eventually, (hopefully…) an agreement will be reached.
Thirdly, literature searching is likely to take place throughout the review process. Traditionally, an information specialist will aim to identify all the required literature using a single search strategy at the start of the review process. This ensures transparency and enables other researchers to reproduce the same results. It also prevents the research team from biasing the results by targeting pockets of evidence. By contrast, the evidence base for a realist review will develop incrementally as mechanisms are uncovered and links are made with other areas of research (see point one). An implication of this is that researchers can decide at any point to run additional searches. As such, an information specialist should be prepared for a higher level of involvement than for a traditional systematic review.
(It’s also important to note that the information specialist should still aim for transparency by recording the searches, as recommended in the RAMESES publication guidelines for realist reviews).
Ray Pawson, perhaps the guru of realist reviews, has written a little about literature searching here. Much of what I’ve written is loosely based on his work, so it’s worth looking at the section titled “Searching for relevant evidence” for further guidance.
Link to PenSR PowerPoint slides http://medicine.exeter.ac.uk/pentag/workstreams/pensr/