How quickly can you read an Impact Case Study?

by Michael Wykes

In RAE 2008, some 1,851 individuals were returned to the English sub-panel by 87 Higher Education Institutions. Assuming that universities’ REF 2014 submission intentions were accurate last December (a big assumption!), there will be at least the same number of Category A staff in 2014, if not 3.6% more. Broadly speaking, on a one-case-study-per-10-FTE basis, English would therefore generate about 190 Impact Case Studies and 87 Impact Templates. That rough guide is seriously misleading, however: the number of Case Studies would probably be nearer to 275, because 35 HEIs submitted fewer than 15 FTE in 2008 and so would each need two Impact Case Studies, 24 would need three, 17 four, 7 five, 2 six, 1 eight and 1 ten.

So how long will panels actually take to discuss each case study and arrive at a grade? In their call for volunteers, HEFCE stated that sub-panels would meet about eight times, staying overnight for several days. User members were told to anticipate just 20-40 ‘short’ case studies for meetings lasting 3-4 days. Taking HEFCE at their word, and assuming each sub-panel meeting ran for three days, this would result in a total of about 24 days of meetings. For bigger panels, like English, there will obviously be more and longer meetings, and HEFCE have of course modelled the likely volume of work in order to calculate the number of meeting days required. It should be noted, though, that the precise way in which the academic and user members of the panel will read and grade Impact Case Studies is not yet confirmed.

But how long will they actually take, or perhaps how long would you think it reasonable, or indeed necessary, to take to review a Case Study? And what about that all-important but elusive Impact Template, bearing in mind that these three or four pages of text are worth the highest amount of funding per word in the entire assessment – equivalent to eight monographs for some departments? And surely all members of the panel – academic and user alike – should read this document?

You might reasonably allocate, say, 15 minutes to review each case study, discuss its merits and agree on a grade. With 275 Case Studies and 87 Templates for English, that would take about 90 hours, or 12 days: nearly half of the allocated meeting time, but for only 20% of the overall funding, and before the panel has reviewed some 5,500 outputs.
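The arithmetic above can be sanity-checked with a short script. The per-HEI case-study counts, the 15-minute review slot and the roughly 7.5-hour meeting day are all taken from the figures quoted in this post, not from any official HEFCE model:

```python
# Sanity-check of the case-study arithmetic quoted above.
# Keys: case studies required per submission; values: number of HEIs
# in that band (from the RAE 2008 English sub-panel figures cited here).
hei_counts = {2: 35, 3: 24, 4: 17, 5: 7, 6: 2, 8: 1, 10: 1}

total_heis = sum(hei_counts.values())                      # 87 institutions
case_studies = sum(n * c for n, c in hei_counts.items())   # 275 case studies

templates = total_heis           # one Impact Template per submission
minutes = 15 * (case_studies + templates)
hours = minutes / 60             # total review time in hours
days = hours / 7.5               # assuming a 7.5-hour working day

print(total_heis, case_studies, round(hours, 1), round(days, 1))
# → 87 275 90.5 12.1
```

On these assumptions the panel would need roughly 90 hours, about 12 working days, just for impact material, which is where the 5-minutes-per-grade estimate in the next paragraph comes from.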

The evidence would therefore suggest that whilst sub-panel members and users will of course do their homework before the meetings, they will probably have no more than 5 minutes to discuss and agree on these all-important grades. Those 100-word summary sections had better be convincing.

Occupy Impact Conference, An International Perspective

By Teresa Penfield

Over the past few weeks, the DESCRIBE team have been collating international perspectives of research impact.  As part of this endeavour we took part in the CASRAI Occupy Impact Conference, held in Montreal, where we participated in discussions and undertook a session “Describing Impact in the UK”. One of the aims of CASRAI is to generate a data dictionary of terms, with internationally agreed standards, a common language with which impact (amongst other things) can be described.  The conference was focused on what to develop for the next version of the data dictionary.  There were discussions on what should be measured, the balance between numbers and narratives, attribution and discipline dependent impact and capturing impact through the use of systems and technology.

So what does impact mean from an international perspective? One of the big differences is that in the UK the term ‘impact’ is now used only for changes induced outside academia when evaluating institutional research. Internationally, impact tends still to incorporate academic impacts to give an overall picture, with perhaps a greater weighting on progressing the capture of academic benefits, in part because of greater experience in this area.

Contributing to discussions at the CASRAI conference were, amongst others, Claire Donovan, Jack Spaapen, Ben Martin and Cameron Neylon. The conference highlighted that similar challenges are faced internationally in determining what impact is and the indicators that can be used to evidence it.

Key points from a DESCRIBE perspective to come out of the conference were:

  • Understanding the networks between stakeholders, researchers and institutions is critical to understanding impact.  Capturing these networks would be a big task that could only be addressed using computational methods, but it would provide a useful tool for understanding and describing impact.
  • It is sometimes useful to describe the route to impact in a linear fashion, to enhance stakeholder understanding and to prevent getting bogged down in details.  There is, however, no linear process from research to impact, and any model used to capture information should be flexible.
  • Care is required when categorising impact so that we don’t lose the full picture by dissecting impact into small, easy-to-measure boxes.
  • Perspective is required when assessing impact: specific questions should be answered from a particular stakeholder perspective.  Trying to evaluate impact using a one-size-fits-all method will be imperfect.

There was a lot of interest from the international community in impact as part of the REF, and in the step change in information capture that would be required to evidence it. The 5,000-plus case studies that will be generated in the UK for the REF were seen as a hugely valuable resource to ‘mine’ for impact, indicators and evidence.

Media, Public Engagement, Impact and the REF

By Matt Baker  27th Sept 2012

With the final publication by HEFCE of the REF Panel Criteria and Working Methods, ‘media’ features in some way across all panels. The media and its appropriateness as evidence is panel specific, with subtleties and nuances between ‘critical review’, ‘independent citation’ and ‘coverage’ depending on the type of impact.  Panels B (physical sciences), C (social sciences) and D (humanities) give much more attention to detail than Panel A (life sciences). We might question whether this is an attempt at a strategic leveller, raising the bar for the humanities and social sciences against the perceived high impacts associated with the life and physical sciences. It may, of course, simply be a true representation of the different disciplines’ approaches to impact.

The same can be said for public engagement. Panels B and C dedicate more or less identical sections to public engagement, including archetypal evidence such as follow-up activity, ‘media coverage’ and ‘information about the types of audience’, with Panel D broadly following suit. Panel A, however, gives little attention to engagement, and the information provided is less helpful, giving few examples of evidence or indicators arising from engagement activity. On the face of it, this appears to be life scientists taking a much harder line with impacts arising from public engagement and evidence provided by the media. There may of course be a much more benign motivation behind the treatment of these types of impact which has not been correctly reflected in the REF guidance. We must also remember the panels’ caveat that ‘this list is not an exhaustive or exclusive list’; the question is, how much should we read between the lines?

Is there such a thing as ‘negative’ impact?

By Teresa Penfield 24th Aug 2012

One of the questions that we would like to explore as part of the DESCRIBE project is, ‘how do we handle negative impact’?  Is there such a thing as negative impact?

One could break negative impact down into two separate and very different concepts.

  • Research which has a negative impact on an unwanted change or potential development, i.e. which prevents or reduces something undesirable.
  • Research which itself results in undesirable changes or negative impacts.

It is this latter issue which I want to think about here.  When looking for impact we tend to focus on the benefits which have taken place as a result of our research but impact is not always beneficial.  Should this be taken into account when assessing the impacts of our research?  Should we think of impact in a positive or negative light or should impact be value free?

In many instances, determining the positive or negative nature of impact can be subjective, and we can also deliberate on how our research may have double effects.  An example might be the technological advances in social networking and their impact on our culture.  These have inarguably positive impacts on networking and on keeping us informed, up to date and in touch, whilst also having potentially negative impacts on privacy, employee productivity and even health.

Perspective can play a huge part in assessing impact but shouldn’t impact assessments provide the full picture?  Is it acceptable to include an element of ‘spin’ and be selective over the impacts that we describe?  This is a particularly interesting consideration when reviewing the case study approach for assessing impact.

In the HEFCE Assessment Framework and Guidance on Submissions (2011) for the REF, impact is broadly defined as

“an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia”.

While this implies that impacts do not have to be beneficial, it would be a brave move to test this and allow a panel to judge the reach and significance of changes which impacted negatively upon our society or economy.

If we want to truly understand the impact of our research do we need to be more comprehensive in detailing both the perceived positive and negative changes that result? Or is a view of impact through rose-tinted glasses all that we are after?

Gathering Expert Opinion

By Teresa Penfield 2nd Aug 2012

One of the main objectives of the DESCRIBE project is to develop an understanding of what impact means to different groups of people and the specific challenges faced in defining and evidencing impact across a broad range of research fields.

As part of our data gathering we are conducting semi-structured interviews with leading impact experts, professionals and academics, both in the UK and internationally.  We aim to interview around 50 individuals to develop an in-depth understanding of impact among these groups.

To date we have conducted approximately 10 interviews, gathering opinions on what is understood by impact, views on the current methods used to assess impact, how impact is being measured and evidenced, the challenges groups face in developing traceable evidence of impact, and both concerns about and support for incorporating impact into university research assessments.  Some interesting discussion points have been raised which we will investigate and develop through the course of the project; a few examples are listed below.

  • Will there be a devaluation of knowledge exchange and public engagement as a result of the “impact agenda”?
  • Funding excellence will result in impacts but by funding impact, will excellence be compromised?
  • Are universities becoming the R&D arm of industry?

As part of our data gathering we will also be developing an online survey to gather opinion from a wider group of individuals.  This survey will be sent out in the autumn; if you would be interested in taking part, please email the DESCRIBE team and we will add you to our list.

Systems events

Teresa Penfield 31 July 2012

Over the past two weeks members of the DESCRIBE team have been involved with two systems focused events.  Summaries of these events can be found below and the associated links…

The RIM CERIF workshop, held in Bristol on 27-28 June, brought together members of research councils, funders, JISC, and RIM and MRD project and programme managers to discuss topics such as institutional repositories, identifiers and RCUK ROS, and to hear updates on projects including Gateway to Research, UKRISS, the CERIF Support Project and Project Snowball.

Of particular interest were discussions around vocabularies and the ways in which research councils such as NERC and MRC are currently recording impact.  While the technical aspects of systems development were beyond the scope of the DESCRIBE project, gaining a broader understanding of what systems developers require and how DESCRIBE outputs may be utilised in the future was very valuable.

The RMAS launch took place at SOAS in London on 10 July, showcasing the recently developed Research Management and Administration System, which enables the integration of off-the-shelf products for information management within academic institutions.

The day was introduced by Steve Butcher (HEFCE), followed by a keynote address by David Allen OBE (Registrar and Deputy Chief Executive, University of Exeter), and went on to provide an overview of the RMAS system from the three pathfinders: the University of Kent, the University of Exeter and the University of Sunderland.

At this launch the DESCRIBE project took part in a poster session to engage with participants, who came from some 80 UK universities.  Discussions on the challenges institutions face in describing impact are enormously valuable to us in the early stages of our project.

Project initiation – June 29

By Teresa Penfield

Since June 1, all of the DESCRIBE project team have been in place, getting the project up and running.

Our first steering group (SG) meeting was held on June 20 at the Arnolfini in Bristol. The aim of this meeting was to ensure that the scope and direction of the project was well defined and met the needs of our stakeholders.

Organisations involved in the project include RCUK, JISC, AURIL, Brunel University, HEFCE, STARMETRICS, CASRAI, CIHE as well as the individual UK Research Councils.

An overview of the project was presented to the group with details of our interviewee list and the project plan.

Specific recommendations to come out of the meeting included:

  • DESCRIBE should provide a comprehensive and balanced guide to evidencing impact.
  • DESCRIBE needs to encompass views from both UK and international perspectives.
  • The project needs to explore the views and contexts of the various impact user groups.
  • Interviewing experts in the field will be critical to information exchange, and the interviewee list needs to encompass SMEs.
  • As a project we should aim to hold a fact-finding workshop before the end of 2012 and follow this up with a dissemination conference day during the spring of 2013.