This Working Group (WG) documented how UK universities currently collect impact information in their various local software solutions, including commercial and in-house systems and spreadsheets. The WG also documented the impact reporting requirements of the UK funders, including REF2014 (the next REF is likely to be broadly similar) and the Researchfish tool. Once we have community input on the output of this work (see the report below), the next cycle of work will be to recommend how to standardise and harmonise the divergent information requirements.
See this post for background on the Open Review. This review is open until June 30, 2017. The draft report is NOW READY for review below.
To comment on the content below either click the blue “Reply” button at the bottom or select a passage of text in the report and click the “Quote” pop-up to create a comment about that section only.
The impact of research is an increasingly important agenda for universities, with funders keen to speed up the translation of funded research into measurable effects on the economic, cultural and social lives of people and organisations, as well as on their health and well-being.
This applies particularly to public funders, who need to justify their expenditure to their masters in government and to society at large, especially where budgets are coming under increasing scrutiny. REF2014 (http://www.ref.ac.uk/) introduced Impact Case Study Reporting as an element of the UK national research assessment exercise, with the outcome directly affecting funding, and it was hailed as a success both by the government funders and, broadly, by the research organisations involved.
This has galvanised the acceptance of impact as a recognised area of research strategy, policy and management at universities, which has, in turn, led to a call by these universities, represented by this CASRAI-UK working group, to improve the efficiency and effectiveness of impact reporting tools.
The first phase of work by the group has been to survey and document the current systems in use by UK universities for recording impact information. This is described in more detail below, including a proposal for a set of information elements for recording impact, which we are making available for feedback from the wider research community, together with a specific set of questions to help target this feedback.
The Working Group (WG) documented how UK universities currently record impact information in their various local systems. These systems include: Pure, Converis, Elements, EPrints, Researchfish, VV Impact Tracker, in-house systems and individual spreadsheets. The WG also documented the various impact reporting requirements against which universities are asked to provide data.
The WG used the above information to propose an initial list of information elements for recording impact and draft definitions for these elements, plus items requiring further consideration (planned for the continued work in 2017).
The WG welcomes feedback on this draft from all stakeholders including other research organisations, research funders and system suppliers. With the recommendation in the Stern review for funders to harmonise their requirements for Impact and the evidence from this WG of a convergence across systems currently in use for recording impact, this seems an ideal opportunity to work collaboratively on a clear research information template for Impact. The plan is for the WG, including new members who wish to be involved, to reconvene following the review process to produce an Impact Glossary for inclusion in the CASRAI dictionary.
2) Early Draft Glossary of Impact Elements
The following list includes the terms (and any available definitions) being used in our various impact capture tools. In its next work phase the WG will refine this list, supply missing definitions and draft a proposed harmonised data collection template that can meet all of the purposes listed in the Appendix. A tentative machine-readable sketch of the currently agreed elements is given after the first list below.
Elements currently agreed for inclusion
- Title - Short title for the impact.
- Description - A single-paragraph summary description of the impact.
- Narrative - A detailed narrative of the impact: a textual, human-readable account of the activities undertaken to create the impact and their combined result. It should be readable independently of other parts of the impact record, and should include references to relevant periods and future developments.
- Who is Affected? - Details of the audiences, beneficiaries and end-users.
- Category of Impact - The area in which the change or effect is realised. (The WG agreed to default to the existing REF list of broad categories, extended in line with the wider proposed definition to add academia/learning and teaching.)
- Impact Stage - The degree to which the impact has developed, been adopted and been realised. Potential candidates for a standard ‘list of stages’ (several overlapping schemes are currently in play): Planning or developing - early stage; Involvement - mid or active stage; Change adopted - end stage; Public benefited - end stage; In progress; Impact occurred - awaiting evidence; Impact occurred - evidence obtained; Planning and early development; Involvement and engagement; Influence and change (awaiting evidence); Influence and change (evidence obtained)
- Associated IDs - Identifier(s) for the associated impact in an external system, e.g. REF2014 or Researchfish
- Impact Evidence - Demonstrable proof of the effect or change. Multiple examples of Impact Evidence can be associated with an Impact. A possible structure for each instance of Impact Evidence: Type, e.g. quantitative or qualitative; Period; Evidence Title; Evidence Summary; Evidence Contact Info; Links (URLs, documents)
- Participants - The people and/or organisations who contribute to the Impact. This needs to be structured further to include, for example, persistent identifier and role/contribution. The CASRAI CRediT taxonomy could be applied to this - or extended to cover additional categories for Impact contribution.
- Restrictions - Any restrictions on the visibility of the impact (and associated evidence), where information may need to be withheld from the public domain due to data protection, commercial, political or other sensitivity.
- Geographical extent of the influence - Geographical reach of the impact: local, regional, national, international or multinational.
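
To make the shape of the proposed template concrete, the following is a minimal sketch of the agreed elements in TypeScript. All field names, types and enumerated values are illustrative assumptions derived from the draft definitions above (the categories are taken from the REF2014 definition quoted in the Appendix, plus the proposed academia/learning and teaching extension); none of them are WG decisions.

```typescript
// A tentative sketch of the agreed core elements. Field names, types and
// enumerations are illustrative assumptions, not WG decisions.

// One of the candidate stage lists from the draft; the final set is open.
type ImpactStage =
  | "Planning and early development"
  | "Involvement and engagement"
  | "Influence and change (awaiting evidence)"
  | "Influence and change (evidence obtained)";

// Broad categories drawn from the REF2014 impact definition, plus the
// proposed academia/learning and teaching extension.
type ImpactCategory =
  | "Economy"
  | "Society"
  | "Culture"
  | "Public policy or services"
  | "Health"
  | "Environment"
  | "Quality of life"
  | "Academia/learning and teaching";

type GeographicalExtent =
  | "local" | "regional" | "national" | "international" | "multinational";

// "A possible structure for each instance of Impact Evidence" (from above).
interface ImpactEvidence {
  type: "quantitative" | "qualitative";
  period?: string;
  evidenceTitle: string;
  evidenceSummary?: string;
  evidenceContactInfo?: string;
  links?: string[];               // URLs, documents
}

// Participants need further structuring; role could draw on the CASRAI
// CRediT taxonomy, possibly extended for impact contributions.
interface Participant {
  name: string;
  persistentIdentifier?: string;  // e.g. an ORCID iD for a person
  role?: string;
}

interface ImpactRecord {
  title: string;                  // short title for the impact
  description: string;            // single-paragraph summary
  narrative: string;              // detailed, independently readable account
  whoIsAffected: string;          // audiences, beneficiaries, end-users
  categoryOfImpact: ImpactCategory[];
  impactStage: ImpactStage;
  associatedIds?: { system: string; id: string }[];  // e.g. REF2014, Researchfish
  impactEvidence: ImpactEvidence[];  // multiple pieces of evidence per impact
  participants: Participant[];
  restrictions?: string;          // visibility restrictions and their reasons
  geographicalExtent: GeographicalExtent;
}
```

Open questions above (for example, the final list of Impact Stages) are reflected here only as placeholders and would change as the WG refines the template.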
Elements parked or excluded (but open for reconsideration; see the sketch after this list)
- Related Content - Agreed to leave these out of the core elements for now, although links to the underlying research are likely to be required, particularly links to research outputs and projects. E.g. Press/Media; Equipment; Research outputs/publications; Activities; Projects; Datasets; Prizes; Student theses
- Impact Hierarchy - This could provide a hierarchy or sequence of impacts, i.e. one impact building on top of another. The open question is how to determine where one impact ends and another starts.
- Period - Agreed to take this out altogether because the relevant dates would be referenced in the narrative: the date of engagement via linked activities; the date the impact is evidenced via linked evidence. More discussion is needed.
- Future Developments - What further impacts are possible or likely because this change has happened, and what the participants intend to do in future to extend the impact. Suggestion: include this in the Narrative element above.
- Impact Classification - One or more ways of classifying Impact. No agreement on whether to keep this or not - similarly for Keywords.
- Keywords - See classification
- Links - URLs, documents, etc. - Agreed to park this for now.
- Reach - Extent to which the research has had an impact on its potential beneficiaries. Park for now - could be included in Narrative?
- Significance - Explanation of the importance of the impact(s) that have occurred as a result of the research. Park for now - could be included in Narrative?
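
For completeness, here is a similarly hedged sketch of how some of the parked elements could be attached to a record if they were reinstated; again, the names and shapes are illustrative assumptions, not WG proposals.

```typescript
// A tentative sketch of parked elements as optional extensions to the
// ImpactRecord sketched above; all names and shapes are assumptions.
interface ImpactRecordExtensions {
  // Related Content: links to the underlying research and other items.
  relatedContent?: {
    type:
      | "Press/Media" | "Equipment" | "Research output/publication"
      | "Activity" | "Project" | "Dataset" | "Prize" | "Student thesis";
    id: string;                // identifier of the linked item in its home system
  }[];
  // Impact Hierarchy: impacts this one builds on (the boundary question
  // of where one impact ends and another starts remains open).
  builtOnImpactIds?: string[];
  classification?: string[];   // Impact Classification; no agreement yet
  keywords?: string[];         // see classification
  links?: string[];            // URLs, documents, etc. (parked)
  reach?: string;              // parked; could live in the Narrative instead
  significance?: string;       // parked; could live in the Narrative instead
}
```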
Specific questions on the above information elements:
- Are separate Description and Narrative elements necessary? If so, how would you define each clearly and distinctly? (The hypothetical sample record after these questions illustrates one possible distinction.)
- Is the proposal to record Impact Stage sensible/feasible? How many Impact Stages should we have? How is each one defined?
- Do you agree with categorising impact, and with the WG suggestion to use the REF definitions (including the plan to extend to academia/learning and teaching)?
- Should any of the Related Content elements be part of the core definition? If so, which ones, and why?
- Which, if any, of the parked elements should be included in the core definition? Please say why and provide a definition; where we have suggested they could be included in the Narrative, please indicate why they should be separate items.
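
To make these questions more concrete, the following is a hypothetical populated record conforming to the ImpactRecord sketch above; the content is entirely invented for illustration. It shows one possible reading of the Description/Narrative distinction: the Description as a standalone one-paragraph abstract, and the Narrative carrying the fuller story, relevant periods and future developments.

```typescript
// A hypothetical example only: invented content showing how the elements
// might be populated, assuming the ImpactRecord sketch above.
const exampleImpact: ImpactRecord = {
  title: "Improved flood-warning guidance",
  description:
    "Rainfall-modelling research informed revised flood-warning guidance " +
    "adopted by a national agency.",
  narrative:
    "Between 2012 and 2015 the research team trialled the model with agency " +
    "staff; revised guidance was issued in 2016, and roll-out to regional " +
    "offices is planned as a future development.",
  whoIsAffected: "Agency staff; residents of flood-prone areas",
  categoryOfImpact: ["Public policy or services"],
  impactStage: "Influence and change (evidence obtained)",
  impactEvidence: [
    {
      type: "qualitative",
      evidenceTitle: "Letter from the agency confirming adoption",
    },
  ],
  participants: [{ name: "A. Researcher", role: "principal investigator" }],
  geographicalExtent: "national",
};
```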
Appendix) Impact Data Collection in the UK
Higher Education Institutions collect and report on impact activities for a number of purposes, in a number of ways, and using a range of metrics. These are listed below.
1) Research Excellence Framework (REF)
- The Research Excellence Framework (REF) is the system used to assess the quality of research in UK higher education institutions. The REF is undertaken every five to seven years, with the last results published in 2014 (REF2014) and the next due in 2021 (REF2021).
- Peer review is at the heart of the REF assessment, covering research outputs, research environment and impact. Impact assessment accounted for 20% of the REF2014 score.
- Impact assessment was based on an Impact Template (20% of the impact score), which described the Unit of Assessment's (discipline's) overall approach to impact, and a selection of Impact Case Studies, with the number required depending on the staff FTE submitted (80% of the impact score).
- Impact in REF2014 was defined as ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia’.
- Impact Case Studies required evidence of both the link between research in the discipline and the impact created; and evidence of the impact itself. Impact was measured in terms of reach and significance. There is an expectation that the next REF - probably REF2021 - will have a similar requirement.
- Since the introduction of impact to the REF, various systems have been developed for curation of impact evidence – including impact modules for CRIS systems (such as Pure and Symplectic), Vertigo Ventures Impact Tracker, Researchfish, and a range of bespoke internally developed systems and approaches.
- Many institutions run mock REF exercises regularly throughout the REF cycle, collating impact narrative statements or plans.
2) Applying to Funders
- Many funders require a Pathway to Impact or similar statement describing how the project will engage with end users. The definition underpinning the “Pathway to Impact” is evolving within RCUK.
- Current RCUK definition: “The demonstrable contribution that excellent research makes to society and the economy.” However, the REF consultation proposed that HEFCE (http://blog.hefce.ac.uk/2017/02/09/research-is-all-about-impact-2/) and RCUK (http://www.rcuk.ac.uk/innovation/impacts/) align their definitions of impact for adoption in REF2021. Proposed aligned definition: “Wider impact: an effect on, change to or benefit to the economy, society, culture, public policy or services, health, the environment, or quality of life, beyond academia.”
- H2020 (https://www.ffg.at/sites/default/files/downloads/page/horizon2020indicators.pdf) discusses different categories of impact as indicators, including: “impact broadly defines the wider societal, economic or environmental cumulative changes over a longer period of time” and “impact indicators represent what the successful outcome should be in terms of impact on the economy/society beyond those directly affected by the intervention”.
3) Reporting to Funders
- Researchfish: RCUK and some other funders require collection of impact information through the Researchfish (https://www.researchfish.net/) software. Researchfish collects:
- Engagement activities
- IP and commercialisation
- Impact statement
- Other reporting activity occurs on non-standard RCUK projects (e.g. ESRC Impact Acceleration Accounts http://www.esrc.ac.uk/funding/funding-opportunities/impact-acceleration-accounts/) and for other funders. These may require collation of information on the impact generated in association with the project. This can be extensive: for example, the EPSRC Impact Acceleration Accounts have over 50 KPIs, falling into Outputs/Outcomes/Impact categories (https://www.epsrc.ac.uk/newsevents/news/impactaccelerationaccounts/).
4) Impact Prizes
- There are a number of impact prizes where applicants enter the impact that they have created, potentially in a number of categories. Examples:
- Internal prizes/awards
5) Higher Education Business and Community Interaction Survey (HE-BCI)
- The annual HE-BCI survey (https://www.hesa.ac.uk/support/definitions/hebci) examines the exchange of knowledge between universities and the wider world, and informs the strategic direction of ‘knowledge exchange’ activity that funding bodies and higher education institutions (HEIs) in the UK undertake.
- This is an annual return and drives the allocation of Higher Education Innovation Funding (HEIF - http://www.hefce.ac.uk/ke/heif/) to universities.
- The HE-BCI survey collects a range of information, though these are indicators of impact-generating activity rather than necessarily evidence of impact. They include:
- Collaborative research
- Contract research
- Continuing professional development (CPD) and continuing education (CE)
- Facilities and equipment related services
- Intellectual property income
- Regeneration and development programmes
- Spin-off companies established
- Spin-off companies active for three or more years
- Number of new patent applications filed/granted
6) Key Performance Indicators (KPIs)
- Aspects of impact activity may be measured as institutional KPIs, including volume/quality of REF case studies, industrial income, numbers of partnerships, etc.
7) Internal Case for Investment
- Demonstration of return on investment may be required to secure ongoing investment for impact both within academic departments and Professional Services.
- Possibly linked to the KPIs above.
8) Resource planning
- Institutions may wish to understand the volume of impact activity being carried out and likely resource requirements in order to plan support appropriately, including distribution of resources (human/financial).
9) Case Studies (not REF)
- Case studies telling impact stories may be requested by funders, and internally, to demonstrate impact in particular areas, for marketing activity, etc.
- Examples of impact may be presented on institutional websites, in newsletters and in other marketing material to demonstrate to end users the potential value of engaging with universities.
11) Sharing of Best Practice
- Nationally and internally, there is activity around identifying and sharing best practice in impact.
12) Performance Review
- Increasingly, institutions are using impact generation as a criterion in performance review and promotion.
13) Customer Relationship Management Systems (CRMs)
- Institutions may use CRMs to collect information on networks between the institution (at the level of individuals, departments, colleges or the whole institution) and external customers/partners.
- This may enable the identification of potential partners both for bidding and for sharing research to create impact, and support the targeting of marketing activity.
- Possible link to KPIs.