Submitted 12th March 2017
This is the response by the Media, Communication and Cultural Studies Association (MeCCSA) to the Consultation on the second Research Excellence Framework. The consultation sets out proposals on how to implement the next Research Excellence Framework, building on the REF 2014 and incorporating the principles identified in Lord Stern’s Independent Review of the REF. The full response is below and also available as a PDF.
Page 1: Respondent details
Q1. Please indicate who you are responding on behalf of
Subject association or learned society
Please provide the name of your organisation
Media, Communication and Cultural Studies Association (MeCCSA)
Page 2: Overall approach
Q2. 1. Do you have any comments on the proposal to maintain an overall continuity of approach with REF 2014, as outlined in paragraphs 10 and 23?
Subject to our comments on particular proposals below, we broadly welcome continuity, as it minimises disruption to institutional planning. In particular, we strongly endorse the continuation of peer review at the heart of the assessment process, and the decision, rightly taken, not to move to a superficially less costly but fundamentally flawed bibliometric approach.
Page 3: Unit of assessment structure
Q3. 2. What comments do you have about the unit of assessment structure in REF 2021?
This is a major concern for researchers in our field. Culture, communication and media studies was a discrete area in both the REF and the RAE, having many years ago been separated from Library and Information Science as a result of its growth and the two fields’ very different intellectual and academic activities and cultures. The point was made firmly in the published subject overview report for 2014, which notes that “having argued strenuously many years ago for the separation of fields whose common attachment to terms like communications and information masks substantive differences of intellectual origins, approach, and interest (such that in many HEIs they would be found not just in different departments but in different faculties or schools), the consolidation of these fields into a single UOA poses continuing difficulties that may need revisiting for a future exercise, not least as both broad fields are thriving and extensive”. We recognise the desire not to increase the number of UoAs, but this particular anomaly must be addressed. The Association is also of the view that there is great uncertainty about the proper place for such areas as screen or film studies: in 2014 similar outputs were submitted to different panels, a problem only partly solved by the two sub-panels having assessors in common. Our view is that both these problems would be solved by incorporating film and screen studies into the (possibly re-titled) sub-panel dealing with media and communications, and relocating library and information science elsewhere. The reconstituted panel should have appropriate expertise to deal with these areas, and also with practice-led research in its areas of interest.
Page 4: Expert panels
Q4. 3a. Do you agree that the submissions guidance and panel criteria should be developed simultaneously?
Yes
Comments: We feel that they need to be developed in tandem, and the sector needs to know their detail as soon as possible.
Q5. 3b. Do you support the later appointment of sub-panel members, near to the start of the assessment year?
No
Comments: It would be better if this were completed sooner rather than later; panel members need to be in place and meeting before submissions are finalised.
Q6. 4. Do you agree with the proposed measures outlined at paragraph 35 for improving representativeness on the panels?
Yes
Comments: We support all the measures recommended in para 35.
Q7. 5a. Based on the options described at paragraphs 36 to 38 what approach do you think should be taken to nominating panel members?
We feel that nominations from subject associations and similar bodies, but not from individual HEIs, should continue, and that it is appropriate for demographic data to be both provided and collected.
Q8. 5b. Do you agree with the proposal to require nominating bodies to provide equality and diversity information?
Yes
Comments: We wish to see nominating bodies enabled to nominate not just individuals, but sets of names so that they can propose sub-panel memberships properly reflecting E&D issues as well as the full range of expertise to reflect the likely range of work to be assessed. We agree that the possible procedures outlined in para 38 would be unwieldy and are largely unnecessary if other recommendations are followed.
Q9. 6. Please comment on any additions or amendments to the list of nominating bodies, provided alongside the consultation document.
No comment
Page 5: Staff
Q10. 7. Do you have any comments on the proposal to use HESA cost centres to map research-active staff to UOAs and are there any alternative approaches that should be considered?
This proposal has caused considerable alarm and bewilderment. Whatever its possible efficiency gains, it would massively distort the exercise. HESA categories are teaching-driven and frequently, perhaps even predominantly, map very imperfectly onto the organisation of research, especially in interdisciplinary areas. It is vital that the REF retains the principle of allowing HEIs to submit research in the form and structure that best reflect their own practice and organisation. To do otherwise could cause considerable difficulty for research management and irreversibly damage the quality of interdisciplinary research.
Q11. 8. What comments do you have on the proposed definition of ‘research-active’ staff described in paragraph 43?
We welcome the proposal to require all staff on teaching and research contracts to be submitted, and believe this does not require using HESA cost centres to structure the assessment panels. We are against any measure that encourages or facilitates the movement of some staff onto ‘teaching-only’ contracts, but believe such movement will not be inhibited by continuing to allow HEIs to be selective in their choice of staff to return. We are particularly anxious to ensure that REF requirements are not misused by HEIs to inhibit the involvement of staff primarily engaged in practice teaching or research in the research culture and REF return of their institution.
Q12. 9a. The proposal to require an average of two outputs per full-time equivalent staff returned?
Our answer here collates our views on the various aspects of questions 9a to 9c. We welcome the disconnect between the number of outputs and individual staff members. We support the suggestion of requiring, as the aggregate volume of outputs from an HEI, an average of twice the number of staff returned. However, it is important that the number required under this algorithm calculates the staff figure in a way that takes account of special circumstances such as part-time or early career researcher status, sickness or maternity leave, and similar. This would reinforce the need for universities to have an active E&D policy and practice, and remove its intrusion into people’s career advancement and standing as researchers in their field. We wish to see a process based on the principles of inclusivity, fairness and representativeness. For each member of staff returned we wish to see a minimum of one and a maximum of four outputs required (an illustrative sketch of this calculation follows our answer to question 9c below).
Q13. 9b. The maximum number of outputs for each staff member?
A maximum of four. We would accept three as a maximum, provided all other aspects of this issue are retained as recommended here.
Q14. 9c. Setting a minimum requirement of one for each staff member?
We support a minimum of one output, on the basis that all returned staff should be included to some minimum level in the return.
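The sketch below is a purely illustrative aid showing one way the arrangement described in our answer to question 9a might operate: an aggregate pool of roughly twice the returned staff volume, adjusted for special circumstances, with a minimum of one and a maximum of four outputs attributable to any individual. The function names and inputs are hypothetical, offered only to make the arithmetic concrete; they do not form part of any proposed REF rule.

```python
# Illustrative sketch only: a hypothetical rendering of the output-pool
# arithmetic described above, not an official REF calculation.

def required_output_pool(staff_fte, circumstance_reductions):
    """Aggregate number of outputs a unit would need to return.

    staff_fte: FTE fractions for each returned member of staff (e.g. 1.0, 0.5).
    circumstance_reductions: per-person reductions (in output equivalents) for
    special circumstances such as ECR status, part-time working, or leave.
    Both inputs are hypothetical illustrations.
    """
    raw_total = 2 * sum(staff_fte)                      # average of two outputs per FTE
    adjusted_total = raw_total - sum(circumstance_reductions)
    return max(adjusted_total, len(staff_fte))          # never below one output per person


def valid_individual_attribution(outputs_per_person, maximum=4):
    """Check the recommended per-person bounds: at least one, at most four."""
    return all(1 <= n <= maximum for n in outputs_per_person)


# Example: three staff, one of whom is a half-time early career researcher
# with a one-output reduction for special circumstances.
pool = required_output_pool([1.0, 1.0, 0.5], [0.0, 0.0, 1.0])
print(pool)                                             # 4.0 outputs in aggregate
print(valid_individual_attribution([2, 1, 1]))          # True
```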
Q15. 10a. Is acceptance for publication a suitable marker to identify outputs that an institution can submit and how would this apply across different output types?
Normally, verified acceptance for publication would be an acceptable indicator of an output’s certain arrival in the public domain. However, we are aware that many of our members produce research-based work, for example in the form of films, videos and other formats, whose ‘publication’ is less easily determined than that of conventional monographs or journal articles. For that reason the verification of ‘publication’ should be flexible enough to allow for all such work.
Q16. 10b. What challenges would your institution face in verifying the eligibility of outputs?
N/A
Q17. 10c. Would non-portability have a negative impact on certain groups and how might this be mitigated?
We recognise that the aim of disallowing portability was to inhibit the ‘transfer market’ by which richer universities could recruit ready-made submissions by creating posts for staff with strong outputs. However, we are also very conscious of an unintended consequence of such a requirement: it could inhibit the recruitment of early career researchers, whereas under portability they could look like strong candidates if they had embryonic or promising output records that were tariffed as last time, and so could be valuable members of a return. On balance, therefore, we support portability, not least because work is that of its author, though we would be ready to accept some caution around this through any viable and credible mechanism that limited portability to outputs produced in the earlier period of the publication ‘window’. In the absence of any such mechanism we believe portability, despite its possible abuse, to be the proper basis for the relation between outputs and authors.
Q18. 10d. What comments do you have on sharing outputs proportionally across institutions?
We are not persuaded of the practicability of this.
Q19. 11. Do you support the introduction of a mandatory requirement for the Open Researcher and Contributor ID to be used as the staff identifier, in the event that information about individual staff members continues to be collected in REF 2021?
Yes
Q20. 12. What comments do you have on the proposal to remove Category C as a category of eligible staff?
No comment
Q21. 13. What comments do you have on the definition of research assistants?
We feel that RA’s should be defined as such by contract status and not by some less definable and imprecise relationship to the research. We would welcome any measure that might improve security of employment status for RA’s.
Q22. 14. What comments do you have on the proposal for staff on fractional contracts and is a minimum of 0.2 FTE appropriate?
While 0.2 is arbitrary, we regard it as being as good as any alternative.
Page 6: Collaboration
Q23. 15. What are your comments in relation to better supporting collaboration between academia and organisations beyond higher education in REF 2021?
Such collaboration would be illustrated and submitted for assessment in impact case studies. We are less convinced about its utility elsewhere and its possible slippage into a definition of research excellence that regarded commercial or industrial use of research as a primary indicator of its quality. It is not the role of REF either to support or reward such collaboration.
Page 7: Outputs
Q24. 16. Do you agree with the proposal to allow the submission of a reserve output in cases where the publication of the preferred output will post-date the submission deadline?
Yes
Comments: We support any proposal enabling submissions to include a reserve output where the exclusion of the preferred output may be beyond the certain knowledge of the submitting UoA (this would include claims for double weighting, publication dates, and so on).
Q25. 17. What are your comments in relation to the assessment of interdisciplinary research in REF 2021?
Our field is inherently interdisciplinary, drawing variably on the humanities and the social sciences. So much work, not only in our own field, is interdisciplinary that we have great doubts about the utility of this indicator or category for panels. We therefore regard the important factor as being a panel composition with the competence to assess interdisciplinary work, and recommend that the use of this indicator against individual outputs be removed, as being of little or no use to either submitting HEIs or assessing panels.
Q26. 18. Do you agree with the proposal for using quantitative data to inform the assessment of outputs, where considered appropriate for the discipline? If you agree, have you any suggestions for data that could be provided to the panels at output and aggregate level?
Yes
Comments: The important term here is ‘inform’. Bibliometric data were little used by sub-panels within Panel D in 2014; at best such data could inform the assessment, and then only to a very limited extent. We do not believe their use should go further than this, and we would have little concern should panels choose not to employ such data.
Page 8: Impact
Q27. 19. Do you agree with the proposal to maintain consistency where possible with the REF 2014 impact assessment process?
Yes
Comments: We particularly welcome the proposal to continue the intelligently broad understanding of impact as it was used in 2014, and the proposal not to increase its role in the proportionate attribution of overall quality.
Q28. 20. What comments do you have on the recommendation to broaden and deepen the definition of impact?
See q. 27. Impact can take many forms and the nature of impact assessed by the REF should be as flexible as possible to ensure all types of impact can be assessed.
Q29. 21. Do you agree with the proposal for the funding bodies and Research Councils UK to align their definition of academic and wider impact?
Yes
If yes, what comments do you have on the proposed definitions?
Our answer to this question is in reality ‘yes but…’. We do not wish here to comment on future research council policy. We welcome the broadening of the way in which impact is defined and construed, but would need further clarification of paras 78 and 79a before offering firm views. What, for example, is intended by the requirement for research to have impact ‘on teaching’ or by many of the terms (not least ‘demonstrable’) offered in para 79a?
Q30. 22. What comments do you have on the criteria of reach and significance?
In 2014, reach and significance were not well understood or well defined in the guidance. ‘Significance’ might be replaced, since it is also a criterion used in assessing outputs. The geographical element of reach should be spelled out more fully, since there is often understandable uncertainty as to whether, and to what extent, the criterion means more than the simple spread of influence. In principle the two criteria are sensible dimensions of impact, but they require much clearer definition and explanation.
Q31. 23. What do you think about having further guidance for public engagement impacts and what do you think would be helpful?
We believe that for 2014 the distinction between dissemination and impact was neither well understood nor well explained. ‘Public engagement’ is a vague term, and the extent to which contributing to public knowledge and understanding is greater or less than ‘public engagement’ remains unclear.
Q32. 24. Do you agree with the proposal that impacts should remain eligible for submission by the institution or institutions in which the underpinning research has been conducted?
No
Comments: Impact is generated at various times and through various routes. It might arise in the place where the research started, but it may also arise (and often does) as a result of efforts by the institution to which the research, or the researcher, moves. Impact should be credited to the institution where it is achieved.
Q33. 25. Do you agree that the approach to supporting and enabling impact should be captured as an explicit section of the environment element of the assessment?
Yes
Q34. 26. What comments do you have on the suggested approaches to determining the required number of case studies? Are there alternative approaches that merit consideration?
We broadly agree with the proposal. Unquestionably the ratio affected who was submitted and where, but this may be relieved by the output numbers requirement. We disagree with para 94, since impact work is largely undertaken at UoA level and the proposal could disadvantage small departments. Submissions can show how institutional-level activity helps or is used, but institutional case studies would detract attention from UoA-level work and, if privileged as proposed, could severely disadvantage many UoAs.
Q35. 27. Do you agree with the proposal to include a number of mandatory fields in the impact case study template to support the assessment and audit process better (paragraph 96)?
Yes
Comments: We agree with the provision of a template to give clarity to requirements, but suggest that it should have additional mandatory fields, e.g. how the underpinning research was related to the impact work (with a corollary explanation of what is required).
Q36. 28. What comments do you have on the inclusion of further optional fields in the impact case study template?
See above.
Q37. 29. What comments do you have in relation to the inclusion of examples of impact arising from research activity and bodies of work, as well as from specific research outputs?
We support the notion of impact arising from bodies of work as well as from individual outputs. It remains unclear, however, how the ‘excellence’ of that research might be demonstrated. This may be a matter best left to the judgement of panels rather than prescription.
Q38. 30. Do you agree with the proposed timeframe for the underpinning research activity (1 January 2000 – 31 December 2020)?
Yes
Q39. 31. What are your views on the suggestion that the threshold criterion for underpinning research, research activity or a body of work should be based on standards of rigour? Do you have suggestions for how rigour could be assessed?
We are puzzled by, and oppose, the notion that rigour alone (excluding significance and originality) should be the criterion for underpinning research. As noted above, we also regard it as almost impossible to insist on any demonstration of excellence as a threshold criterion, and therefore regard the assessment of impact as being best self-contained.
Q40. 32a. The suggestion to provide audit evidence to the panels?
We are sceptical about undue routinisation of evidence of this kind, which could have the unintended consequence of narrowing the range of types of impact submitted. The impact case studies from 2014 (available online) demonstrate the very wide range of possible forms of impact, and nothing should be done to inhibit or unduly standardise or audit impact. It should be left to panels to judge, although some guidance as to the types of evidence to be provided, as last time, would be welcome.
Q41. 32b. The development of guidelines for the use and standard of quantitative data as evidence for impact?
See above
Q42. 32c. Do you have any other comments on evidencing impacts in REF 2021?
See above
Q43. 33. What are your views on the issues and rules around submitting examples of impact in REF 2021 that were returned in REF 2014?
We regard continuity and consistency as virtues, and would thus welcome any such examples being regarded as eligible. In some, perhaps many, cases evidence of continuity and development would be very helpful to panels in assessing the nature and quality of the impact described. In other words, we strongly recommend that such ‘repetition’ be permitted as long as it can indeed describe development.
Page 9: Environment
Q44. 34a. Do you agree with the proposal to change the structure of the environment template by introducing more quantitative data into this aspect of the assessment?
No
Comments: As long as environment is clearly and primarily concerned with quantifiable elements such as doctoral students, research grants, facilities, library support and so on, we regard the 2014 structure and requirements as adequate, and would be disappointed if environment were made unduly quantified.
Q45. 34b. Do you have suggestions of data already held by institutions that would provide panels with a valuable insight into the research environment?
No Response
Q46. 35. Do you have any comment on the ways in which the environment element can give more recognition to universities’ collaboration beyond higher education?
The ‘research environment’ is understood by most to describe the culture and material support for research within the institution rather than its external links.
Q47. 36. Do you agree with the proposals for providing additional credit to units for open access?
No
Comments: We welcome the move to widen access to research outputs but would be wary of any element of research assessment that gave advantage to institutions solely because of their depth of resources or capacity to provide support to activities of this kind independent of any measure of quality.
Q48. 37. What comments do you have on ways to incentivise units to share and manage their research data more effectively ?
No Response
Page 10: Institutional level assessment
Q49. 38. What are your views on the introduction of institutional level assessment of impact and environment?
We are not in favour of institutional-level assessment, as it could disadvantage good UoAs in ‘weaker’ institutions and undervalue the work done at UoA level, while possibly ceding too much control over submissions to those outside the submitting UoA. We are certainly against the idea of a separate panel for this.
Q50. 39. Do you have any comments on the factors that should be considered when piloting an institutional level assessment?
We do not regard the notion of a separate panel to assess institutional level activity as necessary or helpful.
Page 11: Outcomes and weighting
Q51. 40. What comments do you have on the proposed approach to creating the overall quality profile for each submission?
Agreed
Q52. 41. Given the proposal that the weighting for outputs remain at 65 per cent, do you agree that the overall weighting for impact should remain at 20 per cent?
Yes
Q53. 42. Do you agree with the proposed split of the weightings between the institutional and submission level elements of impact and environment?
No
Comments: See our earlier comments. It is impossible, and probably unhelpful, to prejudge the split between institutional-level and UoA-level activity in both impact and environment, and a submission should not be required to distort its activity or submission to meet such a split if it is inappropriate to the focus of its work.
Page 12: Proposed timetable for REF 2021
Q54. 43. What comments do you have on the proposed timetable for REF 2021?
Broadly acceptable. However, we would support any move to publish the guidance as soon as practicable. Table 1 in the consultation says 2018; this is somewhat vague, and early or mid-2018 would be preferable.
Page 13: Other
Q55. 44. Are there proposals not referred to above, or captured in your response so far, that you feel should be considered? If so, what are they and what is the rationale for their inclusion?
No
Page 14: Contact details
Q56. If you would be happy to be contacted in the event of any follow-up questions, please provide a contact email address.
n.fenton@gold.ac.uk