BIS Select Committee - Assessing the quality of Higher Education

The Business, Innovation and Skills Select Committee has launched an inquiry into 'Assessing the quality of Higher Education'. Details of the inquiry can be found on the committee website. The submission produced jointly by the Geological Society and University Geoscience UK can be found below:

Submitted 29 October 2015

1. This submission has been produced jointly by the Geological Society of London and University Geoscience UK.

i. The Geological Society (GSL) is the UK’s learned and professional body for geoscience, with about 12,000 Fellows (members) worldwide. The Fellowship encompasses those working in industry, academia and government with a broad range of perspectives on policy-relevant science, and the Society is a leading communicator of this science to government bodies, those in education, and other non-technical audiences.

ii. University Geoscience UK is the subject association of Geoscience (geology, applied geology, Earth science, geophysics, geochemistry and some environmental science) departments/schools based within universities in the British Isles. It promotes discussion and the exchange of information between departments, and provides a point of contact between them and professional, government and quality control agencies.

What issues with quality assessment in Higher Education was the Higher Education Funding Council for England's (HEFCE) Quality Assurance review seeking to address?

2. In our view, key issues need to be addressed against the changing landscape of HE: i) quality assessment (QA) processes must cope with increasing diversity in the modes of higher education delivery; ii) subjects are increasingly interdisciplinary; iii) academic institutions in England, Wales and Scotland differ in funding and in their degree of autonomy over student recruitment; and iv) QA is needed for courses run overseas, or by distance learning, where UK institutions are the degree-awarding bodies. In this light, we identify several enduring perceptions surrounding QA which are of particular significance to our members:

  • A perceived lack of rigorous quality assurance mechanisms in the current subject reviews/external examining procedures (a perception with which we largely disagree)
  • A perception that the subject review process is both too cumbersome and lacking in rigour (a perception with which we largely agree)
  • A perception that the external examining system is ‘too cosy’ to be an effective quality assurance mechanism (a perception with which we again largely disagree)

Will the proposed changes to the quality assurance process in universities, as outlined by HEFCE in its consultation, improve quality in Higher Education?

3. In our view, this will depend entirely on the rigour with which the proposals are implemented. The ‘risk-based’ approach to monitoring standards and the student experience is welcome, as it avoids the undue and costly time demands of a heavier-touch approach. We understand the motivation for the proposed measures to enhance training and the equivalence of standards among external examiners, and these might reasonably be expected to improve outcomes and confidence in the quality assurance process. However, attention should also be paid to the potential impact of a more demanding programme of external examiner work on the eligibility and willingness of research-active scientists to participate. External examining is already a significant time commitment for academics, and increasing expectations may dissuade high-calibre researchers from taking part, offsetting the expected benefits. Thought should also be given to how the potential increased costs of implementing an enhanced external examination system and remunerating examiners would be met.

4. There will also need to be robust mechanisms by which external examiners can raise concerns when a particular deficit repeatedly goes unaddressed from one year to the next, as well as a clear appeals process by which a department may dispute the opinion of an individual external examiner. If, after such a process, an issue had still not been addressed appropriately, a flag (an amber traffic light, say) might be triggered, requiring a ramping up of the evaluation of academic standards or student experience for the programme concerned.

5. HEFCE's Quality Assurance Review explores the possibility of professional, statutory and regulatory bodies (PSRBs), which we take to include professional and learned societies such as the Geological Society, being more involved in the setting and monitoring of standards, as well as in the development and delivery of the curriculum. Geological Society accreditation procedures do not currently include direct assessment and monitoring of standards. To do so would presumably require the Society to co-opt members of the academic community, in which case it is not clear how this would differ materially from higher education departments using academic external examiners in the traditional manner. The review might therefore consider whether PSRBs could usefully play a role within the current external examining system as a means of ensuring that standards (as well as the curriculum, as at present) are maintained. The Geological Society does not seek an enhanced role of this kind, but would be pleased to discuss what part it and kindred organisations might play in any revised external examining system.

6. It is our view that over-reliance on metrics will not in itself improve quality, but will instead drive universities and departments towards actions which improve metrics rather than quality. In addition to our comments on the potential increased role of external examiners and PSRBs, we identify several aspects of the current system that could be enhanced, but that may be at odds with a metrics-centred approach:

  • Overall satisfaction is not necessarily a meaningful measure of quality – universities must challenge students academically, and take them out of their comfort zones.
  • Geoscience is a highly diversified, and rapidly changing, area of study, and it is important that an external examiner system does not impose artificial external constraints which interfere with the freedom to evolve and to develop course content in line with academic, scientific and societal requirements. The Geological Society’s degree accreditation scheme ensures that the needs of industry are recognised in such evolution.
  • There should be much greater awareness and promotion of the external examiner system among students, prospective students and the general public, to emphasise the high standards achieved and the improvements to the student experience within individual subjects.
  • PSRBs, in consultation with organisations such as University Geoscience UK, have an important role in quality assurance processes, in data management, verification and dissemination, and in benchmarking.

What should be the objectives of a Teaching Excellence Framework ('TEF')?
a. How should a TEF benefit students? Academics? Universities?

7. The concept of a Teaching Excellence Framework, as outlined in ministerial statements and public consultation documents, remains ill-defined. The notion that teaching excellence can be quantified through greater use of metrics, as has been suggested, is fraught with dangers. Numerous studies by the Higher Education Academy (HEA) and others have highlighted the wide range of attributes that would need to be considered: the educational standards of the relevant intake; the educational and student-experience outcomes of students at the end of their degree programmes, set against student expectations; possible restrictions imposed by professional, statutory and regulatory bodies (PSRBs) on curriculum choice or innovation; and variations between pure, applied and more employment-oriented curricula, to name but a few. Data are available on universities’ performance in areas such as retention, employment outcomes, research output and widening participation (for example through the Higher Education Statistics Agency, HESA), although their interpretation varies considerably from subject to subject. Similarly, aspects of student satisfaction are evaluated through the National Student Survey (NSS). Care should be taken to ensure that crude academic outcome statistics (grade point averages (GPA) by programme, for example) or graduate incomes are not used in a way that would distort academic behaviour. This might occur through pressure on departments to produce higher GPA outcomes year on year, to degrade academic rigour to achieve the same end, or to steer students towards particular better-paid career paths. The recent history of GCSE and A-level grade inflation prior to 2015 provides a salutary example of the impact of such pressures.

In short, we envisage the principal objectives of a TEF to be:

  • To validate standards of HE provision in the geosciences
  • To identify inconsistencies and drive up standards
  • To identify and share good practice

b. What are the institutional behaviours a TEF should drive? How can a system be designed to avoid unintended consequences?

8. All universities currently have established quality assurance procedures in place. These shape course design, content, and teaching and learning practices through engagement with PSRBs, external examiners and wider groups of experts, including those from industry. As highlighted above, grade inflation is a potential risk in a metric-driven system, although it could be counterbalanced by placing increased emphasis on the external examiner role. There is, however, a considerable risk that extensive revision of the external examining system, which would inevitably require a much increased time commitment, would discourage the participation of research-active staff, to the detriment of valuable research-led teaching approaches, and would exacerbate the differentiation between research- and teaching-track staff.

9. Data quality is a major issue. It is questionable whether data provided for a metrics-based approach would accurately reflect subsets of subjects at programme level. Currently, data compiled by HESA do not relate directly to geoscience degrees, as these programmes are often linked with other subjects (e.g. oceanography, environmental science). Granularity of data provision at programme level would be needed to ensure that genuine areas of teaching excellence can be distinguished from low-quality programmes.

c. How should the effectiveness of the TEF be judged?

10. In our view, effectiveness should be judged against the identified objectives, namely:
  • To validate standards of HE provision in the geosciences
  • To identify inconsistencies and drive up standards
  • To identify and share good practice
with external examiners and appropriate PSRBs having a central role in evaluating performance in each area.

How should the proposed TEF and new quality assurance regime fit together?

11. If a TEF is necessary or useful (and our members are currently unconvinced by the case made thus far), it should be designed to reflect HEFCE’s regulatory quality assurance scheme. Any duplication of quality assessment activity would be time-consuming (and therefore detrimental both to teaching excellence and to parallel research excellence) and costly. The two must be closely integrated, and neither should be metric-driven.

What do you think will be the main challenges in implementing a TEF?

12. With such limited information on how a TEF might be constructed and implemented, this question is simply premature. We might, however, anticipate several challenges:

  • Identifying metrics which meaningfully capture teaching excellence.
  • Achieving genuine assessment of teaching excellence beyond metrics.
  • Ensuring comparability and compatibility between TEF and REF at an institutional level.
  • Developing a suitable professional accreditation/qualification framework for external examiners without limiting diversity in the pool of examiners. It is a recognised strength of the current system that external examiners are drawn from research-intensive and teaching-intensive universities as well as from industry.
  • Developing CPD programmes for external examiners in partnership with PSRBs (that may include the Geological Society).
  • Ensuring that individual external examiners, or groups of examiners, do not have undue influence.

How should the proposed connection between fee level and teaching quality be managed?
a. What should be the relationship between the TEF and fee level?
b. What are the benefits or risks of this approach to setting fees?

13. The application of financial penalties where a programme or institution fails to maintain appropriate academic standards or quality would be likely to make it more difficult to re-establish the teaching quality aspired to. If a department or institution, through a risk-based QA process, were found wanting over a period of time, despite having been given the opportunity to address identified shortcomings, a more robust approach would be to suspend degree-awarding status in the particular subject area(s) until it was established that a teaching team was in place that could reasonably be expected to achieve and maintain, through a probationary period and beyond, the requisite curriculum and student experience.

14. Many of our members may welcome a loosening of the £9,000 cap on fees, and the opportunity that this might present, given a suitable set of TEF metrics. League tables incorporating TEF metrics would undoubtedly inform student choice and, to some extent, drive future demand. But this implies the creation of a new marketplace in which institutions may chase metrics rather than focus on the student experience and teaching quality (the claimed purpose of the exercise). A more useful approach may be to evaluate the need for fee increases on a subject-by-subject costed basis.