National Evaluation of Specialty Selection (NESS)

This project evaluated a number of pilot selection processes, alongside existing national selection processes, for specialty trainees in the NHS. We collected data from over 4,000 applicants in the 2009 selection round to help us consider four key evaluation criteria: validity, reliability, acceptability and cost-effectiveness.

  • Funder: Department of Health
  • Project timetable: August 2008 to June 2010

Research Team

University of Birmingham research team: 

Other members of the research team:

Professor Janet Grant (Open University)
Professor Stephen Field (Royal College of General Practitioners)
Dr Harry Gee (Birmingham Women's Hospital)
Dr Andrew Malins (Queen Elizabeth Hospital, Birmingham)
Dr Elizabeth Spencer (Gloucestershire Hospitals NHS Trust)

Project Summary

Introduction

Selecting the right doctors for specialty training is important for the doctors applying, for those with whom they will work and for the patients they will treat in the NHS. Recent changes to the structure of postgraduate medical training (Modernising Medical Careers, MMC) prompted a revision of previous selection processes. The initial change, to a national application system (the Medical Training Application Service, MTAS), was rolled out in 2007 but was ill-received by applicants and other doctors and was also subject to serious security breaches. The initial response to these problems was a return to pre-MMC selection procedures.

The Department of Health is now keen to build an evidence base for best practice in the selection of specialty trainees and is therefore funding a number of pilot schemes across regions, specialties and entry levels for training. These pilots will run alongside existing selection processes, so that their results will not contribute to selection outcomes.

The aim of NESS was therefore to provide an objective evaluation of these pilots, synthesising their results with an analysis of data from existing national selection processes. The evaluation was centred on four key criteria:

  1. Validity (selecting the right doctors for training)
  2. Reliability (consistency in selecting the right doctors for training) 
  3. Cost-effectiveness (value for money) 
  4. Acceptability (by applicants and the profession)

Methods

We applied a common methodological protocol across all 13 specialties involved to collect a wide range of data. For the larger nationally recruiting specialties, we focused on five deaneries: London, North West, Oxford, West Midlands and Yorkshire & Humberside.

Information about the pilots and existing selection processes (Context and cost-effectiveness)

Key personnel at each site were interviewed, and key documentation was analysed, in order to obtain detailed information about the selection processes and their development, as well as cost data. Where feasible, selection processes (e.g. assessment centres) were observed by a member of the evaluation team.

Quantitative Data (Validity, reliability and cost-effectiveness)

Applicants’ scores at each stage of the selection process, personal characteristics and subsequent performance in training were obtained from Deanery and Royal College databases (subject to agreement).

Views about the pilots and existing selection processes (Acceptability)

Applicants were invited to complete up to three brief surveys to determine their views on the selection processes. Applicants were asked for consent to link their views to selection outcomes. Interviews and focus groups were undertaken with assessors and applicants.

Each pilot site conducted a local evaluation and it was necessary for them to obtain informed consent from applicants prior to their inclusion in the pilot scheme and local evaluation. This consent was extended to our national evaluation. Participation was voluntary.

Ethics and Confidentiality

All data collected in relation to the study, including data from secondary sources, was managed in accordance with the strict protocols that exist within The University of Birmingham to ensure security and confidentiality. The project was registered on the University’s Data Protection System. All data was kept secure (in locked filing cabinets and on password-protected computers in personal offices) and coding was used to ensure that no data was stored in a form that would allow individual participants to be identified. We were informed that this project was defined as a service evaluation and therefore did not require NHS ethical approval. Nevertheless, the research adhered to the ethical guidelines of the British Educational Research Association (BERA) and was approved by The University of Birmingham ethics committee.

Dissemination

National Evaluation of Specialty Selection Conference

27 September 2010, Centre for Professional Development, Medical School, University of Birmingham

What can we learn from NESS?

  • Selection Processes in 13 Specialties: Similarities and Differences
  • Competencies and the Person Specification: on what basis are trainees selected?
  • How can we assess reliability?
  • How can we improve the organisation of Specialty Selection?
  • How much does Specialty Selection cost?
  • Risks, benefits and next steps

Ottawa 2010 Conference

Dr Celia Taylor presented findings from the NESS project at the Ottawa 2010 Conference (Miami, 15-20 May 2010). The presentation focused on cost-effectiveness and was entitled ‘Effectiveness and cost: how much should we spend on selection for specialty training?’

Presentation abstract (PDF 7KB, opens in new window)

You may download Version 1 of an Excel-based tool to help you estimate the costs of your selection process. Please note that the file includes a macro, so you should allow Excel to run macros when opening it. The file opens on the ‘Instructions’ worksheet, which provides a brief introduction to the tool and how to use it. A simplified sketch of the kind of calculation the tool supports is given below.

Specialty Selection Costing Model

Please do not cite without the author’s permission

We would welcome any feedback you have on this tool so that it can be improved in future. Please contact Dr Celia Taylor (c.a.taylor@bham.ac.uk; +44 121 414 9072) with any comments.
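For readers curious about the arithmetic behind such costing, the short Python sketch below illustrates one plausible fixed-plus-variable cost model for a selection round. It is purely illustrative: the function name, cost categories and figures are our own assumptions and are not taken from the Excel tool itself.

    # Illustrative fixed-plus-variable model for the cost of one selection round.
    # All names and figures are hypothetical assumptions, not the tool's own formulas.

    def selection_cost(n_applicants: int, fixed_costs: float,
                       cost_per_application: float,
                       n_interviewed: int, cost_per_interview: float) -> float:
        """Return the total cost of one selection round in pounds."""
        return (fixed_costs
                + n_applicants * cost_per_application   # e.g. shortlisting time
                + n_interviewed * cost_per_interview)   # e.g. assessor and venue time

    # Example: 400 applicants, of whom 120 are interviewed.
    total = selection_cost(400, fixed_costs=20_000.0, cost_per_application=15.0,
                           n_interviewed=120, cost_per_interview=250.0)
    print(f"Total cost: £{total:,.0f}")                # Total cost: £56,000
    print(f"Cost per applicant: £{total / 400:,.0f}")  # Cost per applicant: £140

Splitting costs this way makes it easy to compare processes that differ in how many applicants progress to each (increasingly expensive) stage, which is the kind of comparison cost-effectiveness analysis requires.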

The National Evaluation of Specialty Selection Dissemination and Consultation Conference 2009

The National Evaluation of Specialty Selection Dissemination and Consultation Conference took place on 11 November 2009 at the Centre for Professional Development in the Medical School, University of Birmingham.

Over 60 delegates attended the Conference, which not only provided an opportunity to disseminate some of the findings from data collected for the project, but also drew on the expertise and experience of delegates in interactive ‘breakout’ sessions.

PDF files of the presentations given during the conference are now available below. 

Introduction (PDF 189KB, opens in new window)

Acceptability (PDF 534KB, opens in new window)

Validity (PDF 353KB, opens in new window)

Reliability (PDF 272KB, opens in new window)

Cost-Effectiveness (PDF 189KB, opens in new window)

Please note that these presentations, or any extracts from them, are not to be modified, reproduced or distributed without prior written permission from Professor Hywel Thomas, Principal Investigator of the NESS project. As we are eager to undertake extensive dissemination, please be assured that such permission will not be unreasonably withheld.

Contact Details

For more information about the project, or to add your details to our contacts database (to receive project updates), please contact:

Dr Celia Taylor
Email: c.a.brown@bham.ac.uk 
Tel: 0121 414 9072